As part of UC San Diego’s Design Lab, I collaborated on ReMap, a Swift (macOS) interface that uses the Accessibility APIs and speech recognition to provide contextual assistance for finding and navigating learning videos. The previous iteration of ReMap supported only text search, so by incorporating multimodal interaction we aimed to make in-task help-seeking easier and faster.

Users can speak search queries, include deictic references to application-specific terms (e.g., “how to erase this,” where “this” is drawn from the current application context), and navigate search results by voice (e.g., “next video”).
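To illustrate how such a spoken request might be handled, here is a minimal Swift sketch that routes a recognized transcript to either a navigation action or a search query. The enum, function, and phrase list are hypothetical stand-ins, not ReMap’s actual code, which relies on speech recognition and the macOS Accessibility APIs for context.

```swift
import Foundation

// Hypothetical sketch (not ReMap's implementation): route a recognized speech
// transcript to either a video-navigation action or a video search query.
enum VoiceAction {
    case navigate(String)        // e.g., "next video"
    case search(query: String)   // e.g., "how to erase the background"
}

func route(transcript: String, focusedToolName: String?) -> VoiceAction {
    let lowered = transcript.lowercased()
    let navigationPhrases = ["next video", "previous video", "pause", "play"]

    // Navigation commands take priority over issuing a new search.
    if let phrase = navigationPhrases.first(where: { lowered.contains($0) }) {
        return .navigate(phrase)
    }

    // Deictic expansion: substitute "this" with the name of the focused tool,
    // which the real system would obtain from the frontmost application's
    // state via the Accessibility APIs.
    var query = lowered
    if let tool = focusedToolName {
        query = query.replacingOccurrences(of: "this", with: tool.lowercased())
    }
    return .search(query: query)
}

// "how to use this," spoken while the pen tool is focused,
// becomes a search for "how to use pen tool".
print(route(transcript: "How to use this", focusedToolName: "Pen Tool"))
```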

JavaScript, Node.js, Swift, Accessibility APIs, REST APIs, Behavioral Research

C. Ailie Fraser, Julia M. Markel, N. James Basa, Mira Dontcheva, and Scott Klemmer. 2020. ReMap: Lowering the Barrier to Help-Seeking with Multimodal Search. UIST ’20.

C. Ailie Fraser, Julia M. Markel, N. James Basa, Mira Dontcheva, and Scott Klemmer. 2019. ReMap: Multimodal Help-Seeking. UIST ’19, New Orleans, LA, USA. (Demo and Poster).