Publications
As part of UC San Diego’s Design Lab, I collaborated on ReMap, a Swift (macOS) interface that uses the Accessibility APIs and speech recognition to provide contextual assistance for finding and navigating learning videos. The previous iteration of ReMap supported only textual search; by incorporating multimodal interaction, we aimed to make in-task help-seeking easier and faster.
Users can speak search queries, add application-specific terms deictically (e.g., “how to erase this”), and navigate search results via speech (e.g., “next video”).
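For a flavor of how such voice commands can be handled on macOS, here is a minimal sketch using Apple's Speech framework (SFSpeechRecognizer). The class name, the dispatch rules, and the printed actions are illustrative assumptions, not ReMap's actual implementation, which also draws on the Accessibility APIs for deictic, application-specific context.

```swift
import AVFoundation
import Speech

// Minimal sketch: stream microphone audio into SFSpeechRecognizer and
// route recognized phrases to a command handler. Microphone and speech
// authorization requests are omitted for brevity.
final class VoiceCommandListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onPhrase: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true
        self.request = request

        // Tap the default input device and feed audio buffers to the recognizer.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let transcript = result?.bestTranscription.formattedString else { return }
            onPhrase(transcript.lowercased())
        }
    }
}

// Hypothetical dispatch: treat "next video" as result navigation and
// "how to ..." utterances as search queries.
func handle(phrase: String) {
    if phrase.contains("next video") {
        print("advance to the next search result")
    } else if phrase.hasPrefix("how to") {
        print("search for: \(phrase)")
    }
}
```

Keeping recognition (streaming transcripts) separate from dispatch (mapping phrases to actions) makes it straightforward to add new commands without touching the audio pipeline.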
JavaScript, NodeJS, Swift, Accessibility APIs, REST APIs, Behavioral Research