Enabling Easily Learnable Eyes-free Interaction by Exploiting Human Experience
As computing moves towards mobile devices, new challenges emerge for Human-Computer Interaction. Constrained by the shortcomings of the visual and audio channels, users will require new modalities for receiving information from their devices. As a result, there is a tremendous opportunity for the tactile communication medium. The impressive body of haptic research to date has taken an information theoretic approach towards increasing the bandwidth of information transfer through the skin. In this talk, I outline a new direction of haptic research that breaks away from a bandwidth focused approach, instead examining how human experience can be exploited to generate tactile messages with pre-learned meaning. I will talk about how music, human touch and speech can be mapped to the tactile channel. I conclude with future directions for this research, looking at how this approach can be generalized to end-user generated tactile messages for Computer-Mediated Tactile Communication.
Speaker Bio
Kevin Li is a PhD candidate in Computer Engineering at the University of California San Diego. His research focuses on building and evaluating tools to raise the ceiling for mobile interaction, enabling users to access information in scenarios where they previously could not.
- Date:
- Speakers: Kevin Ansia Li, Jeff Running
- Affiliation: University of California