Enabling Always-Available Input with Muscle-Computer Interfaces
- Scott Saponas,
- Desney Tan,
- Dan Morris,
- Ravin Balakrishnan,
- Jim Turner,
- James A. Landay
UIST '09: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology. Published by ACM.
Previous work has demonstrated the viability of applying offline analysis to interpret forearm electromyography (EMG) and classify finger gestures on a physical surface. We extend those results to bring us closer to using muscle-computer interfaces for always-available input in real-world applications. We leverage existing taxonomies of natural human grips to develop a gesture set that covers interaction in free space, even when the hands are busy with other objects. We present a system that classifies these gestures in real time, and we introduce a bi-manual paradigm that enables use in interactive systems. We report experimental results demonstrating four-finger classification accuracies averaging 79% for pinching, 85% while holding a travel mug, and 88% when carrying a weighted bag. We further show generalizability across different arm postures and explore the tradeoffs of providing real-time visual feedback.
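To make the real-time classification idea concrete, below is a minimal sketch of an EMG gesture-classification pipeline: windowed root-mean-square (RMS) amplitude features per channel fed into an SVM. This is not the paper's implementation; the channel count, window length, feature choice, and classifier here are illustrative assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per EMG channel for one window.

    window: array of shape (n_samples, n_channels).
    """
    return np.sqrt(np.mean(window ** 2, axis=0))


def segment_windows(emg: np.ndarray, window_len: int, hop: int) -> np.ndarray:
    """Slice a continuous recording (n_samples, n_channels) into overlapping
    windows and return one RMS feature vector per window."""
    starts = range(0, emg.shape[0] - window_len + 1, hop)
    return np.array([rms_features(emg[s:s + window_len]) for s in starts])


# Placeholder training data: labeled feature vectors for four finger gestures.
# In practice these would come from windowed, labeled EMG recordings.
rng = np.random.default_rng(0)
n_channels = 8                                  # assumed forearm electrode count
X_train = rng.normal(size=(400, n_channels))    # stand-in feature vectors
y_train = rng.integers(0, 4, size=400)          # gesture labels 0..3

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

# Real-time loop (sketch): classify each incoming window as it arrives.
incoming = rng.normal(size=(256, n_channels))   # one 256-sample window (assumed)
gesture = clf.predict(rms_features(incoming).reshape(1, -1))[0]
print(f"predicted gesture: {gesture}")
```

In a live system, the per-window prediction would typically be smoothed over several consecutive windows before triggering an interface action, which trades a small amount of latency for fewer spurious activations.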