Looking At You: Fused Gyro and Face Tracking for Viewing Large Imagery on Mobile Devices
- Neel Joshi
- Abhishek Kar
- Michael Cohen
CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems | Published by ACM
We present a touch-free interface for viewing large imagery on mobile devices. In particular, we focus on viewing paradigms for 360-degree panoramas, parallax image sequences, and long multi-perspective panoramas. We describe a sensor fusion methodology that combines face tracking, using a front-facing camera, with gyroscope data to produce a robust signal that defines the viewer's 3D position relative to the display. The gyroscopic data provides low-latency feedback and allows extrapolation of the face position beyond the field of view of the front-facing camera. We also demonstrate a hybrid position and rate control that uses the viewer's 3D position to drive exploration of very large image spaces. We report on the efficacy of the hybrid control vs. position-only control through a user study.
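The abstract does not give the paper's actual filter or control equations, but the two ideas it names can be sketched in a simple, hypothetical form. Below, `fuse` is a basic complementary filter: the gyroscope's angular rate is integrated for a low-latency prediction, and the absolute face-tracking angle, when the face is visible, corrects drift; when the face leaves the camera's field of view, the estimate extrapolates from the gyro alone. `hybrid_control` illustrates one common way to combine position and rate control: small viewer offsets map directly to view position, while larger offsets drive a scrolling velocity. All function names, thresholds, and gains here are illustrative assumptions, not the authors' implementation.

```python
def fuse(gyro_rate, face_angle, prev_angle, dt, alpha=0.98):
    """Blend low-latency gyro integration with absolute face-tracking angle.

    gyro_rate  -- angular velocity from the gyroscope (rad/s)
    face_angle -- viewer angle from the front camera, or None when the
                  face is outside the camera's field of view
    prev_angle -- previous fused estimate (rad)
    alpha      -- weight on the gyro prediction (assumed value)
    """
    # Integrate the gyro for a fast but drift-prone prediction.
    predicted = prev_angle + gyro_rate * dt
    if face_angle is None:
        # Face lost: extrapolate from the gyro alone.
        return predicted
    # Face visible: pull the estimate toward the absolute measurement.
    return alpha * predicted + (1.0 - alpha) * face_angle


def hybrid_control(offset, view, dt, dead=0.2, gain=2.0):
    """Hypothetical hybrid position/rate control.

    Inside the +/-dead zone the view tracks the viewer's offset directly
    (position control); beyond it, the excess offset sets a scrolling
    velocity (rate control), allowing very large image spaces.
    """
    if abs(offset) <= dead:
        return offset                      # position control
    excess = offset - dead if offset > 0 else offset + dead
    return view + gain * excess * dt       # rate control
```

For example, with the face out of view, `fuse(0.5, None, 0.0, 0.1)` extrapolates purely from the gyro, and `hybrid_control(0.5, 0.1, 0.1)` scrolls the view at a velocity proportional to how far the offset exceeds the dead zone.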