Articulated Human Pose Tracking with Inertial Sensors
Thus far, capturing human body motion has only been possible with precisely positioned sensors and cameras in well-calibrated studio environments. This constraint has fundamentally restricted motion-capture technology to scenarios with little or no mobility. To track motion precisely, the state of the art either uses well-calibrated inertial or magnetic-position sensors tightly strapped to the body, or high-speed cameras confined to a finite capture volume. In recent work, we aim to break this barrier and make motion capture accessible outside of traditional studio environments, enabling new applications in VR/AR, healthcare, and sports training. In this talk, we will describe a new system with sensors integrated into everyday garments that can be used for articulated motion tracking in unconstrained and mobile settings. We will show how to quantify the displacement between sensors and body segments within a multi-rooted kinematic chain, and how to use deep-learning techniques for full-pose reconstruction. Our results are based on over 3 hours of data collected via 215 trials on 12 test subjects in a studio room custom-built for this purpose.
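To make the reconstruction step concrete, below is a minimal sketch of one plausible formulation of deep-learning-based pose regression from garment-mounted IMUs. It is not the speakers' actual model; the sensor count, per-sensor features, joint count, and network layout are all illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the system from the talk):
# a sequence model that maps per-frame readings from N garment-mounted
# IMUs to joint rotations of an articulated body model.
import torch
import torch.nn as nn

NUM_SENSORS = 6        # assumed number of garment-mounted IMUs
FEAT_PER_SENSOR = 7    # orientation quaternion (4) + acceleration (3) per IMU
NUM_JOINTS = 22        # assumed number of joints in the kinematic chain
ROT_DIM = 6            # 6D rotation representation per joint (assumption)


class InertialPoseNet(nn.Module):
    """Bidirectional LSTM that regresses joint rotations from IMU sequences."""

    def __init__(self, hidden=256):
        super().__init__()
        in_dim = NUM_SENSORS * FEAT_PER_SENSOR
        self.rnn = nn.LSTM(in_dim, hidden, num_layers=2,
                           batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, NUM_JOINTS * ROT_DIM)

    def forward(self, imu_seq):
        # imu_seq: (batch, time, NUM_SENSORS * FEAT_PER_SENSOR)
        feats, _ = self.rnn(imu_seq)
        # One rotation vector per joint per frame.
        return self.head(feats)  # (batch, time, NUM_JOINTS * ROT_DIM)


if __name__ == "__main__":
    model = InertialPoseNet()
    dummy = torch.randn(2, 120, NUM_SENSORS * FEAT_PER_SENSOR)  # 2 clips, 120 frames
    poses = model(dummy)
    print(poses.shape)  # torch.Size([2, 120, 132])
```

In such a setup, the predicted per-joint rotations would be fed through the body's forward kinematics to recover joint positions; handling the displacement between loosely worn sensors and the underlying body segments is the part the talk addresses separately.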
- Date:
- Speaker: Xuesu Xiao
- Affiliation: Texas A&M University
Shuayb Zarar
Principal Applied Scientist Manager