Modeling, Tracking and Interactive Animation of Faces and Heads using Input from Video
- Irfan Essa,
- Sumit Basu,
- Trevor Darrell,
- Alex Pentland
M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 370. Appears: Proceedings of Computer Animation '96 Conference, Geneva, Switzerland, June 1996
We describe tools that use measurements from video for the extraction of facial modeling and animation parameters, head tracking, and real-time interactive facial animation. These tools share common goals but differ in the details of their physical and geometric modeling and in their input measurement systems. Accurate facial modeling involves fine details of geometry and muscle coarticulation. By coupling pixel-by-pixel measurements of surface motion to a physically-based face model and a muscle control model, we have been able to obtain detailed spatio-temporal records of both the displacement of each point on the facial surface and the muscle control required to produce the observed facial motion. We will discuss the importance of this visually extracted representation in terms of realistic facial motion synthesis. A similar method that uses an ellipsoidal model of the head coupled with detailed estimates of visual motion allows accurate tracking of head motion in 3-D. Additionally, by coupling sparse, fast visual measurements with our physically-based model via an interpolation process, we have produced a real-time interactive facial animation/mimicking system.
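As a rough illustration of the coupling described above (not the paper's actual formulation), one can think of recovering muscle control from observed surface motion as a fitting problem: given a hypothetical basis matrix whose columns give the surface displacement produced by a unit activation of each muscle, the activations that best explain a dense optical-flow measurement can be estimated by least squares. The sketch below uses synthetic data and assumed names (`basis`, `estimate_muscle_activations`) purely for illustration.

```python
# Minimal sketch, assuming a linear muscle-to-surface-displacement model.
# This is NOT the paper's physically-based model; it only illustrates the idea of
# inverting observed surface motion (e.g., optical flow) into muscle activations.
import numpy as np

def estimate_muscle_activations(flow_field: np.ndarray, basis: np.ndarray) -> np.ndarray:
    """Fit activations a such that flow_field is approximately basis @ a.

    flow_field : (2*N,) stacked (dx, dy) motion at N facial surface points
    basis      : (2*N, M) displacement of each point per unit activation of M muscles
    """
    activations, *_ = np.linalg.lstsq(basis, flow_field, rcond=None)
    return activations

# Example with synthetic data: 3 hypothetical muscles acting on 100 surface points.
rng = np.random.default_rng(0)
basis = rng.standard_normal((200, 3))            # assumed muscle-influence basis
true_act = np.array([0.8, 0.1, -0.3])            # ground-truth activations
observed_flow = basis @ true_act + 0.01 * rng.standard_normal(200)  # noisy "flow"
print(estimate_muscle_activations(observed_flow, basis))  # recovers roughly [0.8, 0.1, -0.3]
```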