HoloAssist: an Egocentric Human Interaction Dataset for Interactive AI Assistants in the Real World
- Xin Wang,
- Taein Kwon,
- Mahdi Rad,
- Bowen Pan,
- Ishani Chakraborty,
- Sean Andrist,
- Dan Bohus,
- Ashley Feniello,
- Felipe Vieira Frujeri,
- Neel Joshi,
- Marc Pollefeys
Building an interactive AI assistant that can perceive, reason, and collaborate with humans in the real world has been a long-standing pursuit in the AI community. This work is part of a broader research effort to develop intelligent agents that can interactively guide humans through performing tasks in the physical world. As a first step in this direction, we introduce HoloAssist, a large-scale egocentric human interaction dataset, where two people collaboratively complete physical manipulation tasks. The task performer executes the task while wearing a mixed-reality headset that captures seven synchronized data streams. The task instructor watches the performer’s egocentric video in real time and guides them verbally. By augmenting the data with action and conversational annotations and observing the rich behaviors of various participants, we present key insights into how human assistants correct mistakes, intervene in the task completion procedure, and ground their instructions to the environment. HoloAssist spans 166 hours of data captured by 350 unique instructor-performer pairs. Furthermore, we construct and present benchmarks on mistake detection, intervention type prediction, and hand forecasting, along with detailed analysis. We expect HoloAssist will provide an important resource for building AI assistants that can fluidly collaborate with humans in the real world. The data can be downloaded from GitHub.
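For readers who want a feel for how the action and conversational annotations might be consumed, the minimal sketch below tallies annotated event types across sessions. It assumes the labels ship as a single JSON file containing a list of sessions, each with an `events` list whose entries carry a `label` field; the file name and field names here are illustrative assumptions, not the dataset's actual schema, which is documented in the HoloAssist GitHub release.

```python
import json
from collections import Counter

# Hypothetical annotation file and schema -- consult the HoloAssist
# GitHub release for the real file name and field layout.
with open("holoassist_annotations.json") as f:
    sessions = json.load(f)

# Count coarse event types (e.g., actions vs. conversation turns)
# across all instructor-performer sessions.
event_types = Counter()
for session in sessions:
    for event in session.get("events", []):
        event_types[event.get("label", "unknown")] += 1

for label, count in event_types.most_common():
    print(f"{label}: {count}")
```

A pass like this is a natural first step before building the benchmark tasks mentioned above, since it reveals the distribution of annotated actions, conversational turns, and mistake or intervention labels available for training and evaluation.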
Publication Downloads
HoloAssist
January 25, 2024
A large-scale egocentric human interaction dataset, where two people collaboratively complete physical manipulation tasks.