Enabling interaction between mixed reality and robots via cloud-based localization
We introduce a way to enable more natural interaction between humans and robots through mixed reality: a shared coordinate system. Azure Spatial Anchors, which already supports colocalizing multiple HoloLens and smartphone devices in the same space, has now been extended to support robots equipped with cameras.
This allows humans and robots sharing the same space to interact naturally: humans can see the robot’s plan and intent, while the robot can interpret commands given from the person’s perspective. We hope this can be a building block toward a future in which humans and robots are collaborators and coworkers.
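To illustrate why a shared anchor enables this kind of interaction, here is a minimal sketch (not the Azure Spatial Anchors API; the names and poses are hypothetical) of the underlying idea: once each device localizes the same anchor in its own map frame, a point expressed in one device's frame can be mapped into another's by chaining transforms through the anchor.

```python
# Sketch only: illustrates colocalization math via a shared anchor.
# All poses and frame names here are hypothetical, not from the ASA SDK.
import numpy as np

def pose(yaw_deg, t):
    """4x4 homogeneous transform from a yaw rotation (degrees) and a translation."""
    a = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = t
    return T

# Each device observes the same anchor in its own map frame.
T_hololens_anchor = pose(90.0, [2.0, 0.0, 0.0])  # anchor pose in the HoloLens frame
T_robot_anchor    = pose(0.0,  [1.0, 1.0, 0.0])  # the same anchor in the robot frame

# A goal the human indicates, expressed in the HoloLens frame...
goal_hololens = np.array([3.0, 1.0, 0.0, 1.0])

# ...maps into the robot's frame by chaining through the shared anchor:
# robot <- anchor <- hololens
T_robot_hololens = T_robot_anchor @ np.linalg.inv(T_hololens_anchor)
goal_robot = T_robot_hololens @ goal_hololens
```

Because both transforms are anchored to the same physical point, neither device needs access to the other's map; the anchor acts as the rendezvous frame.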
Check out the code at aka.ms/ASALinuxSDK.
- Date:
- Speakers:
- Marc Pollefeys, Juan Nieto, Helen Oleynikova, Jeff Delmerico
- Affiliation:
- Microsoft Mixed Reality and AI Lab Zurich
- Helen Oleynikova, Senior Scientist
- Jeffrey Delmerico, Senior Scientist
- Juan Nieto, Principal Research SDE
- Marc Pollefeys, Partner Director of Science
- Patrick Misteli, Program Manager II