This project explores the use of new touchless technology in medical practice.
With advances in medical imaging, surgical procedures have become increasingly reliant on a range of digital imaging systems for navigation, reference, diagnosis and documentation. Interacting with images in these settings poses particular challenges, arising from the need to maintain boundaries between the sterile and non-sterile aspects of the surgical environment and its practices. Traditional input devices such as the keyboard, mouse and touch screen rely on physical contact, which introduces the possibility of contaminated material being transferred between the sterile and non-sterile domains. This constraint creates difficulties for surgical staff who are scrubbed up and must depend on others to manipulate images on their behalf. The resulting inefficiencies can entail medical complications and can interfere with the surgeon’s interpretive and analytic use of the images.
The aim of the project, then, is to explore the use of touchless interaction within surgical settings, allowing images to be viewed, controlled and manipulated without contact through camera-based gesture recognition technology. In particular, the project seeks to understand the challenges these environments present for the design and deployment of such systems, and to articulate the ways in which these technologies may alter surgical practice. While our primary concern here is maintaining conditions of asepsis, touchless gesture-based technologies offer other potential benefits. For example, they open up interesting possibilities for interacting with emerging 3D imaging technologies, and, by enabling interaction at a distance, they allow surgeons to spatially configure themselves in new ways with respect to the various screens within surgical settings. Such technologies therefore also offer the potential to re-imagine the spatial environments within which image-based surgery takes place.
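To make the idea of camera-based, touchless image control concrete, the sketch below shows one simple way tracked hand positions could be mapped to pan and zoom commands for an on-screen image. It is a minimal illustration only: the `HandFrame` input, the `GestureImageController` class and the one-hand-pans / two-hands-zoom mapping are hypothetical choices for this example, not the project's actual implementation, and a real system would consume skeleton-tracking data from a depth camera such as Kinect rather than the synthetic frames used here.

```python
"""Illustrative sketch: mapping tracked hand positions to touchless pan/zoom
commands for an image viewer. The tracker interface and gesture mapping are
hypothetical examples, not the project's implementation."""
from dataclasses import dataclass
from math import hypot
from typing import Optional, Tuple


@dataclass
class HandFrame:
    """One frame of tracked hand positions in normalised screen coordinates."""
    left: Optional[Tuple[float, float]]   # (x, y), or None if the hand is not visible
    right: Optional[Tuple[float, float]]


class GestureImageController:
    """Turns hand motion into pan/zoom deltas for an image viewer.

    One visible hand pans the image by its displacement; two visible hands
    zoom it by the change in the distance between them (a 'stretch' gesture).
    """

    def __init__(self) -> None:
        self._prev: Optional[HandFrame] = None

    def update(self, frame: HandFrame) -> dict:
        command = {"pan": (0.0, 0.0), "zoom": 1.0}
        prev = self._prev
        self._prev = frame
        if prev is None:
            return command

        if frame.left and frame.right and prev.left and prev.right:
            # Two hands visible in both frames: zoom by the ratio of inter-hand distances.
            d_now = hypot(frame.right[0] - frame.left[0],
                          frame.right[1] - frame.left[1])
            d_prev = hypot(prev.right[0] - prev.left[0],
                           prev.right[1] - prev.left[1])
            if d_prev > 1e-6:
                command["zoom"] = d_now / d_prev
        elif frame.right and prev.right:
            # One hand visible: pan by the hand's frame-to-frame displacement.
            command["pan"] = (frame.right[0] - prev.right[0],
                              frame.right[1] - prev.right[1])
        return command


if __name__ == "__main__":
    controller = GestureImageController()
    # Synthetic frames standing in for depth-camera tracking output.
    frames = [
        HandFrame(left=None, right=(0.50, 0.50)),
        HandFrame(left=None, right=(0.55, 0.48)),          # one hand moves: pan
        HandFrame(left=(0.40, 0.50), right=(0.60, 0.50)),  # second hand appears
        HandFrame(left=(0.35, 0.50), right=(0.65, 0.50)),  # hands spread apart: zoom in
    ]
    for f in frames:
        print(controller.update(f))
```

In practice, a deployed system would also need gesture segmentation (deciding when a movement is intended as a command at all) and robustness to multiple people in view, both of which are central challenges in the surgical setting described above.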
In the news:
- Trial of “touchless” gaming technology in surgery, Adam Brimelow, BBC News Health (May 31, 2012)
- Touchless technology put to test by surgeons (video), Adam Brimelow, BBC News (May 31, 2012)
- Kinect imaging lets surgeons keep their focus, MacGregor Campbell, New Scientist, Tech (May 17, 2012)
- 4Tech programme – Episode 73, BBC Arabic News (May 11, 2012) [in Arabic]
- Interacting without Touching, Inside Microsoft Research (March 8, 2012)
- Microsoft’s TechFest Trots Out ‘What is Now Possible’ for Computers, The Seattle Times, Business/Technology (March 7, 2012)
- Microsoft Installe Kinect Dans les Salles D’operation (Microsoft installs Kinect in operating rooms), 01net (March 7, 2012) [in French]
- Microsoft Shows Off Kinect-Based Projects at TechFest Research Fair, The Tech Journal (March 6, 2012)
- Microsoft showcases new Kinect-centric projects at its TechFest Research Fair, ZDNet (March 6, 2012)
See also the Medical Image Analysis project page.
This project is funded by Microsoft Research Connections. We are working with a number of key clinical and research partners:
- Lancaster University
- King’s College London
- Guy’s and St Thomas’ NHS Trust, London
- Addenbrooke’s Hospital, Cambridge
People
Antonio Criminisi
Partner Research Lead, Mesh Labs (Mixed Reality & AI), Microsoft Cambridge