Virtual reality (VR) is an incredibly exciting way to experience computing, providing users with intuitive, immersive ways of interacting with information that mirror how we naturally experience the world around us. In the past few years, powerful VR systems have dropped in price and are on the verge of becoming mainstream technologies with potential uses in all kinds of applications. However, most VR technologies focus on rendering realistic visual effects. In fact, the hallmark of most VR systems is the head-mounted display that completely dominates a user’s visual field. But what happens if a VR user is blind? Does that mean they are completely shut out of virtual experiences?
In this project, published in the paper “Virtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds,” we investigated a new controller that mimics the experience of using a white cane, enabling a user who is blind to explore large virtual environments the same way they navigate the real world—by using their senses of touch and hearing. Our paper has been accepted at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2020) and received an Honorable Mention Award.
Making a white cane for the virtual world with users’ needs in mind
White cane users undergo intensive training to effectively use their canes to navigate and explore the world. They learn to hold the cane differently for different situations, listen to the sounds generated as the cane taps or sweeps along the ground and obstacles, and feel subtle changes in vibrations as the cane encounters different materials. By combining this experience with their other senses (sound, smell, and touch), they can use their cane to effectively navigate their environment.
In 2018, we introduced the concept of a haptic white cane controller for VR in a paper that demonstrated how users who are blind could utilize their skills with a white cane to explore a small virtual space. This year, we expand on this work to make the controller more natural, allowing for immersive navigation of large, complex environments comprising multiple rooms. The controller is mounted to a harness that users wear around their waist, and they can then hold the controller like they would an ordinary white cane. This allows them to use the mobility and orientation skills that they’ve learned for the real world to navigate a virtual world, using the virtual cane to detect walls, doors, obstacles, and changes in surface textures. See Figure 1 below for a detailed breakdown of the controller’s components.
Putting the pieces together: How the controller emulates a real-world environment
Our controller uses a lightweight, three-axis brake mechanism (controlling side-to-side, up-down, and forward-backward movement) to convey the general shape of virtual objects to the user. Each braking mechanism has a unique construction that addresses a different need (see our paper for in-depth explanations of each). In Figure 2, we show how one of these brakes uses a coiled cord to provide tension, and Figure 3 shows further details of how the brake generates friction using a capstan. The flexibility of the three-axis system enables people to adapt the controller to different grips, depending on the context of use.
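To make the braking behavior more concrete, the sketch below shows one way the three brake axes might be driven from a collision query: when the virtual cane tip contacts an object, each axis is braked in proportion to how much of the tip’s motion would push it into the surface. This is a simplified illustration under stated assumptions, not the controller’s actual firmware; the axis names, the input signals, and the brake-level interface are all invented for the example.

```python
# Hypothetical sketch (not the actual controller firmware): engaging a
# three-axis brake when the virtual cane tip contacts an object.
# Axis names and the brake_commands() interface are illustrative assumptions.
import numpy as np

AXES = ("side_to_side", "up_down", "forward_backward")

def brake_commands(tip_velocity, contact_normal, in_contact):
    """Return a per-axis brake level in [0, 1].

    tip_velocity   -- 3-vector of the cane tip's velocity in the harness frame
    contact_normal -- unit 3-vector pointing out of the virtual surface
    in_contact     -- True if the virtual cane tip is touching an object
    """
    if not in_contact:
        return dict.fromkeys(AXES, 0.0)

    # Only resist motion that pushes the tip *into* the surface.
    into_surface = max(0.0, -float(np.dot(tip_velocity, contact_normal)))
    levels = {}
    for i, axis in enumerate(AXES):
        # Brake each axis in proportion to how much of the penetrating
        # motion lies along that axis.
        share = abs(float(contact_normal[i]))
        levels[axis] = min(1.0, share * into_surface)
    return levels
```

Under this scheme, a tip sweeping straight ahead into a wall whose normal points back toward the user would primarily engage the forward-backward brake, while a tip tapping down onto the floor would primarily engage the up-down brake.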
In addition to braking the cane’s movement when it collides with a virtual object, we mounted a multifrequency vibrator on the controller to mimic the high frequencies felt when the cane rubs against different textures. The controller feels and sounds different depending on the texture of the surface the virtual cane encounters: dragging a cane across concrete sounds and feels very different from dragging it across a wood floor or carpeting, and the controller mimics this experience. Finally, we provide 3D audio based on the geometry of the environment using Project Triton, a technology developed at Microsoft Research. With this capability, a radio playing around the corner in another room sounds as if it’s coming from that location and traveling around a corner.
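As an illustration of how texture feedback like this could be parameterized, the sketch below maps a few surface materials to a vibration frequency and amplitude, scaling amplitude with the speed of the sweep. The material table, its numbers, and the function interface are hypothetical assumptions for the example; they are not taken from the paper.

```python
# Illustrative sketch only: mapping virtual surface materials to
# vibrotactile parameters, in the spirit of the texture feedback described
# above. The material values below are made-up placeholders.
MATERIALS = {
    # material      (dominant frequency in Hz, base amplitude 0..1)
    "concrete": (240.0, 0.9),
    "wood":     (180.0, 0.6),
    "carpet":   ( 90.0, 0.3),
}

def texture_vibration(material, drag_speed):
    """Return (frequency_hz, amplitude) for the vibrotactile actuator.

    drag_speed -- speed of the virtual cane tip along the surface (m/s)
    """
    freq, base_amp = MATERIALS.get(material, (150.0, 0.5))
    # Faster sweeps excite the tip more strongly, so scale amplitude
    # with drag speed and clamp to the actuator's range.
    amplitude = min(1.0, base_amp * (0.5 + drag_speed))
    return freq, amplitude
```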
Putting all these components together, our controller allowed users who are blind to effectively explore a complex, 6-meter-by-6-meter virtual world to play a scavenger hunt game, locating targets and avoiding obstacles and traps. In user testing, we found that seven out of eight users were able to play the game, successfully navigating to locate targets while avoiding collisions with walls and obstacles (see Figure 4 for details).
Creating a white cane for virtual spaces does come with challenges that we did not anticipate. For example, there are several types of white canes commonly used by people who are blind, varying in weight, stiffness, and the kinds of tips they use. Some people prefer nylon or roller tips that glide easily over the ground, while others prefer the enhanced sensitivity of a metal tip. In our research, the sounds and feel of our controller were based on a carbon fiber cane with a metal tip. Users who were accustomed to the feedback from a metal-tip cane in the real world could easily identify the experiences they were having in VR. However, people who used a nylon- or roller-tip cane had a harder time identifying VR objects and surfaces because the feel and sounds were very different from what they were used to. In future work, we would like to give users the ability to change the virtual tip and cane materials to match what they typically use in the real world.
Overall, we found that by using our system, we could provide users who are blind with a compelling VR experience through multimodal haptic and audio feedback. Our prototype suggests that VR doesn’t have to be limited to those who have certain capabilities. To be clear, our prototype controller is still a long way off from being a commercial product, and there are many obstacles to overcome before something like this would be ready for commercialization. However, as VR becomes more common, it is critical that we try to include as many people as possible in our designs. This project shows one way we can make that a reality.