By Andrew Maimone, Researcher; Andreas Georgiou, Researcher; Joel Kollin, Principal Research Hardware Development Engineer
Last week at the SCIEN Workshop on Augmented and Mixed Reality, a group of industry and academic researchers met to discuss the future of virtual and mixed reality (VR/MR). One goal was clear: we’d all like to put on devices that look and feel like ordinary eyeglasses but can bring us to new places. We also want to maintain the full fidelity of human vision while experiencing VR/MR.
We’re still in the early days of the journey toward this vision, and there isn’t a clear route to solving all the optical challenges. For example, today’s near-eye displays force a trade-off between bulkiness and field of view. They provide only part of the information the brain uses to judge depth. And it’s not clear how to make a display that looks like eyeglasses when many of us must wear actual eyeglasses to see clearly. Typically, when we try to solve one of these challenges, we take a step back in another area.
In a new publication to be presented at SIGGRAPH 2017, we propose a new approach to VR/MR displays that has the potential to address all these problems at once: holography. When light waves from a 3D scene meet on a flat surface, they interfere to form an intricate pattern, known as a hologram, much like a series of waves combining to form a detailed set of ripples in a pool. The hologram can then be “played back” with a beam of laser light to re-create all the intricacies of the 3D scene, including the depths of all the objects.
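To make the interference idea concrete, here is a minimal NumPy sketch that records the hologram of a single point source against an on-axis plane reference wave. The function name and parameters are our own illustration, not code from the paper.

```python
import numpy as np

def point_hologram(nx, ny, pitch, wavelength, src):
    """Interference pattern of a point source with a plane reference wave.

    src = (x, y, z) position of the point in meters; pitch is the
    hologram's pixel pitch. Illustrative sketch, not the paper's code.
    """
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    sx, sy, sz = src
    # Spherical wave from the point source, evaluated on the hologram plane.
    r = np.sqrt((X - sx) ** 2 + (Y - sy) ** 2 + sz ** 2)
    k = 2 * np.pi / wavelength
    object_wave = np.exp(1j * k * r) / r
    reference_wave = 1.0  # on-axis plane wave
    # The recorded hologram is the intensity of the summed waves.
    return np.abs(object_wave + reference_wave) ** 2

# A green (532 nm) point 10 cm behind a 256x256, 8 µm-pitch hologram.
h = point_hologram(256, 256, 8e-6, 532e-9, (0, 0, 0.1))
```

Illuminating such a pattern with the same reference beam reproduces the point's spherical wavefront, which is the "playback" step described above.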
One of the greatest challenges to building near-eye displays is balancing the trade-off between the field of view and device form factor. A wide field of view is desirable so that we can see the whole scene at once; at the same time, we desire a lightweight, compact and aesthetically pleasing design that is both comfortable to wear all day and socially acceptable. Today we find these qualities only in isolation: sunglasses-like displays with very narrow fields of view (<20 degrees horizontal), or goggle- or helmet-sized displays with wide fields of view (>= 80 degrees horizontal). In our work, we created a simple prototype display with a sunglasses-like form factor, but a wide field of view (80 degrees horizontal), shown below. The key insight from our admittedly crude prototype is that we can create a very compact optical design that has visual defects, and then correct those defects in the hologram calculation. In the image below, we show a photograph of our prototype display as well as a simple augmented image shot through the display — note the image is clear edge-to-edge.
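The "correct the defects in the hologram calculation" step can be illustrated in a few lines: if the optics' aberration is known as a per-pixel phase map, multiplying the hologram field by the conjugate of that phase cancels the error on playback. The sketch below assumes such a map is available and uses a random phase screen as a stand-in for a measured aberration; none of the names come from the paper.

```python
import numpy as np

def precorrect(field, aberration_phase):
    """Pre-distort a complex hologram field by the conjugate of the
    optics' aberration (a per-pixel phase map, in radians), so that
    passing through the aberrated optics cancels the error."""
    return field * np.exp(-1j * aberration_phase)

# Toy 'defective' optic modeled as a random phase screen.
rng = np.random.default_rng(0)
aberration = rng.uniform(0, 2 * np.pi, (64, 64))
scene = np.exp(1j * rng.uniform(0, 2 * np.pi, (64, 64)))
# Pre-correct, then 'pass through' the defective optic: the two
# phase factors cancel, leaving the intended scene field.
played_back = precorrect(scene, aberration) * np.exp(1j * aberration)
```

Because the correction is purely computational, the physical optics can stay simple and compact, which is the trade the prototype makes.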
Another key challenge in building near-eye displays is reproducing the focal depth cues. When viewing real objects, our eyes can focus on different depths, and the blurriness of out-of-focus objects is used to judge depth. However, most near-eye displays present all objects at the same focus. This limitation causes visual discomfort, reduces the realism of the scene and can cause focus conflicts between real and virtual objects. Many solutions have been proposed to address the problem, including varifocal displays, multifocal displays and light-field displays; however, these provide only a coarse approximation of the focal cues of the real scene. Holographic displays can provide highly accurate focal control that is individually addressable on a pixel-by-pixel basis, like a real scene.
In the image below, we show a hologram on a prototype display where each pixel in the image is at exactly the correct focus. The camera is focused at the dragon’s chest region, which appears in focus, while the nearer and farther regions appear out of focus. In the blue-bordered inset images, we show additional images of the chest, middle body and tail regions when the camera was focused at that region and focused away from that region. See the video for an animation of the camera focusing through the depth of the model.
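One simple (and slow) way to obtain per-pixel focus is to treat each scene pixel as a point source at its own depth and sum the resulting spherical wavefronts on the hologram plane. The sketch below is our own illustration under that point-source model; the paper's actual rendering method is more sophisticated, and the names here are hypothetical.

```python
import numpy as np

def depth_varying_hologram(amplitude, depth, pitch, wavelength):
    """Sum a spherical wavefront from every nonzero scene pixel, each
    propagated from its own depth, then keep only the phase for a
    phase-modulating SLM. Illustrative sketch, not the paper's code."""
    ny, nx = amplitude.shape
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    k = 2 * np.pi / wavelength
    field = np.zeros((ny, nx), dtype=complex)
    for py, px in zip(*np.nonzero(amplitude)):
        # Spherical wave from the point at (x[px], y[py], depth[py, px]):
        # each pixel contributes at exactly its own focal distance.
        r = np.sqrt((X - x[px]) ** 2 + (Y - y[py]) ** 2 + depth[py, px] ** 2)
        field += amplitude[py, px] * np.exp(1j * k * r) / r
    return np.angle(field)

# Tiny demo: two points at different depths (10 cm and 15 cm).
amp = np.zeros((8, 8))
amp[2, 3] = 1.0
amp[5, 5] = 0.5
dep = np.full((8, 8), 0.10)
dep[5, 5] = 0.15
phase = depth_varying_hologram(amp, dep, 8e-6, 532e-9)
```

On playback, each point comes into focus at its own distance, which is what the dragon photographs demonstrate.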
Another important challenge to solve in near-eye displays is handling problems with the user’s vision. If we ultimately wish to make a display the size of eyeglasses, we must build the functionality of eyeglasses into the display. A current solution is to provide adjustable focus by mechanically changing the distance between the display panel and the lens. However, this approach does not scale down to an eyeglasses-sized display, and it does not correct for more advanced vision problems such as astigmatism. We demonstrate that holographic displays can correct for advanced vision problems entirely in software by pre-distorting the emitted light waves so that they appear correct when viewed by the eye.
In the image below, we demonstrate a vision correction capability on a prototype display. In the left image, for reference, we show a holographic display viewed with normal vision. In the center image, we show the display as it would look to a viewer with astigmatism — note in the blue-bordered inset image that the horizontal lines become blurred vertically. In the right image, we show the display viewed with astigmatism, but with compensation in the hologram; note that it looks virtually the same as when viewed with normal vision.
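As a rough illustration of software vision correction, astigmatism can be modeled as a thin lens with a different focal power along each axis; adding the corresponding quadratic phase to the hologram folds the corrective "lens" into the displayed image. The sketch below is our own, with hypothetical names and parameters, not code from the paper.

```python
import numpy as np

def cylinder_phase(nx, ny, pitch, wavelength, power_x, power_y):
    """Quadratic (thin-lens) phase with a separate focal power per axis,
    in diopters. Adding this phase to a hologram compensates a matching
    astigmatic error in the viewer's eye. Illustrative sketch."""
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    k = 2 * np.pi / wavelength
    # Thin-lens phase: -k * r^2 / (2 f), with f = 1/power per axis.
    phase = -k * (power_x * X ** 2 + power_y * Y ** 2) / 2
    return np.mod(phase, 2 * np.pi)  # wrapped, as a phase SLM displays it

# Compensate a 1.5 D cylindrical error along the horizontal axis only.
correction = cylinder_phase(512, 512, 8e-6, 532e-9, 1.5, 0.0)
```

Since the correction is just another phase term in the hologram, it costs no extra optics and can in principle be personalized per viewer in software.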
For more information, please visit our project page and video.
The Microsoft Research paper is not necessarily indicative of any Microsoft product roadmap, but relates to basic research around holographic displays.