10 Aug 2017
New optics to widen field of view critical to next-generation virtual reality, says Oculus research lead.
by Ford Burkhart in San Diego
“The world is a light show.” That’s according to Scott McEldowney, lead optics researcher at Seattle-based virtual reality (VR) pioneer Oculus Research, who addressed the “hot topics” opening plenary session at this week’s SPIE Optics+Photonics conference in San Diego.
McEldowney described the major optics challenges – for example, delivering a 140 degree field of view – faced by those creating the next generation of VR and augmented reality (AR) hardware. Another key issue to solve is depth of focus, something he described as "intractable".
“Every object reflects light to us,” he said. “We are primarily visual creatures, and how we deliver the photons to the retina is crucial.” One goal of VR is to “build a time machine” to carry the viewer elsewhere. To do that, McEldowney said, “we need to work with perceptual scientists on new experiments.”
“It’s my guess that a wider field of view will be very compelling,” he added, saying he considers reaching a 140 degree field of view “is reasonable.”
“New optical technology will be required,” McEldowney said, “and we don’t know what the enabling technology will be, but we will move past the 100 degree mark.”
But he warned that it will be critical to identify suitable trade-offs as the technology is developed, because a wider field of view can work against image quality.
Pixel density is often lacking in VR, he explained. “We need eight times the pixel density, and it is going to take us a long time to get there. How do we advance to the next step?”
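To put that shortfall in perspective, here is a minimal sketch comparing current VR pixel density with the resolving power of the eye. All numbers are illustrative assumptions, not figures from the talk: roughly 60 pixels per degree is a commonly cited match for 20/20 foveal acuity, and a 2017-era headset is assumed to spread about 1080 horizontal pixels per eye across a 100 degree field of view.

```python
# Illustrative sketch (assumed numbers, not from the talk): how far
# current VR pixel density falls short of human visual acuity.

target_ppd = 60        # pixels per degree; assumed foveal acuity limit
headset_pixels = 1080  # assumed horizontal pixels per eye
headset_fov = 100      # assumed horizontal field of view, degrees

current_ppd = headset_pixels / headset_fov   # pixels per degree today
shortfall = target_ppd / current_ppd         # factor still to close

print(f"current: {current_ppd:.1f} px/deg, shortfall: {shortfall:.1f}x")
```

Under these assumed numbers the gap comes out in the same ballpark as the figure McEldowney cited; widening the field of view further, without adding pixels, makes the per-degree density worse still.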
Depth of focus problem
As researchers try to create that better “time machine,” they are aware that it is focus effects that cue the brain's visual response. So, depth of focus is the next critical step in VR, McEldowney said. And yet: “depth of focus is the problem that is intractable.”
In an attempt to begin solving the problem, he said that Oculus has developed a platform for depth-of-focus experiments, to measure “vergence” – where each eye, individually, is focusing. The toughest challenge is to deliver content at different focal distances.
To illustrate that phenomenon, he showed a video of a receding stone wall, with parts of the wall image in focus and other parts blurred. The objective will be for the image to change as the viewer looks, and focuses, at different distances, McEldowney said.
The lens in our eyes changes shape as we look at objects, which means that when a headset's focus is fixed at, say, two meters, it creates a conflict between the image and the place where the viewer’s brain wants the eyes to focus.
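The conflict described above can be sketched with simple geometry. This is an illustrative calculation under assumed values (a 63 mm interpupillary distance, a display focused at two meters, a virtual object rendered at half a meter), not anything presented in the talk: the eyes must converge for the near object while the lens must stay accommodated to the display's fixed focal distance.

```python
import math

# Illustrative sketch (assumed numbers, not from the talk):
# the vergence-accommodation conflict in a fixed-focus headset.
IPD = 0.063          # assumed interpupillary distance, meters
display_focus = 2.0  # display's fixed focal distance, meters
content_dist = 0.5   # virtual object rendered at 0.5 m

def vergence_deg(d):
    """Angle between the two eyes' lines of sight for a target d meters away."""
    return math.degrees(2 * math.atan(IPD / (2 * d)))

# The eyes converge for the 0.5 m object, yet must keep the lens
# focused at 2 m; accommodation demand is the reciprocal distance
# in diopters, so the two cues disagree.
print(f"vergence demand: {vergence_deg(content_dist):.2f} deg")
print(f"accommodation: display {1/display_focus:.2f} D "
      f"vs content {1/content_dist:.2f} D")
```

The mismatch between the 0.5 diopter focus cue and the 2 diopter vergence cue is exactly what multifocal, varifocal, and light-field displays – mentioned below – aim to eliminate.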
Summarizing the elements that are critical to developing superior AR and VR devices, McEldowney highlighted good optics and displays; appealing graphics; eye-tracking capacity; fine audio; effective ergonomics; and strong computer vision.
For holographic displays, multifocal and varifocal displays, and light-field displays, he expects “some level of accommodation in the next few years.”
And, aside from the difficulties with the depth of focus factor, he said that developments were progressing well. He concluded: “We are only just beginning to have what VR could be.”
About the Author
Ford Burkhart is a freelance writer based in Tucson, Arizona.