I’m one of those extremely motion sensitive people who, so far, haven’t found Virtual Reality to be a friendly environment. People keep telling me that faster refresh rates and higher resolution displays have gone a long way to eliminate VR sickness for many people—but you couldn’t prove it by me.
Jack McCauley, a co-founder of Oculus VR who now spends most of his time in his own R&D laboratory in Livermore, Calif., says that’s because VR sickness is not all about the display. This week in Napa, Calif., McCauley addressed the MEMS Executive Congress—a gathering of people who run the companies that design, build, use, and invest in MEMS technology. He said the problem is head-tracking: current VR systems just don’t do a good enough job.
“High refresh rates and high render rates do not relieve motion sickness, they only add unnecessary cost to VR systems,” he stated on a slide.
McCauley, who left Oculus shortly after its 2014 acquisition by Facebook, is a tinkerer (he was officially named a “Junior Tinkertoy Engineer” by Tinkertoy at age nine). And inventing through tinkering is how he intends to solve the head-tracking problem.
Giving attendees a peek at his cluttered Livermore laboratory, as well as a peek into his thought processes, McCauley said that MEMS are the key to solving the head tracking problem. “If you are a MEMS manufacturer and you aren’t involved with VR, you should be,” he said. But it wasn’t just the basic motion tracking sensors he was talking about.
Indeed, all of the VR headgear coming out relies on sensors, like three-axis gyroscopes, to detect head motions including micro adjustments in posture. But these sensors don’t necessarily know that the wearer is moving about a room, particularly if he’s doing so without moving his head much. Motion tracking based on these head-mounted sensors might get better as people begin to add algorithms that describe the kinematics of the body; that’s going to take a lot of research, McCauley indicated. But in the meantime, an easier way to fix this shortcoming is to add an external camera that looks for markers on the headgear. That’s what McCauley says he did in building the webcam-based tracking system for the Oculus DK2.
But a camera-based approach, McCauley said, has a fundamental limitation: It pushes way too much data to the processor, and the system just can’t keep up. The image analysis software has to evaluate at least 60 frames of video per second, extract position information, and hand it to the VR software, which extrapolates slightly ahead of where the player was in order to render the next image. “You can only run that so fast,” he said.
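That render-ahead step—extrapolating a little past the last measured pose—can be sketched as a simple linear predictor. This is an illustrative sketch only; the function names and numbers are my own, not drawn from any Oculus code:

```python
# Minimal sketch of linear head-pose prediction: extrapolate the last
# tracked position forward by the expected rendering latency.
# All names and constants here are illustrative assumptions.

def predict_position(pos, prev_pos, dt_samples, latency):
    """Linearly extrapolate a tracked position ahead by `latency` seconds.

    pos, prev_pos : last two tracked positions, as (x, y, z) tuples in metres
    dt_samples    : time between those two samples, in seconds
    latency       : how far ahead to predict (e.g. one 60 Hz frame, ~16.7 ms)
    """
    velocity = tuple((p - q) / dt_samples for p, q in zip(pos, prev_pos))
    return tuple(p + v * latency for p, v in zip(pos, velocity))

# Example: a head moving at 0.6 m/s along x, predicted one 60 Hz frame ahead.
now = (0.010, 1.700, 0.000)
prev = (0.000, 1.700, 0.000)
predicted = predict_position(now, prev, dt_samples=1 / 60, latency=1 / 60)
```

Real VR runtimes use more sophisticated filters, but the point McCauley was making holds regardless of the predictor: every stage before it—frame capture, marker extraction, pose estimation—adds latency that caps how fast the loop can run.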
A better approach, McCauley determined, is that used in the HTC Vive, due to ship to consumers next year. The Vive tracking system, which the company calls Lighthouse, relies on two infrared laser scanners placed in the room, and photo sensors on the Vive headset that register the passing of the laser beam.
“The computation is simple,” McCauley said. “No frame buffer, no processing to speak of; you just calculate the timing.”
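The timing math McCauley is alluding to can be sketched in a few lines: if a laser sweeps the room at a fixed rate, the delay between a sync pulse and the moment the beam crosses a photodiode maps directly to an angle. This is a hedged illustration of the principle, not HTC’s implementation; the sweep rate is the rough figure cited in this article, and the assumption of a full-circle sweep is mine:

```python
import math

# Sketch of Lighthouse-style timing-to-angle conversion: a laser sweeps
# the room at a fixed rate, and each photodiode on the headset records
# when the beam hits it relative to a sync pulse. That delay alone gives
# the sensor's angle from the base station -- no frame buffer needed.
# Constants are illustrative; a full 360-degree sweep is assumed.

SWEEP_HZ = 100.0               # sweeps per second (the article's ~100/s figure)
SWEEP_PERIOD = 1.0 / SWEEP_HZ  # seconds per full sweep

def hit_time_to_angle(t_hit):
    """Convert a photodiode hit time (seconds after sync) to a beam angle in radians."""
    return 2.0 * math.pi * (t_hit / SWEEP_PERIOD)

# A sensor hit a quarter of the way through the sweep sits at 90 degrees.
angle = hit_time_to_angle(0.0025)
```

With angles from two scanners and the known geometry of the sensors on the headset, position follows by triangulation—but the per-sample cost is just this arithmetic, which is why no image processing is needed.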
The Vive system, however, has limitations. It uses motors to move the mirrors that direct the laser beams, and those mirrors end up sweeping the room only about 100 times per second.
“I want to run at 1 kilohertz,” McCauley said. He decided motors wouldn’t get him there, so he turned to MEMS mirrors. Here’s where the tinkering came in. He took apart a laser printer to look at the motor assembly, then took apart a picoprojector that used MEMS mirrors to steer beams. He determined that MEMS scanning is a far better approach.
It’s a work in progress.
“I’m going to use a couple of MEMS mirrors and a laser,” he said. “It will be portable; I will be able to stick it on the wall and walk around in my [VR] play volume.”
He’s also been thinking about another big problem with VR: improper audio modeling—but his speaking time was up, so he took those thoughts with him back to his lab.