
Oculus Co-founder Jack McCauley’s Next Challenge: The Perfect Head-Tracker for VR

“I’m going to use a couple of MEMS mirrors and a laser,” he says

3 min read
Jack McCauley
Photo: MEMS Industry Group

I’m one of those extremely motion-sensitive people who, so far, haven’t found virtual reality to be a friendly environment. People keep telling me that faster refresh rates and higher-resolution displays have gone a long way toward eliminating VR sickness for many people—but you couldn’t prove it by me.

Jack McCauley, a co-founder of Oculus VR who now spends most of his time in his own R&D laboratory in Livermore, Calif., says that’s because VR sickness is not all about the display. This week in Napa, Calif., McCauley addressed the MEMS Executive Congress—a gathering of people who run the companies that design, build, use, and invest in MEMS technology. He said the problem is head-tracking: current VR systems just don’t do a good enough job.

“High refresh rates and high render rates do not relieve motion sickness, they only add unnecessary cost to VR systems,” he stated on a slide.

McCauley, who left Oculus shortly after its 2014 acquisition by Facebook, is a tinkerer (he was officially named a “Junior Tinkertoy Engineer” by Tinkertoy at age nine). And inventing through tinkering is how he intends to solve the head-tracking problem.

Giving attendees a peek at his cluttered Livermore laboratory, as well as a peek into his thought processes, McCauley said that MEMS are the key to solving the head-tracking problem. “If you are a MEMS manufacturer and you aren’t involved with VR, you should be,” he said. But it wasn’t just the basic motion-tracking sensors he was talking about.

Indeed, all of the VR headgear coming out relies on sensors, such as three-axis gyroscopes, to detect head motions, including micro-adjustments in posture. But these sensors don’t necessarily know that the wearer is moving about a room, particularly if he’s doing so without moving his head much. Motion tracking based on these head-mounted sensors might get better as people begin to add algorithms that describe the kinematics of the body; that’s going to take a lot of research, McCauley indicated. In the meantime, an easier way to fix this shortcoming is to add an external camera that looks for markers on the headgear. That’s the approach McCauley says he took in building the webcam-based tracking system for the Oculus DK2.
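A quick illustration of the gyroscope limitation described above (my own sketch, not anything from McCauley’s systems): a three-axis gyro reports angular rates, so integrating its output recovers orientation only. Walking across the room with the head held level produces no gyro signal at all. The sample rate and rotation rate here are illustrative assumptions.

```python
# Why head-mounted gyros alone miss room-scale movement: integrating
# angular rate yields orientation, never position.

DT = 0.001  # assumed 1 kHz IMU sample period, in seconds

def integrate_yaw(yaw_deg, yaw_rate_dps):
    """One integration step: orientation change = rate x time."""
    return yaw_deg + yaw_rate_dps * DT

yaw = 0.0
for _ in range(1000):                # one second of samples
    yaw = integrate_yaw(yaw, 30.0)   # head turning at 30 deg/s
print(round(yaw, 1))                 # 30.0 degrees of rotation recovered
# ...but a 2 m walk with no head rotation would leave yaw at 0.0.
```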

But a camera-based approach, McCauley said, has a fundamental limitation: It pushes way too much data to the processor, and the system just can’t keep up. The image analysis software has to evaluate at least 60 frames of video per second, extract position info, and then give it to the VR software, which then estimates a little bit ahead of where the player was in order to render the next image. “You can only run that so fast,” he said.
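The pipeline McCauley describes can be sketched in a few lines (a hypothetical illustration, not Oculus code): each camera frame must pass through image analysis before the VR software can extrapolate the head pose slightly ahead of the last measurement. The velocity and positions are made-up numbers; only the 60-frames-per-second figure comes from the article.

```python
# Camera-based tracking updates arrive only once per video frame,
# so the renderer extrapolates the pose to hide that latency.

CAMERA_HZ = 60                      # frame rate cited in the article
FRAME_PERIOD = 1.0 / CAMERA_HZ      # ~16.7 ms between position updates

def predict_position(pos_m, velocity_mps, lead_time_s):
    """Linear extrapolation: render for where the head will be, not where it was."""
    return pos_m + velocity_mps * lead_time_s

# Head moving at 0.5 m/s; last measured x-position 0.10 m.
predicted = predict_position(0.10, 0.5, FRAME_PERIOD)
print(round(predicted, 4))  # one frame of prediction shifts the estimate ~8.3 mm
```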

A better approach, McCauley determined, is that used in the HTC Vive, due to ship to consumers next year. The Vive tracking system, which the company calls Lighthouse, relies on two infrared laser scanners placed in the room, and photo sensors on the Vive headset that register the passing of the laser beam.

“The computation is simple,” McCauley said. “No frame buffer, no processing to speak of; you just calculate the timing.”
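The “just calculate the timing” idea can be sketched as follows (a simplified illustration of the Lighthouse-style scheme, not Valve’s actual code): the beam rotates at a known angular rate, so the time elapsed between the sync pulse and the moment the beam hits a photosensor directly gives the angle to that sensor. The 1.5 ms hit time is an assumed example value.

```python
import math

SWEEP_HZ = 100                       # ~100 sweeps per second, per the article
OMEGA = 2 * math.pi * SWEEP_HZ       # angular rate of the rotating beam, rad/s

def angle_from_timing(t_hit_s, t_sync_s):
    """Angle of the sensor relative to sweep start, from timing alone."""
    return OMEGA * (t_hit_s - t_sync_s)

# Beam reaches the sensor 1.5 ms after the sync pulse:
theta = angle_from_timing(0.0015, 0.0)
print(round(math.degrees(theta), 1))  # 54.0 degrees
```

With several sensors at known positions on the headset, a set of such angles from two base stations is enough to solve for the headset’s pose, with no frame buffer or image processing involved.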

The Vive system, however, has limitations. It uses motors to move the mirrors that direct the laser beams, so the beams end up sweeping the room only about 100 times per second.

“I want to run at 1 kilohertz,” McCauley said. He decided motors wouldn’t get him there, so he turned to MEMS mirrors. Here’s where the tinkering came in. He took apart a laser printer to examine its motor assembly, then took apart a picoprojector that used MEMS mirrors to steer beams. He concluded that MEMS scanning is a far better approach.
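The stakes of that tenfold speedup are simple arithmetic on the two rates quoted above: a faster sweep shrinks the worst-case gap between position fixes accordingly.

```python
# Worst-case interval between position updates at each sweep rate.
vive_interval_ms = 1000.0 / 100      # motor-driven sweep: 10 ms at 100 Hz
target_interval_ms = 1000.0 / 1000   # MEMS-mirror target: 1 ms at 1 kHz
print(vive_interval_ms, target_interval_ms)  # 10.0 1.0
```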

It’s a work in progress.

“I’m going to use a couple of MEMS mirrors and a laser,” he said. “It will be portable; I will be able to stick it on the wall and walk around in my [VR] play volume.”

He’s also been thinking about another big problem with VR: improper audio modeling—but his speaking time was up, so he took those thoughts with him back to his lab.
