Over the last year, we’ve seen a transition from the peak of virtual reality (VR) hype into the much more tempered (and consequently less impressive) reality of commercial VR systems. The experience is still good, or as good as can be expected, but VR in general still suffers from a variety of minor annoyances: a fat cable that tethers your headset to your computer, external tracking systems that have to be set up, and the fact that, because you’re immersed in an artificial reality, everything around you in real life is suddenly a collision hazard.
Intel’s Project Alloy is a virtual reality headset that solves all of these problems. It’s completely tetherless—all of the head and body tracking is contained inside the headset itself. It can sense objects in your environment and convert them into virtual representations of themselves, allowing you to wander around your living room without permanent damage to your shins.
Intel first announced Project Alloy in August at the 2016 Intel Developer Forum as “an all-in-one virtual reality solution made from the ground up.” Inside is an entire computer built around a Core M processor, along with a dedicated vision processor (not yet, but soon to be, a chip from Movidius). The vision processor handles the data coming in from fisheye cameras and a pair of Intel’s RealSense 200 series depth sensors, which Intel says will soon be replaced with the smaller and more efficient RealSense 400 series. The cameras assist an inertial measurement unit (IMU) with motion tracking, which means the headset won’t work quite as well in the dark, although there’s no reason Intel can’t use the RealSense data for pose estimation as well. Otherwise, you’re looking at a conventional 1080p-per-eye display running at 90 frames per second, along with a battery that will keep things running for, Intel hopes, “hours” (which probably means two).
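To give a rough sense of why the cameras and the IMU work together, here’s a minimal sketch of one common way this kind of fusion is done: a complementary filter that blends fast-but-drifting gyroscope integration with a slower, stable orientation fix from camera tracking. Everything here (the function, the parameters, the numbers) is illustrative; Intel hasn’t said what its actual tracking pipeline looks like.

```python
# Hypothetical sketch of IMU/camera fusion via a complementary filter.
# This is NOT Intel's pipeline; it's a generic textbook approach.

def complementary_filter(angle, gyro_rate, camera_angle, dt, alpha=0.98):
    """Blend gyro integration (fast, drifts over time) with a camera-based
    orientation estimate (slow, but doesn't drift).
    alpha near 1 trusts the gyro on short timescales."""
    gyro_estimate = angle + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * camera_angle

# Example: gyro reports 10 deg/s of rotation, while the camera
# insists the true heading is only 0.5 degrees.
angle = 0.0
for _ in range(10):  # ten 10-millisecond steps
    angle = complementary_filter(angle, 10.0, 0.5, 0.01)
# The camera term continually pulls the gyro estimate back toward
# the absolute reference, correcting drift.
```

In the dark, the camera term drops out and the estimate is left to drift on gyro integration alone, which is why vision-assisted headsets like this degrade without light.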
While the experience of Project Alloy is completely in virtual reality (as opposed to augmented reality, which augments the real world with a virtual overlay), a more accurate description that Intel has started to toss around is “merged reality.” In other words, the real world gets merged into the virtual simulation. This is what the RealSense is for: It’s a depth sensor that can create a 3D representation of the world in front of it. Point it at your couch and coffee table and it knows exactly where they are and what they look like.
With this information, Project Alloy can incorporate the couch and coffee table directly into the VR simulation. They don’t have to look like your couch and coffee table in the simulation, but because the simulated stand-ins are about the same size and in the same place, avoiding them in VR means you avoid their real-world counterparts as well, whether it’s a coffee table, couch, chairs, walls, or anything else.
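The core of that idea reduces to something simple: take the cluster of depth-sensor points belonging to a real object and compute a bounding volume where a virtual stand-in can go. Here’s a minimal sketch of that step; the function, the sample points, and the “table” scenario are all made up for illustration.

```python
# Hypothetical sketch: turn a cluster of depth points from one real
# object into an axis-aligned bounding box that a virtual stand-in
# can occupy. The data below is invented, not real sensor output.

def bounding_box(points):
    """Return (min_corner, max_corner) for a list of (x, y, z) points."""
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

# A few depth points sampled off the surface of a real table (meters)
table_points = [(1.0, 0.0, 0.40), (1.6, 0.0, 0.40),
                (1.0, 0.8, 0.40), (1.6, 0.8, 0.45)]

lo, hi = bounding_box(table_points)
# A same-size, same-place virtual object (a control console, say) can
# now be rendered inside this box, so steering around the virtual
# object keeps you clear of the real table too.
```

A real system would segment objects out of a full point cloud and track them as you move, but the same-size, same-place principle is the part that keeps your shins intact.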
The RealSense in Project Alloy is good for more than just static obstacle avoidance in merged reality. The sensor can also identify other people, as well as your own body, and merge them into the VR: being able to see your hands in VR, for example, helps make the experience more immersive, although these features weren’t enabled in the version we tried at CES.
Photo: Evan Ackerman
Upstairs in Intel’s massive CES booth, IEEE Spectrum senior editor Tekla Perry and I got to spend a few minutes trying out Project Alloy for ourselves. The demo took place in a small room with a couch and some tables, and as soon as I put the headset on, I was standing on some sort of quite decently rendered spaceship-y platform. In VR, a control console took up the middle of the floor where the table was in real life, while the couch was some kind of engineering workstation. The mapping wasn’t perfect: reaching out with the controller to touch the virtual console didn’t reliably result in making physical contact with the real table. But everything was accurate enough that I could comfortably make a circuit of the room without running into anything.
Photo: Tekla Perry
The other part of being able to do this, of course, is that the Project Alloy headset is completely self-contained. You’re not dragging a wired connection to a beefy desktop PC behind you, which is what makes walking around possible. To pull this off, Intel has stuffed the headset with batteries and some serious computing hardware, which you pay for in weight and comfort, although I found the balance of the overall system to be unobjectionable. It also means that you’re sacrificing performance, and once the headset batteries run dry, no more VR for you. I’d guess that for many gamers this won’t be acceptable, but for people more interested in the overall VR experience, Project Alloy’s compromises seem well worth it, at least for the 3 minutes I had the thing on.
We should mention that Intel is not the first company to have had this “merged reality” idea: At the 2014 Game Developers Conference in San Francisco, we tried out this thing, called the Sulon Cortex:
Photo: Evan Ackerman
In many ways, the Cortex is just like Project Alloy, without all of the integration: There’s a laser sensor on top to detect walls, an IMU to track the motion of the user, and computers and batteries to power it all. With the Cortex on, the walls of your real environment are mapped onto the walls of your virtual environment, such that you can see a virtual wall, walk up to it and put your hand out, and touch a real wall. Not bad for 2014, although the Sulon website was last updated in March of last year, so we’re not sure what’s happening with it.
As for Project Alloy, Intel doesn’t have any plans to produce a commercial version itself, but a version should be available for developers sometime around the second quarter of this year. Plus, an unspecified partner is supposed to be releasing something commercial later this year.
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.