Tiny Bubbles and Force Fields: Feeling the Virtual World With Ultrasound

Ultrahaptics launches developer’s kit; get ready to feel virtual explosions

A user of Ultrahaptics' ultrasound system puts his hand into a virtual-reality force field and can feel the blast against his palm and fingers.
Photo: Ultrahaptics

I pushed my hand forward, palm down, until I could see it on the laptop screen. My real-life hand interrupted a virtual stream of bubbles that I felt gently popping against my palm.

Wow. I left my hand there for a long time—the touch of bubbles popping is actually quite pleasant, particularly in a virtual world, where you’re not getting wet and slimy from bubble soap.

That was my introduction to Ultrahaptics’ ultrasound peripheral for computing devices. Executives from the U.K.-based company were roaming Silicon Valley last week, meeting with researchers from virtual reality and automotive companies and getting ready to start hiring for a Silicon Valley office.

My colleague Evan Ackerman saw an earlier demo at CES: a stove that lets you feel virtual burner controls above the surface. Since then, the company has dramatically improved the resolution of its virtual objects, boosting the update rate from 200 to 10,000 frames per second. That allows the user to feel distinct textures, not just the general sense that something is virtually there. And this week, the company will start taking preorders for a $2,000 developer's kit, a big price drop from its $20,000 evaluation kit; Ultrahaptics expects the lower price to get many more application creators working with its technology.

CEO Steve Cliffe says Ultrahaptics sold those evaluation kits to automotive manufacturers (for audio and other system controls), to developers of virtual- and augmented-reality games, and to various consumer electronics and industrial product manufacturers, but kept sales small so it could work closely with those companies. The new developer's kit, shipping in January, will include more extensive development tools to go along with the hardware, including what the company calls a “sensation editor” and a few preloaded sensations (including, I hope, those bubbles).

The system uses an array of ultrasound transducers to pulse patterns of sound waves tuned to be felt by human skin. The demos I saw [video, above] used a Leap Motion controller to track the position of the hand relative to the virtual objects, but Cliffe says the system will work with any kind of motion-detection hardware.
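
To make the idea concrete, here is a minimal, hypothetical sketch of that kind of control loop in Python. The HandTracker and HapticArray classes and the inside_virtual_object helper are illustrative stand-ins of my own, not the real Leap Motion or Ultrahaptics APIs; the sketch only shows how tracked hand positions might be turned into ultrasound focal points on the skin.

import time

class HandTracker:
    """Stand-in for a motion tracker (e.g., a Leap Motion controller)."""
    def get_palm_position(self):
        # Real hardware would return the tracked palm position in meters;
        # here we return a fixed point above the array for illustration.
        return (0.0, 0.0, 0.20)  # x, y, z in meters above the array

class HapticArray:
    """Stand-in for an ultrasound transducer array."""
    def set_focal_point(self, x, y, z, intensity):
        # A real device would steer its transducers to focus pressure here.
        print(f"focal point at ({x:.2f}, {y:.2f}, {z:.2f}) m, intensity {intensity:.1f}")

def inside_virtual_object(pos, center=(0.0, 0.0, 0.20), radius=0.05):
    """True if the palm is within `radius` meters of a virtual sphere."""
    return sum((p - c) ** 2 for p, c in zip(pos, center)) <= radius ** 2

tracker, array = HandTracker(), HapticArray()
for _ in range(5):                      # a real loop would run continuously
    palm = tracker.get_palm_position()  # 1. track the hand
    if inside_virtual_object(palm):     # 2. test against the virtual scene
        array.set_focal_point(*palm, intensity=0.8)  # 3. focus ultrasound on the skin
    time.sleep(1 / 100)                 # far slower than the real system's update rate

The real system repeats this kind of cycle thousands of times per second, which is what lets it render textures rather than a single static point of pressure.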

After I finally pulled my hand out of the stream of bubbles, Cliffe switched the system to what he calls a force field: a sheet of what feels like a strong blast of air, about as intense as the output of a commercial hand dryer in a public restroom. You can push your hand through it, either all at once or a finger at a time. Though the force field isn't nearly as intriguing a sensation as the bubbles, the applications are easy to imagine: The whoosh of air could represent a bomb blast or any number of other things in a VR game, or an appliance manufacturer could use it to keep hands away from a hot cooktop.

Finally, I moved to the company's nascent VR demo. Ultrahaptics promises more sophisticated VR experiences soon, but for now the demo involved just a ball and a block. I turned down the opportunity to try it with a VR headset on (I didn't have my motion-sickness ReliefBand with me); instead, I had someone hold the headset pointed at my hands while I watched the demo on a laptop display. With all that in mind, I wasn't expecting much. But, to my surprise, not only did I feel the ball in my hand when I picked it up, but when I turned my hand palm up to cradle the ball, I still felt it sitting in my palm, even though the ultrasound pulses had to have been hitting the back of my hand. “It's amazing what the brain will do to make sense of what you're feeling,” Cliffe told me.

Ultrahaptics is continuing to refine its software, and it expects to soon release a more streamlined version of its transducer array. Truthfully, says cofounder and chief technology officer Tom Carter, “There’s nothing special about our hardware; our secret sauce is the software. At this point, we are a hardware company that wants to be a software company.”

Cliffe expects licensees to start releasing products using this technology in 2017, with the first automotive applications appearing in car models in 2018. Further down the line, Ultrahaptics sees its technology in ATMs (as a security feature: it's harder to tell what someone is doing at an automated teller if you can't see them pressing buttons), in hospital elevator controls (reducing bacteria transmission), in television controls (an ultrasound array in a sound bar would allow fingertip remote control without a physical remote), and in telepresence (use your imagination. Or don't). The company is also funding research into using the technology for the visually impaired. “Because we project the controls on your hand, you don't have to find the control, it finds you,” Carter said.
