Last year at CES, we experienced a very cool demo from Ultrahaptics of an ultrasound-based gesture interface that provides invisible tactile feedback in mid-air. This year, the Bristol, England-based start-up is showing how its technology can be embedded into devices like cars, stereos, and stoves. And it's exactly as magical as we were hoping it would be.
Ultrahaptics' tactile interface is based on a Leap Motion sensor that tracks the location of your hand in space paired with an array of ultrasonic transducers. The transducers generate ultrasonic waves that constructively interfere with each other where they intersect, generating targeted points of invisible turbulence that you can feel.
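The focusing trick described above works like a phased array: each transducer is driven with a phase offset proportional to its distance from the desired focal point, so every wave arrives there in phase and the interference is constructive. Here's a minimal sketch of that geometry; the 40 kHz frequency, 4x4 grid layout, and focal point are illustrative assumptions, not Ultrahaptics' actual parameters:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature
FREQ = 40_000.0         # 40 kHz, a common ultrasonic transducer frequency (assumption)

def phase_delays(transducer_positions, focal_point):
    """For each transducer, compute the phase offset (radians) that makes
    its wave arrive at the focal point in phase with all the others."""
    wavelength = SPEED_OF_SOUND / FREQ
    delays = []
    for pos in transducer_positions:
        dist = math.dist(pos, focal_point)
        # A wave travelling distance d accumulates phase 2*pi*d/wavelength;
        # advancing each element by that phase aligns all arrivals.
        delays.append((2 * math.pi * dist / wavelength) % (2 * math.pi))
    return delays

# A 4x4 grid of transducers at 1 cm pitch, focusing 20 cm above its center.
grid = [(0.01 * i, 0.01 * j, 0.0) for i in range(4) for j in range(4)]
delays = phase_delays(grid, (0.015, 0.015, 0.20))
```

Steering the focal point around in real time (to track your hand, say) is then just a matter of recomputing these offsets for each new target.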
Leap Motion sensor paired with Ultrahaptics’ ultrasonic transducer array. Photo: Evan Ackerman/IEEE Spectrum
Ultrahaptics showed off a few new demos in a private suite at CES last week that we got to experience for ourselves. The most impressive one was definitely the stove, where you can control the temperature of four individual burners by waving your hand around above the space where the temperature knobs would be if this wasn't a stove from the future.
Ultrahaptics’ prototype for invisible tactile stove controls. In the future, the square transducer array could be invisibly embedded around the perimeter of the stove itself. Photo: Evan Ackerman/IEEE Spectrum
There are four discrete controls (one for each burner) that you can feel by moving your hand above Ultrahaptics' ultrasonic transducer array on the right side of the stove. It's not like you can feel an actual knob or anything, but as you move your hand around, you absolutely feel these four tangible regions in space that correspond to the burner controls. Each is a little bit like having an invisible, silent bumblebee right under your palm or fingertips. To turn a burner on, "tap" your hand in mid-air, and the system will register the action with a soft tactile explosion across your palm, the haptic equivalent of a "click." There's also a slider control: if you make a pinching gesture with your finger and thumb, the system recognizes it, and you'll feel a gentle buzzing. Then, just move your hand laterally back and forth to adjust the temperature of the stovetop up and down.
Ultrahaptics’ interface embedded into a speaker system to control volume as well as change songs. Photo: Evan Ackerman/IEEE Spectrum
This kind of interface is so foreign that it takes a little bit of practice to get comfortable with, but after waving my hand around haphazardly for a minute, I was able to control the stove as dexterously as if there'd been physical controls there. Once you get a sense of what sorts of tactile sensations to expect, and what each of those sensations means, it's a very natural control system to adapt to.
While the video shows actual knobs and dials, that's not really what you should expect to feel from Ultrahaptics' system. The minimum point size that the system can create (constrained by the wavelength of the ultrasound) is 8.6 mm in diameter. By moving those points around quickly, or by using many of them at once, you can make three-dimensional shapes, although not with much fine detail. Force fields are realistic, though, and Ultrahaptics suggests that they might be a useful feature to add to stovetops: if a burner is hot, you'd feel a tangible force field before you got too close to the surface itself.
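That 8.6 mm figure follows directly from the physics: the focal point can't be much smaller than one wavelength, and for a ~40 kHz transducer (an assumption; the article doesn't state the operating frequency), the wavelength of sound in air works out to about 8.6 mm:

```python
speed_of_sound = 343.0   # m/s in air at roughly room temperature
frequency = 40_000.0     # Hz; 40 kHz assumed, typical for airborne ultrasound
wavelength = speed_of_sound / frequency
print(f"{wavelength * 1000:.1f} mm")  # prints "8.6 mm"
```

A higher frequency would shrink the focal point, but ultrasound attenuates faster in air as frequency rises, so there's a practical trade-off.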
It's true that you could accomplish all of these control interactions by just waving your hand around in front of the Leap Motion sensor. But without tactile feedback, it's very difficult to execute fine control over specific areas in 3D space, since you have no idea whether or not your hands and fingers are in the right spot. Tactile feedback like this means that you can execute fine control in situations where you don't want to touch anything (like while you're cooking) or when you can't look at a display (like while you're driving). It also opens up the potential for all kinds of different form factors for future electronics, because you can potentially do away with buttons and dials completely.
Right now, Ultrahaptics is using the same sort of transducers that you might have in your car to alert you when you're about to back into something. They work fine, but they're a bit bulky for consumer device integration. The next generation of transducer (already in development) will be about 1 mm in height and one-fortieth the volume, although it's going to take some work to crank the power high enough for tactile use. On the software side, version 2 (currently under development) will enable "fully tailored sensations," like very fine recreations of textures that update at 10,000 Hz (as opposed to the 200 Hz of the version 1 software). Feeling a texture is, fundamentally, just your fingertips sensing a pattern of changing vibrations, and with fine enough resolution, Ultrahaptics should be able to fool your brain into feeling just about anything.
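One way to see why that update rate matters for textures (my framing, not Ultrahaptics'): sliding a finger across a repeating texture turns spatial detail into a vibration at speed divided by period hertz, and a discrete update loop can only render vibrations up to half its rate. A back-of-the-envelope sketch, assuming a 100 mm/s finger slide:

```python
def felt_vibration_hz(slide_speed_mm_s, texture_period_mm):
    # Sliding over a repeating texture produces vibration at speed / period Hz.
    return slide_speed_mm_s / texture_period_mm

def finest_period_mm(update_rate_hz, slide_speed_mm_s):
    # The update loop can render vibrations up to half its rate (Nyquist),
    # which caps how fine a texture it can simulate at a given slide speed.
    return slide_speed_mm_s / (update_rate_hz / 2)

# Version 1 vs. version 2 update rates, at an assumed 100 mm/s slide:
print(finest_period_mm(200, 100))     # 1.0 mm: coarse bumps only
print(finest_period_mm(10_000, 100))  # 0.02 mm: fine-grained texture
```

By this rough estimate, the jump from 200 Hz to 10,000 Hz is the difference between rendering millimeter-scale bumps and textures fine enough to approach the limits of fingertip perception.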
In 2016, Ultrahaptics will be using the £10 million of funding they just got to develop industry partnerships, although they already have a bunch of (secret, so far) customers. They'll also be reducing the size and power draw of the array to make it easier to integrate into devices. If we're lucky, we should see some products being announced this year, and since Ultrahaptics' tech isn't inherently expensive, some of those products might even be affordable.
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.