Virtual and augmented reality displays are getting very, very good at allowing us to see things that aren’t really there. When paired with a sensing system (like Kinect), we can even interact with these virtual objects. The missing piece here is touch: the ability to feel things that don’t actually exist. Using an array of focused ultrasound that can create patterns of turbulence in the air, computer scientists from the University of Bristol have been able to generate 3D shapes in midair that you can’t see, but that you can touch.
Since the entire point of this system is to create invisible objects, it’s a bit hard to get a sense of what exactly the user experiences when they interact with the “acoustic radiation force field.” Essentially, the ultrasound array creates points in the air where multiple sound waves come together constructively, and the resulting point of focused sound is strong enough to induce a shear wave in your skin as it gets reflected. This sets off your tactile receptors, making it feel like you’re touching something.
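The focusing principle can be sketched in a few lines: each transducer is driven with a phase offset that compensates for its distance to the desired focal point, so all the emitted waves peak there at the same instant. The array geometry, the 40 kHz frequency, and the function names below are illustrative assumptions for the sketch, not the Bristol team’s actual hardware parameters or algorithm.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
FREQUENCY = 40_000.0     # Hz; a common ultrasonic transducer frequency (assumed)
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducer_positions, focal_point):
    """Phase offset (radians) for each transducer so that all waves
    arrive in phase at focal_point and add constructively."""
    distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
    # A wave accumulates 2*pi of phase per wavelength travelled; emitting
    # with the negative of that accumulated phase aligns the peaks at the focus.
    return (-2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

# Toy 4x4 grid of transducers, 1 cm pitch, lying in the z = 0 plane
xs, ys = np.meshgrid(np.arange(4) * 0.01, np.arange(4) * 0.01)
positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])

# Focus 20 cm above the centre of the array
focus = np.array([0.015, 0.015, 0.20])
phases = focus_phases(positions, focus)

# Sanity check: summing all 16 unit-amplitude contributions at the focus,
# the phase terms cancel, so the magnitudes add up to the full 16.
distances = np.linalg.norm(positions - focus, axis=1)
field = np.sum(np.exp(1j * (2 * np.pi * distances / WAVELENGTH + phases)))
print(f"{abs(field):.1f}")  # 16.0: fully constructive at the focal point
```

A real array would solve for phases (and amplitudes) at many focal points simultaneously, which is where the paper’s real-time algorithm comes in; this sketch only shows the single-focus case.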
Test users with zero experience with this sort of thing had no trouble identifying a series of basic 3D shapes, including a cone, a pyramid, a sphere, and a cube. The algorithm that drives the ultrasound array is efficient enough to run in real time, meaning that haptic feedback can be nearly instantaneous and the shapes can change dynamically in response to user input. There are a lot of potential applications here, especially when paired with virtual or augmented reality, as the researchers discuss in their paper:
Inaccessible Objects, such as those in museum cases or inside the human body, can be visually explored through bi-directional mirrors or neurosurgical props. While these methods allow the user to interact with the objects and intersect them with their hands, they offer no haptic feedback. Augmenting with our system enables superior spatial learning and the ability to highlight valuable information through haptic feedback. Figure 13 (left) depicts a surgeon exploring a CT scan with haptic feedback allowing them to feel tumors.
Touchless Interfaces are becoming increasingly common. They afford an intuitive, flexible user interface but lack the haptic feedback provided by physical buttons and controls. In many situations, such as in the cockpit of a vehicle (see Figure 13 (center)), the user will want to operate the system while their eyes are busy on another task. Integrating our system would allow the user to feel the geometry of an interface and localize on a specific item.
Virtual Reality has long been a goal of interactive systems. Haptic feedback provides our sense of proprioception, kinesthesia and touch, making it essential for an effective system. Recent advances in head mounted displays have greatly improved the realism of the visual feedback, yet haptic feedback still requires proxy or wearable haptic devices. This unnatural disconnection breaks the immersion of a virtual reality. Our system would enable users to freely explore the virtual world unencumbered while receiving haptic feedback from the objects that they interact with, as shown in Figure 13 (right).
There are still a few kinks to be worked out. For example, there’s a tradeoff between sampling density (the detail of the shape) and rendering quality (the strength of the field) that needs optimization. The field is also specifically calibrated to work on fingers and hands, so other parts of your body won’t react to it nearly as strongly, and you need to stay within 20 or 30 centimeters of the emitter array for it to work as intended.
The researchers are actively working on these problems, and they seem optimistic that their acoustic radiation force field generator (which is a thing that exists now) is easy enough to use, and capable enough, that it could find its way into consumer applications.
I want one.