Turn Any Surface Into a Trackpad

Surface acoustic wave sensors make clothes and countertops into touch interfaces


[Image: an arm pointing at a wood desktop. Credit: Interactive Sensing and Computing Lab]

Everyday surfaces, from kitchen countertops to jacket sleeves and more, can be transformed into trackpads with the help of tiny microphones that can pick up sound waves traveling across surfaces, a new study finds.

Scientists have long explored using acoustic sensors to detect the vibrations from touches as simpler alternatives to touchscreens. However, acoustic sensors such as accelerometers, conventional microphones, and contact microphones are typically restricted to low bandwidth, limiting their potential applications.

So instead of depending on traditional sound waves that zip through the air, researchers led by Alanson Sample, a computer scientist at the University of Michigan, in Ann Arbor, explored using acoustic waves that ripple across the surfaces of solid objects whenever something touches them. Mobile phones and other devices have long relied on surface acoustic waves (SAWs) to help manipulate radio signals.

Video: SAWSense: Using Surface Acoustic Waves for Surface-bound Event Recognition (youtu.be)

The scientists experimented with voice pickup units (VPUs), acoustic sensors originally designed for earbuds to capture sound waves traveling from the vocal cords to the inner ear, allowing a person wearing them to speak and be heard clearly in a crowded, noisy environment. These sensors are hermetically sealed, helping them reject sound waves traveling through the air while capturing SAWs through contact.

The key components of these 2.65-by-3.5-millimeter VPUs are fabricated using standard microelectromechanical systems (MEMS) methods. This gives VPUs “a much wider sensing range than traditional contact microphones,” says Sample.

The result was an acoustic contact microphone “that is extremely wideband—20 hertz to 20 kilohertz,” Sample notes. In combination with the research team’s custom-designed signal-processing and machine-learning pipeline, the new system could sense and identify SAWs from a much richer set of touch events, gestures, and objects than previous sensors “anywhere on a continuous surface without interference from background noise,” he says.
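The paper does not spell out the pipeline here, but a classification system like this typically turns each short audio clip into spectral features and matches them against learned examples. The sketch below is a minimal, hypothetical illustration of that idea (log-power spectra plus a nearest-centroid classifier); the function names, frame sizes, and classifier choice are assumptions, not the team’s actual design.

```python
# Hypothetical sketch of a SAW event-classification pipeline: frame a
# contact-microphone signal, summarize it as a log-power spectrum, and
# classify with nearest centroid. Illustrative only, not the paper's method.
import numpy as np

FS = 44_100  # assumed sample rate covering the full 20 Hz - 20 kHz band


def spectral_features(signal, frame=1024, hop=512):
    """Log-power spectrum averaged over frames (a crude spectrogram summary)."""
    window = np.hanning(frame)
    frames = [signal[i:i + frame] * window
              for i in range(0, len(signal) - frame + 1, hop)]
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    return np.log1p(power).mean(axis=0)  # one feature vector per clip


def train_centroids(clips, labels):
    """Average the feature vectors of the training clips for each label."""
    feats = {lab: [] for lab in set(labels)}
    for clip, lab in zip(clips, labels):
        feats[lab].append(spectral_features(clip))
    return {lab: np.mean(v, axis=0) for lab, v in feats.items()}


def classify(clip, centroids):
    """Return the label whose centroid is closest in feature space."""
    f = spectral_features(clip)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))
```

A sharp tap produces a broadband click while a swipe concentrates energy at low frequencies, so even this crude feature separates them.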

All in all, the new system, named SAWSense, “allows us to create an input interface on nearly any object or device and detect a wide range of activities on surfaces like tables and floors,” Sample says. “One advantage is that a product designer only needs to add one sensor to an object rather than multiple buttons or a touch sensor.”

The scientists experimented with adding gesture interfaces to a wide range of surfaces and objects. For instance, on traditional flat surfaces such as a desk, SAWSense could recognize trackpad-style gestures such as taps and swipes. With the aid of a second VPU sensor, it could also detect the direction of these gestures. “We added the VPU sensors to the feet of a laptop, effectively creating a virtual trackpad on the table, enabling users to browse the Web and PDFs,” Sample says.
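The article does not say how the second VPU resolves direction, but a natural mechanism is timing: a surface wave reaches the nearer sensor first, and cross-correlating the two channels recovers that lag. The sketch below illustrates this assumed approach; the function names and the two-sensor geometry are hypothetical.

```python
# Hypothetical sketch of two-sensor direction detection: cross-correlate the
# two channels to estimate which sensor the surface wave reached first.
# An illustrative assumption, not the paper's documented method.
import numpy as np


def lag_samples(ch_a, ch_b):
    """Positive lag means the event reached sensor A before sensor B."""
    corr = np.correlate(ch_b, ch_a, mode="full")
    return np.argmax(corr) - (len(ch_a) - 1)


def swipe_direction(ch_a, ch_b):
    """Infer which sensor the gesture started nearer to."""
    return "toward B" if lag_samples(ch_a, ch_b) > 0 else "toward A"
```

With the sensors mounted at opposite feet of a laptop, a swipe that starts near sensor A registers there a few samples earlier, yielding a positive lag.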

In addition, the researchers used SAWSense to add touch interfaces to more unusual geometries, detecting head pats and belly rubs on a toy dragon. They could also enhance fabric items such as jacket sleeves, showing that SAWs are not limited to rigid surfaces.

Moreover, the scientists investigated what activities SAWSense could help recognize besides gestures on surfaces. “For example, we analyzed common meal prep tasks such as chopping, whisking, peeling, blending, mixing, boiling water and found that our system could easily detect these events anywhere on the surface,” Sample says. All in all, it could identify 16 different cooking-related tasks.

In all these examples, SAWSense performed with more than 97 percent accuracy. This new sensing technique “will allow product designers to create lower-cost and more effective touch interfaces into our favorite electronic devices, and enable new smart-home applications by allowing computing systems to better understand users’ activities and needs,” Sample says.

The scientists detailed their findings at the ACM CHI Conference on Human Factors in Computing Systems in Hamburg on 24 April.
