Accelerometers introduced smartphone users to many handy new features—recording the distance you walk and automatically rotating the view when you turn the phone sideways, to name just two you probably used today. Though these sensors are generally quite helpful and accurate, computer scientists from the University of Michigan have just found a way to scam them.
A research team figured out that they could fool accelerometers using sound waves—in particular, a single tone played at an accelerometer’s resonant frequency. With it, they can exploit either of two signal-processing components that handle the sensor’s output, causing the phone to receive a false report of the accelerometer’s behavior. The group is led by Kevin Fu of the University of Michigan and includes collaborators from the University of South Carolina.
Patrick McDaniel, a security researcher at Pennsylvania State University, says the security risk of the particular scheme devised by Fu’s group is low. But the broader problem is a big one in the industry: Devices and software programs tend to blindly trust any data gathered from built-in sensors.
On Tuesday, the Industrial Control Systems Cyber Emergency Response Team of the U.S. Department of Homeland Security issued a public alert about the findings.
The Michigan group’s work underscores the point that any device that relies on data from a sensor to make a critical decision can potentially be led astray by that sensor. Besides smartphones, accelerometers are also used to activate airbags in motor vehicles, and to measure the rate and depth of chest compressions during CPR.
“If you're trusting your sensor inputs and you have no way to validate those inputs, you're going to have problems,” McDaniel says.
The University of Michigan team tested 20 models of capacitive microelectromechanical systems (MEMS) accelerometers from five manufacturers: Bosch, STMicroelectronics, InvenSense, Analog Devices, and Murata Manufacturing. (The model numbers are listed here.)
They found that 75 percent of the accelerometers could be fooled by an attack that let the researchers briefly skew the sensors’ readings, and 65 percent were vulnerable to a more severe attack that let them control the sensors’ output indefinitely.
Fu says he hopes the work—which he calls a proof of concept—will start a conversation in the industry: “We need to question, why do we trust our sensors?” he says.
Their trick was possible because sound waves impart a physical force to any object they encounter. The University of Michigan team essentially used the vibrations produced by sound waves to alter accelerometers’ records of what was happening to them.
An accelerometer contains a small mass suspended on springs. When a device moves, the mass moves too, changing the capacitance (the ability to store charge) between the mass and the fixed electrodes around it; circuitry converts that change into a measurement of acceleration. By using sound waves to vibrate that mass in a particular way, the group launched a series of attacks on the unsuspecting sensors.
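The mass-on-springs description can be sketched as a toy simulation. The model below is a generic damped mass-spring system with invented parameters (resonant frequency, quality factor, electrode gap, nominal capacitance); it is not based on any real part’s datasheet.

```python
# Toy model of a capacitive MEMS accelerometer: a proof mass on springs,
# read out as a change in capacitance. All parameter values are invented.
import math

def simulate_displacement(accel, dt=1e-5, f0=5_000.0, q=10.0):
    """Integrate x'' + (w0/q) x' + w0^2 x = -a(t) for the proof mass."""
    w0 = 2 * math.pi * f0              # resonant angular frequency
    x, v = 0.0, 0.0
    out = []
    for a in accel:
        v += (-a - (w0 / q) * v - w0 * w0 * x) * dt   # semi-implicit Euler
        x += v * dt
        out.append(x)
    return out

def capacitance_change(x, gap=1e-6, c0=1e-12):
    """Differential capacitance: displacement narrows one electrode gap
    and widens the other; readout circuitry turns this into acceleration."""
    return c0 * gap * (1 / (gap - x) - 1 / (gap + x)) / 2
```

Under a constant acceleration the mass settles at a fixed offset, so the capacitance change tracks acceleration. A tone at the mass’s resonant frequency, by contrast, drives it far harder than the tone’s modest pressure alone would suggest, which is what the attacks below exploit.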
The group first had to identify the resonant frequency of each accelerometer. At resonance, each incoming sound wave reinforces the push of the previous one on the mass, producing a much larger signal than at any other frequency. To find it, the team played tones at progressively higher frequencies, from 2 kilohertz to 30 kilohertz, until the accelerometer produced an outsized reaction.
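That sweep is simple to sketch in code. Everything below is hypothetical: `response_amplitude` stands in for “play a tone, read the sensor,” and the 19-kilohertz resonance it peaks at is an arbitrary choice, not a measured value from the study.

```python
import math

def response_amplitude(f, f0=19_000.0, q=30.0):
    """Stand-in for 'play a tone at f hertz and read the sensor': a
    resonance curve peaking at f0, which the attacker does not know."""
    r = f / f0
    return 1.0 / math.sqrt((1 - r * r) ** 2 + (r / q) ** 2)

def find_resonance(measure, f_lo=2_000, f_hi=30_000, step=100):
    """Sweep tones from f_lo to f_hi hertz and return the frequency
    that provokes the largest accelerometer response."""
    best_f, best_amp = f_lo, 0.0
    for f in range(f_lo, f_hi + 1, step):
        amp = measure(f)   # against real hardware: play tone, read output
        if amp > best_amp:
            best_f, best_amp = f, amp
    return best_f

# find_resonance(response_amplitude) -> 19000 with this toy response curve
```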
Next, the team subjected the sensors to two types of attacks using sound waves at the resonant frequency. The first, called output biasing, exploits the low-pass filter, a signal-processing component that is supposed to strip out high-frequency interference. This technique can be used to slightly alter the readings the accelerometer produces for several seconds.
The second, called output control, takes advantage of the amplifier in the sensor’s circuitry, which handles the raw signal even before it reaches the low-pass filter. This method can be used to take control of the accelerometer’s output indefinitely and produce false signals.
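One way to picture output control: if the amplifier saturates asymmetrically, an amplitude-modulated tone at the resonant frequency leaves behind a low-frequency residue that tracks the attacker’s modulation. The sketch below is a toy demonstration of that idea, with invented clipping levels, frequencies, and filter, not a reconstruction of the team’s actual method.

```python
import math

def clip(x, lo=-1.5, hi=1.0):
    """Amplifier with asymmetric saturation (invented levels)."""
    return max(lo, min(hi, x))

def lowpass(samples, n):
    """Crude low-pass filter: the mean over each carrier period of n samples."""
    return [sum(samples[i:i + n]) / n for i in range(0, len(samples) - n + 1, n)]

f0, fs = 19_000, 19_000 * 32               # carrier frequency, sample rate
n = fs // f0                               # samples per carrier period
# Attacker's desired waveform: a slow ramp, encoded as the tone's envelope.
envelope = [1.0 + 0.8 * i / (fs // 10) for i in range(fs // 10)]   # 0.1 s
tone = [a * math.sin(2 * math.pi * f0 * i / fs) for i, a in enumerate(envelope)]
baseband = lowpass([clip(s) for s in tone], n)
# baseband starts near zero and drifts steadily negative as the envelope
# grows past the clipping level: the ramp reappears at low frequency.
```

The point of the sketch is that once the signal chain distorts the resonant tone, varying the tone’s loudness lets the attacker draw an arbitrary slow waveform into the sensor’s output.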
To show that it was possible to spoof accelerometers with these techniques, the group used each method to spell out “WALNUT” in a chart of the sensors’ acceleration over time.
Next, they wanted to use sound waves to hack actual devices, so they reached for a Samsung Galaxy S5, which comes with an MPU-6500 accelerometer from InvenSense. They loaded a music video with the accelerometer’s resonant frequency embedded in it, and remotely prompted the phone to play the video.
At the same time, they ran a game on the phone called Spy Toys that relies on the accelerometer to control a toy car. While the video played, the toy car accelerated or decelerated in accordance with the pulses of the signal they had embedded in the video.
In their final demo, they used an off-the-shelf speaker to play a tone that caused a Fitbit to log 2,100 steps in just 40 minutes, earning them 21 reward points on a health-tracking site (they declined to cash in the points, citing ethical concerns).
Though these scams are certainly possible, they are not subtle: the attacker must be within close range of the device they wish to target and must know the model and resonant frequency of the accelerometer inside.
In the group’s own example, an attacker would have to stand behind the owner and blare an audio track to take control of the Spy Toys car, or somehow prompt the owner’s phone to start playing the resonance frequency—perhaps by sending them to a website that automatically plays the track once they arrive.
“It falls into that kind of cool, but not something that would keep me awake at night, type of vulnerability,” McDaniel says.
Still, to prevent any issues, Fu suggests that accelerometer designers choose a resonant frequency in the ultrasound range, which is harder to generate with off-the-shelf speakers. Encasing an accelerometer in foam also keeps sound waves from reaching it, though that isn’t always practical.
Through the University of Michigan, the research team is also attempting to sell manufacturers software that, it says, can prevent such attacks in products already on the market.
Just in case, McDaniel thinks entrepreneurs and consumers should be wary of turning too many decisions and responsibilities over to any devices that rely on sensor data, until the industry figures out how to better validate that data.
“If you're using that sensor input for a security critical decision, well then that's something we really need to worry about,” McDaniel says. “If we can't be sure they're trustable, we need to limit the kind of security decisions we're making off of them.”