Gadget Hears What You’re Eating

Photo: a 1980s man holding a banana to his ear.

Your Fitbit (or whatever it is the activity-enlightened wear these days) can make a pretty good guess at how many calories you’re burning through. And it can do it without any input from you. But if you want to keep track of how many you’re putting in, you’ll still need to do some work yourself, even if it’s only choosing from a menu on an app.

Inspired by that asymmetry, State University of New York at Buffalo computer scientist Wenyao Xu and colleagues at Northeastern University in China developed AutoDietary, a necklace-like gadget that attempts to tell what you’re eating.

The device senses sounds from your neck to categorize your meal, digitizing and segmenting the audio and sending it to a smartphone over a Bluetooth connection. Software on the phone uses machine-learning techniques to analyze the distinct sounds of the first three mechanical stages of digestion—biting, chewing, and swallowing. Each type of food yields a sonic signature distinct enough that Xu’s team can so far tell apples, carrots, potato chips, cookies, peanuts, walnuts, and water apart with about 85 percent accuracy. (The system was best at identifying water and worst at peanuts.)
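The broad shape of that pipeline—frame the audio, extract acoustic features, classify each segment—can be sketched as below. Everything here is illustrative, not the published system: the energy and zero-crossing features, the nearest-centroid classifier, and the synthetic "crunchy" versus "water" signals are all stand-ins for whatever features and model the AutoDietary software actually uses.

```python
import numpy as np

def segment(signal, frame_len=256):
    """Split a 1-D audio signal into non-overlapping frames."""
    n = len(signal) // frame_len
    return signal[: n * frame_len].reshape(n, frame_len)

def extract_features(frame):
    """Toy acoustic features: short-time energy and zero-crossing rate."""
    energy = np.mean(frame ** 2)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2
    return np.array([energy, zcr])

class NearestCentroid:
    """Minimal stand-in for the phone-side machine-learning classifier."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {c: np.mean([x for x, lab in zip(X, y) if lab == c], axis=0)
                          for c in self.labels}
        return self

    def predict(self, features):
        return min(self.labels,
                   key=lambda c: np.linalg.norm(features - self.centroids[c]))

# Synthetic stand-ins for throat sounds: broadband noise for a crunchy food,
# a quiet low-frequency tone for drinking water.
rng = np.random.default_rng(0)
crunchy = rng.normal(0.0, 1.0, 2048)
water = 0.2 * np.sin(2 * np.pi * 5 * np.arange(2048) / 2048)

# Train on labeled frames, then classify a fresh noisy (crunchy-like) frame.
X, y = [], []
for label, sig in (("crunchy", crunchy), ("water", water)):
    for frame in segment(sig):
        X.append(extract_features(frame))
        y.append(label)

model = NearestCentroid().fit(X, y)
guess = model.predict(extract_features(rng.normal(0.0, 1.0, 256)))
```

Crunchy foods are broadband and energetic, so their frames land far from the quiet, narrowband water frames in feature space—which is the same intuition behind the team's 85-percent result, even though their real feature set is surely richer.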

To get that far required a good deal of work. Xu and his team started by studying how people eat—how many times they chew, how quickly they chew—in order to properly program the device. The system also had to filter out the noise of other body sounds, Xu says. Once they got it working, they reduced its power consumption by adding an eating-sound-activated trigger, so the device won’t waste power while you contemplate your next forkful.
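An eating-sound-activated trigger like the one described can be sketched as a simple energy gate: frames below a loudness threshold are dropped before they reach the radio or the classifier. The threshold value and function names below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def eating_triggered(frames, threshold=0.05):
    """Forward a frame only when its short-time energy suggests eating
    sounds, letting the Bluetooth radio and classifier sleep otherwise."""
    return [f for f in frames if np.mean(f ** 2) > threshold]

rng = np.random.default_rng(1)
silence = 0.01 * rng.normal(size=256)   # faint sensor noise between bites
chewing = rng.normal(size=256)          # loud, broadband chewing burst
active = eating_triggered([silence, chewing, silence])
```

Only the chewing frame passes the gate, so the expensive downstream stages run a fraction of the time—the kind of duty-cycling that makes an always-on wearable practical.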

And there’s more to be done, according to a report that appeared in IEEE Sensors Journal last month. Xu and his colleagues intend to reduce the size of the processing and transmission portion of the device to about the scale of a USB fob. They also hope to use the system to categorize more types of foods and even figure out the volume of food you’re eating.

“Our ambition is to categorize all foods,” says Xu. “But sound may not be enough.” They’ll need to incorporate other types of sensors to tell the difference between very similar foods. After all, Corn Flakes probably sound just like Frosted Flakes, but the sugar coating on the latter would make a big difference to your diet.


The Human OS

IEEE Spectrum’s biomedical engineering blog, featuring the wearable sensors, big data analytics, and implanted devices that enable new ventures in personalized medicine.

