Whether you’re trying to lose weight, eat healthier, or track your diet for other reasons, there’s no easy, automatic way to log your food intake. Today’s best options are apps that require you to pick out the food items you’re eating or enter that information by hand.
But what if the app knew, without your help, that you’d just spent a half hour crunching on cookies?
That’s the idea behind the “Diet Eyeglasses” revealed earlier this month at the IEEE Body Sensor Networks conference in San Francisco. The smart glasses have built-in sensors that detect muscle activity related to chewing, which could enable the continuous and unobtrusive monitoring of every morsel users chow down on.
Oliver Amft, a professor at Germany’s University of Passau, 3-D printed his smart glasses. They look like normal spectacles but feature sensors made from flexible woven-fabric electrodes attached to the frame. The sensors use electromyography (EMG) to detect contractions of the facial muscles involved in chewing. After experimenting with different locations for the sensors, Amft and his students found that tucking them into the frame behind the ears produces the best contact with the user’s skin, and thus the best signal. A small processor and a rechargeable battery are also attached behind the ear.
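The paper’s signal-processing pipeline isn’t described here, but a common first step for this kind of EMG sensing is to rectify the raw signal, compute a sliding-window RMS envelope, and flag chewing wherever the envelope crosses a threshold. The sketch below illustrates that idea in Python; the sampling rate, window length, threshold, and helper names are all illustrative assumptions, not details from Amft’s glasses.

```python
import numpy as np

def rms_envelope(emg: np.ndarray, fs: int, win_s: float = 0.2) -> np.ndarray:
    """Square the EMG signal and take a sliding-window mean, then the
    square root, giving a smooth RMS envelope of muscle activity."""
    win = max(1, int(fs * win_s))
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(emg.astype(float) ** 2, kernel, mode="same"))

def chewing_mask(emg: np.ndarray, fs: int, threshold: float) -> np.ndarray:
    """Boolean mask of samples where the RMS envelope exceeds a threshold
    (the threshold value here is an assumption, not a published figure)."""
    return rms_envelope(emg, fs) > threshold

# Illustrative use with synthetic data: 1 s of baseline noise plus a
# simulated burst of chewing-muscle activity in the middle.
fs = 1000  # Hz, assumed sampling rate
emg = np.random.normal(0, 0.05, fs)
emg[400:700] += np.random.normal(0, 0.5, 300)  # simulated muscle burst
mask = chewing_mask(emg, fs, threshold=0.15)
print(f"Chewing detected in {mask.mean():.0%} of samples")
```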
To test out the specs, eight research subjects chewed various foods, swallowed water, coughed, spoke, and made random head motions. The smart glasses clearly differentiated the EMG patterns produced by chewing from those produced by the other activities. They could also distinguish among the foods included in the test: cookies, bananas, toast, carrots, and a beloved European candy called Jelly Babies. The foods’ textures were distinct enough to produce signature EMG patterns, the researchers showed.
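To give a sense of how those texture signatures might feed a classifier, here is a minimal sketch using an off-the-shelf model. The feature set, the choice of a random forest, and the synthetic data are all invented for illustration; the researchers only report that the five foods produced distinguishable EMG patterns, not how they were classified.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
FOODS = ["cookie", "banana", "toast", "carrot", "jelly baby"]

# Synthetic stand-in for per-chewing-bout features such as
# [mean RMS amplitude, chews per second, burst duration]; in a real
# system these would be extracted from the EMG envelope.
X = rng.normal(loc=np.repeat(np.arange(len(FOODS)), 40)[:, None],
               scale=0.8, size=(200, 3))
y = np.repeat(FOODS, 40)

# Train and evaluate a generic classifier on the synthetic bouts.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```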
However, the researchers say it’s doubtful that the glasses alone can distinguish between a wide variety of food types. You can see the difficulty of relying solely on texture information: Could such a system tell the difference between eggs and oatmeal? But the glasses could be used in combination with other sensors, the researchers suggest, to create “personalized dietary monitoring systems.”
There are a number of options for sensors that use other modalities to track food consumption. One gadget covered recently in this blog listens in on the user’s meal; its microphones attach to the user’s neck and detect the sounds of biting, chewing, and swallowing to classify food items. In another approach, a team of experts in natural language processing and machine learning devised a system that could parse a user’s natural-language description of a meal and retrieve the relevant nutritional information.