Tired of trying to tap icons on small smartwatch screens? Some day you could just swipe right through the air above them, thanks to miniaturized radar and accompanying gesture-recognition technology in development at imec, the Belgium-based R&D center.
Imec’s radar chips operate at around 145 GHz, well above the bands used for car radar. That high up in the electromagnetic spectrum, the chip can take advantage of a full 10 gigahertz of bandwidth, which leads to millimeter-scale resolution, its inventors say.
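As a back-of-the-envelope check (the standard pulsed-radar relation, not a figure from imec), range resolution follows from bandwidth as ΔR = c / (2B):

```python
# Standard radar range-resolution relation: delta_R = c / (2 * B).
# The 10 GHz bandwidth figure is from the article; the formula is the textbook one.
C = 3.0e8            # speed of light, m/s
BANDWIDTH = 10.0e9   # 10 GHz of bandwidth available around 145 GHz

def range_resolution(bandwidth_hz: float) -> float:
    """Finest range separation a radar with this bandwidth can resolve, in meters."""
    return C / (2 * bandwidth_hz)

print(range_resolution(BANDWIDTH) * 1000, "mm")  # 15.0 mm
```

That works out to 15 millimeters, matching the resolution figure quoted later in the article.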
“Gestures allow a lot of capabilities where screens are becoming too small for fine movements,” says Kathleen Philips, program director for IoT at imec. “Radar is great for measuring movement; this particular radar is great for measuring micromovements.”
The radar detects motion by sending out a pulse of radiation and measuring the timing of the returning signal. Other gesture recognition technologies exist, such as cameras and time-of-flight sensors, but these are susceptible to lighting conditions and, in a smartwatch scenario, could be easily blocked by your sleeve.
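The timing-to-distance conversion is straightforward: the echo travels out and back, so the target sits at half the round-trip distance. A minimal sketch (the 2-nanosecond delay is an illustrative number, not from the article):

```python
# Pulse round-trip timing: the radar measures the delay between the
# transmitted pulse and its returning echo; the target distance is
# half the round-trip path.
C = 3.0e8  # speed of light, m/s

def echo_distance(round_trip_s: float) -> float:
    """Target distance in meters, given the measured round-trip delay in seconds."""
    return C * round_trip_s / 2

# A hand roughly 30 cm above the watch returns an echo after about 2 ns:
print(echo_distance(2e-9), "m")  # 0.3 m
```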
The new silicon radar can see through your clothes and has a range of up to 10 meters and a resolution of 15 millimeters. That combination lends it to another potential application: vital-signs monitoring. The chip is sensitive enough to detect respiration and heartbeat from the minute movement of your chest, and Philips suggests it could be used to monitor the physical state of drivers to prevent accidents from medical emergencies or fatigue, among other scenarios.
Today’s version of the chip consists of a single transmitter and four receivers, but imec plans to redesign it to have four of each. For some applications, it might also make sense to put the transmitters and receivers on separate chips. When running continuously, the radar burns 500 mW, but it hasn’t been optimized and Philips says it would likely be used at a low duty cycle on a power-constrained portable such as a smartwatch.
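The duty-cycle math is simple: average draw scales linearly with the fraction of time the radar is on. At a hypothetical 1 percent duty cycle (my number, not imec's), the 500-milliwatt figure drops into smartwatch territory:

```python
PEAK_POWER_MW = 500.0  # continuous-operation power draw cited in the article

def average_power_mw(duty_cycle: float) -> float:
    """Average draw when the radar is active only a fraction of the time."""
    return PEAK_POWER_MW * duty_cycle

print(average_power_mw(0.01), "mW")  # 5.0 mW at a hypothetical 1% duty cycle
```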
The imec team has used machine learning algorithms to train the system to recognize a few gestures so far, including swipe (left and right), tap (push down), and click (close your hand). When I tried it myself at a demonstration in Antwerp, the radar was a bit finicky, and its inventors believe that is because it was trained on only a few people—proving the maxim that in machine learning, good training data is key.
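The article doesn't describe imec's actual model or feature set, but the general approach of mapping radar measurements to gesture labels can be illustrated with a toy nearest-centroid classifier over made-up two-dimensional features (say, net range change and net Doppler shift); everything here is hypothetical:

```python
# Toy nearest-centroid gesture classifier over invented 2-D radar features.
# Purely illustrative: the feature names, values, and algorithm are
# assumptions, not imec's actual pipeline.
import math

def train(examples):
    """examples: list of (label, feature_vector). Returns label -> centroid."""
    centroids = {}
    for label, vec in examples:
        totals, n = centroids.get(label, ([0.0] * len(vec), 0))
        centroids[label] = ([t + v for t, v in zip(totals, vec)], n + 1)
    return {lbl: [t / n for t in totals] for lbl, (totals, n) in centroids.items()}

def classify(centroids, vec):
    """Return the label whose centroid is nearest to the feature vector."""
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], vec))

# Two examples each of two gestures, as (range delta, Doppler delta):
data = [("swipe", [1.0, 0.2]), ("swipe", [0.9, 0.1]),
        ("tap", [0.0, -1.0]), ("tap", [0.1, -0.9])]
model = train(data)
print(classify(model, [0.95, 0.15]))  # swipe
```

With only four training examples the classifier is as brittle as the demo I tried, which is exactly the training-data point the researchers made.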
A correction to this article was made on 16 May 2019.