
AI-Enabled Device Emits Radio Waves to Wirelessly Monitor Sleep Patterns at Home

A laptop-sized system could make it easier to diagnose and study sleep disorders

A photo illustration shows a laptop-sized device that monitors sleep patterns mounted to a window in a home office
Photo Illustration: Shichao Yue/MIT

Around 50 million people in the U.S. suffer from sleep disorders. To diagnose these disorders, physicians typically require patients to spend a night in a sleep lab hooked up to electrodes and sensors, which can be unpleasant and nerve-racking.

MIT researchers have now come up with a way to wirelessly capture data on sleep patterns from the comfort of a patient’s home. Their laptop-sized device bounces radio waves off a person, and a deep-learning algorithm analyzes the reflected signals to accurately decode the patient’s sleep patterns.

The device could allow experts to monitor someone’s sleep for weeks or months rather than once every few months in an overnight lab. Apart from enabling physicians to diagnose and study sleep disorders, the system could also help them understand how drugs or illnesses such as Parkinson’s disease, Alzheimer’s disease, epilepsy, and depression affect sleep quality.

“Doing this wirelessly in your own bedroom, you could really see the impact of drugs, and progression of diseases by long-term monitoring,” says Dina Katabi, a professor of electrical engineering and computer science at MIT who led the work.

During sleep, we cycle through three different sleep stages: light, deep, and REM. Fitness bands and phone apps use accelerometers to track a person’s sleep patterns, but they don’t produce data on sleep stages that is accurate enough for medical use, Katabi says.

The new radio-frequency (RF) system combines information on breathing, pulse, and movement to decipher sleep stages with 80 percent accuracy, about the same as lab-based EEG tests. The researchers tested the system on 25 volunteers over 100 nights of sleep. They presented their work at the International Conference on Machine Learning on 9 August.

The device transmits RF waves at one-thousandth the power of Wi-Fi signals and picks up signals reflected from walls, furniture, and sleeping subjects, whose tiniest movements change the frequency of the reflected signal. The deep neural network algorithm extracts the relevant sleep-related signals from the jumble of reflected signals and translates the data into meaningful sleep stages.
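The underlying physics can be illustrated with a toy model (this is not the MIT team’s code, and the carrier wavelength and chest-motion amplitude below are illustrative assumptions): a sleeper’s chest rises and falls a few millimetres with each breath, which shifts the phase of the reflected carrier, and a spectral peak in that phase signal reveals the breathing rate.

```python
import numpy as np

# Toy model: chest displacement during breathing modulates the phase
# of a reflected RF carrier (values are illustrative, not the MIT system's).
fs = 100.0                                  # sample rate, Hz
t = np.arange(0, 60, 1 / fs)                # one minute of samples
breath_hz = 0.25                            # 15 breaths per minute
chest_mm = 4.0 * np.sin(2 * np.pi * breath_hz * t)   # chest motion, mm

wavelength_mm = 50.0                        # ~6 GHz carrier, assumed
# A round-trip path change of 2*d shifts the reflected phase by 4*pi*d/lambda.
phase = 4 * np.pi * chest_mm / wavelength_mm
baseband = np.exp(1j * phase)               # idealized reflected signal

# Recover the breathing rate as the dominant peak in the phase spectrum.
unwrapped = np.unwrap(np.angle(baseband))
spectrum = np.abs(np.fft.rfft(unwrapped))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
peak_hz = freqs[1:][np.argmax(spectrum[1:])]   # skip the DC bin
print(round(peak_hz, 2))                       # prints 0.25, i.e. 15 breaths/min
```

In the real system these reflections are buried among echoes from walls and furniture and mixed with pulse and body-movement signals, which is why the researchers use a deep neural network rather than a simple spectral peak to separate and interpret them.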

The MIT team has previously used the same radio-based system to measure walking speed and to detect and analyze emotions.
