Smartphone Camera Senses Patients' Pulse, Breathing Rate

AI app could enable doctors to take contactless vitals during telemedicine visits

Telehealth visits increased dramatically when the pandemic began—by over 4000% in the U.S., by one account. But there’s a limit to what doctors can accomplish during these virtual appointments. Namely, they can’t check patients’ vital signs over the phone.

But new technologies in the works could change that by equipping phones with reliable software that can measure a person’s key biometrics. This month at a conference held by the Association for Computing Machinery, researchers presented machine learning systems that can generate a personalized model to measure heart and breathing rates based on a short video taken with a smartphone camera.  

With just an 18-second video clip of a person’s head and shoulders, the algorithm can determine heart rate, or pulse, from changes in the intensity of light reflected off the skin. Breathing rate, or respiration, is gleaned from the rhythmic motion of the head, shoulders, and chest.
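
How the respiration half of that works can be illustrated without any machine learning at all. The sketch below, in Python, is a generic signal-processing baseline rather than the researchers’ model: it assumes some face tracker has already produced a per-frame vertical head-position trace (a hypothetical input here), band-passes the trace to plausible breathing frequencies, and reads the rate off the dominant spectral peak.

```python
# Illustrative baseline, not the researchers' model: estimate breathing
# rate from the slow, rhythmic vertical motion of the head and shoulders.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def breathing_rate_bpm(y_pos, fps):
    """Estimate breaths per minute from a per-frame vertical-position trace."""
    x = np.asarray(y_pos, dtype=float)
    x -= x.mean()  # drop the static head position, keep the oscillation
    # Band-pass to plausible respiration frequencies (~6-30 breaths/min).
    sos = butter(3, [0.1, 0.5], btype="bandpass", fs=fps, output="sos")
    x = sosfiltfilt(sos, x)
    # The dominant spectral peak of the filtered motion is the breathing rate.
    n = 8 * len(x)  # zero-pad the FFT for finer frequency resolution
    freqs = np.fft.rfftfreq(n, d=1.0 / fps)
    return freqs[np.argmax(np.abs(np.fft.rfft(x, n=n)))] * 60.0

# Synthetic demo: an 18-second, 30-fps trace oscillating at 15 breaths/min.
fps = 30
t = np.arange(18 * fps) / fps
trace = 2.0 * np.sin(2 * np.pi * 0.25 * t)  # 0.25 Hz = 15 breaths/min
trace += 0.3 * np.random.default_rng(0).standard_normal(t.size)
print(f"{breathing_rate_bpm(trace, fps):.1f} breaths/min")  # ≈ 15.0
```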

Daniel McDuff, a principal researcher at Microsoft Research, and Xin Liu, a PhD student at the University of Washington, developed the system. “Currently there’s no way to do remote vitals collection except for a very small minority of patients who have medical-grade devices at home,” such as a pulse oximeter to detect heart rate and blood oxygen level, or a blood pressure cuff, says McDuff.

Most people don’t own those devices, so for the vast majority of virtual appointments, patients must arrange separate in-person appointments to get these measurements. “That’s doubly inefficient. It takes twice the amount of time as a typical in-person visit, and with less human interaction,” McDuff says.

Video-based software that can collect vitals during a telehealth appointment would greatly streamline virtual health care. Work on this type of technology arose around 2007, when digital cameras became sensitive enough to pick up small pixel-level changes in skin that indicate blood volume. The field saw a fresh wave of interest after telehealth visits increased during the early part of the COVID-19 pandemic.

Several groups globally have been developing non-contact, video-based vitals sensing. A group out of Oxford is developing optical remote monitoring of vitals for patients in hospital intensive care units or undergoing kidney dialysis. Rice University researchers are developing a device that monitors vehicle drivers for heart attacks.

Google in February announced that its Android-based health tracking platform Google Fit will measure heart and respiratory rate using the phone’s camera. To measure heart rate, the user places a finger over the phone’s rear-facing camera; breathing rate is derived from video of the user’s face. The software is meant for wellness purposes rather than medical use or doctor visits.

The challenge facing researchers in this field is developing technologies that work consistently and accurately in real-world settings, where faces and lighting vary. The approach developed by McDuff and Liu aims to address that.

In their approach, heart rate is determined by measuring light reflected from the skin. “Variations in blood volume influence how light is reflected from the skin,” says McDuff. “So the camera is picking up micro changes in light intensity, and that can be used to recover a pulse signal. From that we can derive heart rate variation and detect things like arrhythmias.”
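
That recovery step has a well-known non-learned baseline: average the green channel, which carries the strongest blood-volume signal, over a patch of skin in each frame, filter the resulting trace to plausible pulse frequencies, and take the dominant spectral peak. The sketch below is that textbook approach rather than McDuff and Liu’s model, and it assumes the face region has already been cropped.

```python
# Textbook remote-PPG baseline, not McDuff and Liu's model: recover a
# pulse signal from micro changes in light intensity over a skin region.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def heart_rate_bpm(frames, fps):
    """Estimate beats per minute from a (T, H, W, 3) RGB face-ROI clip."""
    # Average the green channel over the region in each frame.
    green = frames[..., 1].reshape(len(frames), -1).mean(axis=1)
    green -= green.mean()  # keep only the tiny intensity fluctuations
    # Band-pass to plausible pulse frequencies (~40-240 beats/min).
    sos = butter(3, [0.7, 4.0], btype="bandpass", fs=fps, output="sos")
    pulse = sosfiltfilt(sos, green)
    # The dominant spectral peak of the pulse signal gives the heart rate.
    n = 8 * len(pulse)  # zero-pad the FFT for finer frequency resolution
    freqs = np.fft.rfftfreq(n, d=1.0 / fps)
    return freqs[np.argmax(np.abs(np.fft.rfft(pulse, n=n)))] * 60.0

# Synthetic demo: 18 s of 8x8 "skin" frames pulsing at 72 beats/min.
fps = 30
t = np.arange(18 * fps) / fps
g = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t)  # 1.2 Hz = 72 beats/min
frames = np.tile(g[:, None, None, None], (1, 8, 8, 3))
print(f"{heart_rate_bpm(frames, fps):.1f} beats/min")  # ≈ 72
```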

The algorithm must account for variables such as skin color, facial hair, lighting, and clothing. Those tend to trip up just about any kind of facial recognition technology, in part because the datasets on which machine learning algorithms are trained aren’t representative of our diverse population. 

McDuff’s model faces an added challenge: “Darker skin types have higher melanin, so the light reflectance intensity is going to be lower because more light is absorbed,” he says. That results in a lower signal-to-noise ratio, making it harder to detect the pulse signal. “So it’s about having a representative training set, and there’s also a fundamental physical challenge we need to solve here,” says McDuff.

To address this challenge, the team developed a system with a personalized machine learning algorithm for each individual. “We proposed a learning algorithm to learn a person’s physiological signals quickly,” says Liu. The system can provide results with just 18 seconds of video, he says.
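
The paper’s exact meta-learning procedure isn’t reproduced here, but the core idea of personalization, adapting a pretrained network with a handful of gradient steps on one person’s short calibration clip, can be sketched as follows. Everything in this snippet is a placeholder: the stand-in network, the 64-dimensional per-frame features, and the reference pulse signal used as supervision.

```python
# Minimal sketch of few-shot personalization; the model, feature shapes,
# and supervision signal are placeholders, not the authors' architecture.
import torch
from torch import nn

# Hypothetical stand-in for a pretrained pulse-estimation network.
model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

def personalize(model, clip_features, reference_pulse, steps=10, lr=1e-3):
    """Adapt a pretrained model to one person with a few gradient steps."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        pred = model(clip_features).squeeze(-1)
        loss = nn.functional.mse_loss(pred, reference_pulse)
        loss.backward()
        opt.step()
    return model

# ~18 s of calibration data: 540 frames of hypothetical 64-dim features,
# paired with a reference pulse trace for those frames.
feats, target = torch.randn(540, 64), torch.randn(540)
personalized = personalize(model, feats, target)
```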

Compared with a standard medical-grade device, the proposed method has a mean absolute error of one to three beats per minute in estimating heart rate, Liu says, which is acceptable in many applications.

The system isn’t ready for medical use and will need to be validated in clinical trials. To improve its robustness, one approach the team is taking is to train models on computer-generated images. “We can actually synthesize high fidelity avatars that exhibit these blood flow patterns and respiration patterns, and we can train our algorithm on the computer generated data,” says McDuff.

The technology could have both medical and fitness applications, the researchers say. In addition to telehealth visits, remote vitals can be useful for people with chronic health conditions who need frequent, accurate biometric measurements. 
