

Eye Scans to Detect Cancer and Alzheimer’s Disease

Computer vision and machine-learning algorithms detect biomarkers in the white of the eye

photo: Person holds up a box with smartphone camera facing his face to take a selfie of his eyes to detect early signs of pancreatic cancer via a new app
Photo: Dennis Wise/University of Washington

It is said that eyes are the window to the soul. Well, what about the window to our health?

We recently reported on two AI systems trained to detect eye diseases, specifically diabetic retinopathy and congenital cataracts. Now, we’ve found groups extending that concept to illnesses beyond the eye. Two new projects pair imaging systems with advanced software in an effort to catch early symptoms of pancreatic cancer and Alzheimer’s disease.

At the University of Washington, a team led by computer scientist Shwetak Patel created a smartphone app that screens for pancreatic cancer with a quick selfie. After developing the system over the last year and a half, the team recently tested it in a clinical study of 70 people, identifying cases of concern with 89.7 percent sensitivity and 96.8 percent accuracy.

While the app is not yet ready to be used as a diagnostic tool based on only a single small trial, it might soon give doctors a way to monitor disease progression in patients undergoing treatment with a simple photograph rather than a blood test, says first author Alex Mariakakis, a UW graduate student.

The app, called BiliScreen, monitors levels of bilirubin, a metabolic byproduct that builds up in cases of jaundice and causes a yellowing of the skin and the whites of the eyes. Increased bilirubin in the blood is one of the earliest symptoms of pancreatic cancer, but today it is detected with a blood test in a doctor’s office, performed only for patients at high risk or with other symptoms.

photo: The BiliScreen app detects levels of bilirubin in the whites of the eyes based on eye color measurements
Photo: Dennis Wise/University of Washington

To create an easy-to-use, non-invasive early screen for pancreatic cancer—which claimed the life of study co-author James Taylor’s father—the researchers designed a three-step system. First, users take a selfie with their smartphone using one of two accessories to control for environmental conditions: either a cardboard box to block out the light, or colored glasses to give a color reference. In the study, the team found that the box was slightly better at controlling light conditions than the glasses.
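Both accessories solve the same problem: they give the software a known color reference so it can factor out ambient lighting. A minimal sketch of that calibration idea in Python, using hypothetical reference values rather than the paper’s actual method:

```python
import numpy as np

def color_correct(rgb, observed_ref, true_ref):
    """Scale each channel so that a reference patch with a known true
    color (e.g., a patch on the glasses' frame) maps to that color,
    compensating for ambient light. Illustrative only; the reference
    values are assumptions, not BiliScreen's."""
    scale = np.asarray(true_ref, dtype=float) / np.asarray(observed_ref, dtype=float)
    return np.clip(np.asarray(rgb, dtype=float) * scale, 0, 255)
```

Under warm indoor light a white patch might read as, say, (200, 180, 160); dividing each channel by the observed reference undoes that cast for the whole frame.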

Once an image is captured, the software relies on computer vision algorithms to isolate the sclera, or white of the eye, from the skin and pupil. The clinical trial version of the software was based on an algorithm from Microsoft Research called GrabCut, says Mariakakis. A newer iteration, which works even better to isolate the sclera, uses a fully convolutional neural network to identify the most important local features in each image.
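As a rough illustration of what the segmentation step must accomplish, sclera pixels can be crudely picked out by color alone: they are bright and nearly gray compared with skin and iris. The thresholds below are illustrative guesses, and this simple filter is only a stand-in for the GrabCut and neural-network approaches the team actually uses:

```python
import numpy as np

def sclera_mask(rgb):
    """Mark pixels that are bright (high max channel) and nearly gray
    (low saturation), a crude proxy for the white of the eye."""
    rgb = np.asarray(rgb, dtype=np.float64)
    mx = rgb.max(axis=-1)
    mn = rgb.min(axis=-1)
    saturation = (mx - mn) / np.maximum(mx, 1e-9)
    return (mx > 180) & (saturation < 0.25)

def mean_yellowness(rgb, mask):
    """Average of (R+G)/2 - B over masked pixels: a rough proxy for
    the yellow cast that elevated bilirubin produces."""
    rgb = np.asarray(rgb, dtype=np.float64)
    yellow = (rgb[..., 0] + rgb[..., 1]) / 2.0 - rgb[..., 2]
    return float(yellow[mask].mean()) if mask.any() else 0.0
```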

“You give the algorithm enough cases of labeled images…with people of different skin tones, eye colors, and orientation, and it can figure out where the sclera is,” he says.

Next, that image information is matched to levels of bilirubin taken from a blood draw and fed into a machine-learning algorithm to train it to detect images of concern. In the initial 70-person study, roughly half the participants had elevated levels of bilirubin and half did not. BiliScreen was able to detect those individuals with high levels with good accuracy and sensitivity, but both measures could be improved with more data, adds Mariakakis. The team is now seeking funding for additional clinical trials.
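In spirit, the training step is a supervised regression from sclera color features to the blood-test bilirubin value. The numbers and features below are invented for illustration, and ordinary least squares stands in for whatever learning algorithm the team actually used:

```python
import numpy as np

# Hypothetical training data: each row holds sclera color features for
# one image (e.g., yellowness and brightness), paired with the
# bilirubin level (mg/dL) measured from that person's blood draw.
X = np.array([[0.10, 0.9], [0.30, 0.8], [0.55, 0.7], [0.80, 0.6]])
y = np.array([0.5, 1.2, 2.4, 3.6])

# Fit a linear model by least squares; the ones column is an intercept.
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_bilirubin(features):
    """Estimate bilirubin from color features; readings above a
    clinical cutoff would be flagged as cases of concern."""
    return float(np.append(features, 1.0) @ w)
```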

At Cedars-Sinai and NeuroVision Imaging LLC in California, researchers have developed a sophisticated camera and retinal imaging approach to detect early signs of Alzheimer’s disease (AD). This system, recently detailed in a proof-of-concept trial published in the journal JCI Insight, relies on a specialized ophthalmic camera that is not yet available on a smartphone.

In June 2010, the Cedars-Sinai team, led by Maya Koronyo-Hamaoui and Yosef Koronyo, made a novel discovery: beta-amyloid, a neurotoxic protein that builds up in the brains of Alzheimer’s patients, also forms deposits in the postmortem retinas of those patients, even in early-stage cases. Further investigation has shown that the beta-amyloid deposits in the eye appear in clusters and seem to be more abundant in certain regions of the human retina than in others, says Koronyo-Hamaoui.

She and collaborators at Cedars-Sinai founded a company, NeuroVision, to develop a way to detect and quantify those plaques in the eye, in the hopes of being able to see early signs of the disease before cognitive decline becomes obvious. Today, their system consists of a modified scanning laser ophthalmoscope with a special filter to capture a fluorescence signal emitted by the plaques when tagged with a marker, and advanced software to process the images.

In the JCI Insight study, the team used their imaging system on 16 live patients and on the donated eyes and brains of 37 deceased patients, 23 of whom had confirmed Alzheimer’s disease. Overall, they found a 4.7-fold increase in amyloid plaques in the post-mortem retinas of patients with AD compared with controls.

The next step is larger-scale studies to see whether the method is practical for diagnosing AD. The group is conducting additional clinical trials in the U.S., Europe, and Australia, Koronyo-Hamaoui told IEEE Spectrum in an email. “We are testing larger cohorts of living patients for the possible relationship between retinal amyloid index and the gold standard amyloid-PET brain imaging and other AD biomarkers.”

She believes the imaging could someday be adapted to less expensive cameras, but for now a doctor’s office would need a high-definition camera plus image-processing and quantification tools to use the system.


Are You Ready for Workplace Brain Scanning?

Extracting and using brain data will make workers happier and more productive, backers say

A photo collage showing a man wearing an EEG headset while looking at a computer screen.
Nadia Radic

Get ready: Neurotechnology is coming to the workplace. Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers’ brains. These projects aren’t confined to specialized workplaces; they’re also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that’s currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with different feelings or physiological responses, such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient—and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that’s bringing a brain-tracking wearable to office workers, including those working remotely.
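Most of these systems reduce the raw electrode signals to power in standard EEG frequency bands (alpha for relaxation, beta for active concentration, and so on). A minimal sketch of that computation, assuming a single pre-filtered channel and a plain periodogram rather than any vendor’s actual pipeline:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power of one EEG channel in the [lo, hi] Hz band, computed
    from a simple periodogram of the signal sampled at fs Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= lo) & (freqs <= hi)
    return float(psd[band].sum())
```

A focus metric might then be the ratio of beta-band (13–30 Hz) to alpha-band (8–12 Hz) power, though how commercial devices combine such features is proprietary.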
