AI Cardiologist Aces Its First Medical Exam

A neural network outperforms human cardiologists in a task involving heart scans

3 min read
Photo-illustration of an AI computer analyzing heart scans.
Photo-illustration: Rima Arnaout

Rima Arnaout wants to be clear: The AI she created to analyze heart scans, which easily outperformed human experts on its task, is not ready to replace cardiologists. 

It was a limited task, she notes, just the first step in what a cardiologist does when evaluating an echocardiogram (the image produced by bouncing sound waves off the heart). “The best technique is still inside the head of the trained echocardiographer,” she says.

But with experimental artificial intelligence systems making such rapid progress in the medical realm, particularly on tasks involving medical images, Arnaout does see the potential for big changes in her profession. And when her 10-year-old cousin expressed the desire to be a radiologist when she grows up, Arnaout had some clear advice: “I told her that she should learn to code,” she says with a laugh. 

Arnaout, an assistant professor and practicing cardiologist at UC San Francisco, is keeping up with the times through her research in computational medicine; she published this new study in the journal Digital Medicine.

In the study, Arnaout and her colleagues used deep learning, specifically something called a convolutional neural network, to train an AI system that can classify echocardiograms according to the type of view shown.

This classification is a cardiologist’s first step when examining an image of the heart. Because the heart is such a complex structure—it’s an asymmetrical organ with four chambers, four valves, and blood constantly flowing in and out through several vessels—echocardiographers take videos from many different positions. When the doctors are ready to analyze those videos, they first have to figure out which view they’re looking at and which anatomical features they can see. 

Typically the cardiologist would look at a relatively high-resolution video of the echocardiogram, showing a shifting image captured as the imaging tool was moved around the patient’s chest. But the AI had a much harder task. It was given still images taken from video clips, and the images had been shrunk to just 60 by 80 pixels each. 

Six echocardiogram images showing different views of the heart. The AI had to sort heart scan images into categories based on which view of the heart they presented.
Image: A. Madani et al.

When both the AI and expert cardiologists were asked to sort these tiny black-and-white images into 15 categories of views, the AI achieved an accuracy of 92 percent. The humans got only 79 percent correct. “These were excellent echocardiographers,” Arnaout says, “but it’s a hard task. We’re not used to seeing the images shrunken down and out of context.”
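The task the article describes — sorting 60-by-80-pixel grayscale stills into 15 view categories — is a standard image-classification setup for a convolutional neural network. The sketch below shows what such a classifier looks like in PyTorch; it is an illustrative toy model, not the architecture Madani, Arnaout, and colleagues actually used, and the layer sizes are arbitrary choices.

```python
import torch
import torch.nn as nn

class EchoViewClassifier(nn.Module):
    """Toy CNN for 60x80 grayscale echocardiogram stills, 15 view classes."""

    def __init__(self, num_views=15):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1 channel: grayscale
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 60x80 -> 30x40
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 30x40 -> 15x20
        )
        self.classifier = nn.Linear(32 * 15 * 20, num_views)

    def forward(self, x):
        x = self.features(x)
        x = x.flatten(1)                 # flatten per-image feature maps
        return self.classifier(x)        # one logit per view category

model = EchoViewClassifier()
batch = torch.randn(4, 1, 60, 80)        # four fake 60x80 grayscale stills
logits = model(batch)
predicted_views = logits.argmax(dim=1)   # most likely view for each image
```

Training such a network on labeled echocardiogram frames (with a cross-entropy loss) is what lets it learn the view categories; the real study's network and training procedure are described in the paper itself.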

The AI only performed this first step in the analysis of a heart image and the making of a diagnosis. A human cardiologist looks at many of these scans to examine more than 20 structures within the heart, then synthesizes that information to arrive at a conclusion.

Arnaout is now working on a new version of the technology that can take the next steps to identify different diseases and heart problems. “A human echocardiographer can look at any heart, no matter what the defect, and figure out what’s going on,” Arnaout says. “I’m interested in building a platform that can do that.”

Even if she accomplishes her goal, though, she doesn’t think human cardiologists will be put out of their jobs. “As cardiologists, we read the images and then go see the patient,” she says. “So we’re both reading images and practicing medicine. I don’t think that second piece will be taken over so quickly.”


This CAD Program Can Design New Organisms

Genetic engineers have a powerful new tool to write and edit DNA code

11 min read
A photo showing machinery in a lab

Foundries such as the Edinburgh Genome Foundry assemble fragments of synthetic DNA and send them to labs for testing in cells.

Edinburgh Genome Foundry, University of Edinburgh

In the next decade, medical science may finally advance cures for some of the most complex diseases that plague humanity. Many diseases are caused by mutations in the human genome, which can either be inherited from our parents (as in cystic fibrosis) or acquired during life, as in most types of cancer. For some of these conditions, medical researchers have identified the exact mutations that lead to disease, but in many more, they're still seeking answers. And without understanding the cause of a problem, it's pretty tough to find a cure.

We believe that a key enabling technology in this quest is a computer-aided design (CAD) program for genome editing, which our organization is launching this week at the Genome Project-write (GP-write) conference.

With this CAD program, medical researchers will be able to quickly design hundreds of different genomes with any combination of mutations and send the genetic code to a company that manufactures strings of DNA. Those fragments of synthesized DNA can then be sent to a foundry for assembly, and finally to a lab where the designed genomes can be tested in cells. Based on how the cells grow, researchers can use the CAD program to iterate with a new batch of redesigned genomes, sharing data for collaborative efforts. Fast redesign of thousands of variants can be achieved only through automation; at that scale, researchers just might identify the combinations of mutations that are causing genetic diseases. This is the first critical R&D step toward finding cures.
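The design-build-test loop described above can be sketched in a few lines of code. Every function name below is a hypothetical placeholder for illustration, not the GP-write CAD program's actual API; in the real workflow the "build" step would be carried out by a DNA synthesis company and an assembly foundry, not in software.

```python
def apply_mutations(genome, mutations):
    """Apply point mutations to a genome string.

    genome: string of bases; mutations: list of (position, new_base) pairs.
    """
    seq = list(genome)
    for pos, base in mutations:
        seq[pos] = base
    return "".join(seq)

def design_variants(base_genome, mutation_sets):
    """Design one candidate genome per combination of mutations."""
    return [apply_mutations(base_genome, ms) for ms in mutation_sets]

def design_build_test_cycle(base_genome, mutation_sets, grow_in_cells):
    """One iteration of the loop: design variants, then score each one.

    grow_in_cells stands in for the physical synthesis, assembly, and
    cell-growth assay; here it is just a scoring function.
    """
    variants = design_variants(base_genome, mutation_sets)
    return [(ms, grow_in_cells(v)) for ms, v in zip(mutation_sets, variants)]

# One toy cycle: two candidate mutation sets, scored by a stand-in assay.
results = design_build_test_cycle(
    "ACGT",
    [[(0, "G")], [(2, "T")]],
    lambda variant: variant.count("G"),
)
```

The point of the loop is that the scores feed back into the next round of designs, which is where the scale of automated redesign pays off.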
