3-D Printing Software Turns Heart Scans into Surgical Models

A computer program builds on human expertise to make 3-D printed heart models from MRI scans

New system from MIT and Boston Children’s Hospital researchers converts MRI scans into 3-D printed heart models (shown here).
Photo: Bryce Vickmark

A new 3-D printing system can transform medical scans of a patient’s heart into physical models that help plan surgeries. The efficient system relies on a computer algorithm that requires just a pinch of human guidance to figure out a patient’s heart structure from MRI scans.

The process begins with an MRI scan of a patient’s heart that shows the organ as hundreds of cross-sectional slices. Each cross section has dark and light regions that may indicate the edges of anatomical structures within the heart. The new software, developed by MIT and Boston Children’s Hospital, can correctly identify an individual heart’s anatomical structures by following the lead of a human expert who interprets a small patch equivalent to just one-ninth of the area of each cross section, according to an MIT press release.

One of the best results came from a human expert interpreting just 14 patches, with the algorithm inferring the patient’s heart structure across the rest of the MRI scan’s 200 cross sections. The software’s results agreed with those of human experts who interpreted all 200 cross sections 90 percent of the time. A few tweaks then turned this “segmentation,” or digital delineation of the heart’s structures, into a virtual 3-D heart model.
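The press release doesn’t describe the algorithm’s internals, but the general idea of generalizing from a small expert-labeled patch can be illustrated with a toy sketch. In this hedged, assumed example (all function names and the synthetic “MRI slice” are invented for illustration, and the real system is far more sophisticated), a simple intensity threshold is learned from the labeled patch and then applied to the whole slice:

```python
import numpy as np

def segment_from_patch(image, patch_mask, patch_labels):
    """Toy sparse-annotation segmentation: learn an intensity
    threshold from one expert-labeled patch, then apply it to
    the entire slice. Illustrative only, not the MIT method."""
    fg = image[patch_mask & patch_labels]    # labeled "heart" pixels
    bg = image[patch_mask & ~patch_labels]   # labeled background pixels
    threshold = (fg.mean() + bg.mean()) / 2.0
    return image > threshold

# Synthetic slice: a bright square "structure" on a dark background
rng = np.random.default_rng(0)
image = rng.normal(0.2, 0.05, (60, 60))
image[20:40, 20:40] = rng.normal(0.8, 0.05, (20, 20))

# The expert labels only one small patch near the structure's corner
patch_mask = np.zeros((60, 60), dtype=bool)
patch_mask[15:25, 15:25] = True
patch_labels = image > 0.5   # stand-in for hand-drawn labels

full_seg = segment_from_patch(image, patch_mask, patch_labels)

# Compare against the (known) full ground truth
true_seg = np.zeros((60, 60), dtype=bool)
true_seg[20:40, 20:40] = True
agreement = (full_seg == true_seg).mean()
print(f"agreement with full labels: {agreement:.2f}")
```

On this clean synthetic slice the patch-trained threshold recovers essentially the whole structure; real cardiac MRI is far noisier, which is why expert guidance and more sophisticated inference are needed.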

“I think that if somebody told me that I could segment the whole heart from eight slices out of 200, I would not have believed them,” said Polina Golland, a professor of electrical engineering and computer science at MIT and leader of the project, in the MIT press release. “It was a surprise to us.”

The group worked with high-precision MRI scans developed by Mehdi Moghari, a physicist at Boston Children’s Hospital. Moghari’s scans were 10 times as precise as previous scans. Interpreting all the features of 200 such highly detailed cross sections would typically take human experts between 8 and 10 hours. Though that interpretation allows researchers to create a virtual 3-D model of the heart that serves as the basis for a 3-D printed model, it takes far too long to be practical for surgical planning.

But the new software from a team led by Danielle Pace, an MIT graduate student in electrical engineering and computer science, managed to create a fairly accurate digital 3-D model of each patient’s heart in just an hour. The 3-D printing process takes several additional hours.

The researchers plan to report on their system at the International Conference on Medical Image Computing and Computer Assisted Intervention in October. They hope to improve the software’s accuracy by examining patches that appear in several MRI cross sections.

Seven cardiac surgeons at Boston Children’s Hospital will also test the usefulness of 3-D printed heart models in a clinical study this fall. They will draw up surgical plans for 10 patients who have already undergone surgery at Boston Children’s Hospital and compare those plans with the documented surgeries that were performed. The surgical plans will be based on either physical 3-D printed models or virtual 3-D models, with the models in turn based on either human expertise or the computer software.

Virtual models of hearts have already proven their worth in basic research. But separate clinical trials aim to test how a personalized computer model for each individual patient could improve medical care, as previously reported by Natalia Trayanova for IEEE Spectrum. The MIT and Boston Children’s Hospital research represents yet another promising step forward in this area.


Are You Ready for Workplace Brain Scanning?

Extracting and using brain data will make workers happier and more productive, backers say

A photo collage shows a man wearing an EEG headset while looking at a computer screen.
Illustration: Nadia Radic

Get ready: Neurotechnology is coming to the workplace. Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers’ brains. These projects aren’t confined to specialized workplaces; they’re also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that’s currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with different feelings or physiological responses, such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient—and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that’s bringing a brain-tracking wearable to office workers, including those working remotely.
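As background on how such correlations are computed in practice, one common approach is spectral band power: EEG rhythms in different frequency bands are loosely associated with states such as relaxation (alpha, roughly 8-12 Hz) and engagement (beta, roughly 13-30 Hz). Commercial products’ exact formulas are proprietary, so the following is an assumed, simplified sketch using synthetic signals, with a made-up “focus index” defined as the beta-to-alpha power ratio:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power of `signal` in the [low, high] Hz band via an FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= low) & (freqs <= high)
    return psd[band].mean()

fs = 256                      # Hz, a common EEG sampling rate
t = np.arange(0, 4, 1.0 / fs)  # 4 seconds of single-channel signal

# Synthetic EEG: strong 20 Hz (beta) when "focused",
# strong 10 Hz (alpha) when "relaxed"
focused = 1.0 * np.sin(2 * np.pi * 20 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)
relaxed = 0.3 * np.sin(2 * np.pi * 20 * t) + 1.0 * np.sin(2 * np.pi * 10 * t)

def focus_index(sig):
    # Beta (13-30 Hz) over alpha (8-12 Hz): a crude engagement proxy,
    # invented here for illustration
    return band_power(sig, fs, 13, 30) / band_power(sig, fs, 8, 12)

print(f"focused: {focus_index(focused):.2f}, relaxed: {focus_index(relaxed):.2f}")
```

The “focused” signal yields a ratio above 1 and the “relaxed” signal a ratio below 1. Real systems face noisy, multi-channel data and large individual variation, which is why such metrics can only be broadly, not precisely, correlated with mental states.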
