Fujifilm SonoSite Wants to Bring AI to Ultrasound

Fujifilm SonoSite partners with AI2 Incubator to develop a better method for finding veins

Closeup of a doctor performing an ultrasound on a patient.
Photo: iStock

Have you ever needed an IV and had to undergo multiple pricks before the nurse could find a vein? Technology to avoid that painful trial and error is in the works. Fujifilm’s ultrasound diagnostics arm SonoSite announced yesterday that it has partnered with a startup company to develop artificial intelligence that can interpret ultrasound images on a mobile phone.

The companies say the first target for their AI-enabled ultrasound will be finding veins for IV (intravenous) needle insertion. The technology would enable technicians to hold a simple ultrasound wand over the skin while software on a connected mobile device locates the vein for them.

For this project, Fujifilm SonoSite tapped the Allen Institute for Artificial Intelligence (AI2), which has an incubator for AI startup companies. “Not only do we have to come up with a very accurate model to analyze the ultrasound videos, but on top of that, we have to make sure the model is working effectively on the limited resources of an Android tablet or phone,” says Vu Ha, technical director of the AI2 Incubator.

In an interview with IEEE Spectrum, Ha did not disclose the name of the startup that will be taking on the task, saying the fledgling company is still in "stealth mode."

Ha says the AI2 startup will take on the project in two stages: First, it'll train a model on ultrasound images without any resource constraints, with the goal of making it as accurate as possible. Then, the startup will run a series of experiments to simplify the model, reducing the number of hidden layers and pruning and compressing the network until it is small enough to run on a mobile phone.

The trick will be to shrink the model without sacrificing too much accuracy, Ha says.
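
Neither Fujifilm SonoSite nor the unnamed startup has published code or model details, so the following is only a rough sketch of the "train big, then shrink" workflow Ha describes, written in PyTorch. The network architecture, layer sizes, pruning ratio, and file name are all invented for illustration; the real system could use entirely different techniques.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torch.utils.mobile_optimizer import optimize_for_mobile

class VeinFinder(nn.Module):
    """Hypothetical stand-in for the stage-one model trained without resource limits."""
    def __init__(self, channels=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(channels, 1, 1)  # per-pixel vein-likelihood score

    def forward(self, x):
        return torch.sigmoid(self.head(self.features(x)))

model = VeinFinder()
# ... stage one: train on ultrasound frames for maximum accuracy (omitted) ...

# Stage two: shrink the trained network so it fits on a phone or tablet.
# Here, zero out the 50 percent of convolution weights with the smallest magnitude.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # bake the zeroed weights into the tensor

# Package the slimmed model for on-device inference.
example = torch.rand(1, 1, 224, 224)          # one grayscale ultrasound frame
scripted = torch.jit.trace(model, example)    # freeze the computation graph
mobile_ready = optimize_for_mobile(scripted)  # fuse ops for the mobile runtime
mobile_ready._save_for_lite_interpreter("vein_finder.ptl")

In practice, the accuracy-versus-size trade-off Ha mentions would be measured after each pruning or compression step, with the amount of trimming dialed back if vein-detection accuracy drops too far.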

If successful, the device could help clinicians reduce the number of unsuccessful attempts at finding a vein and enable less-trained technicians to start IVs as well. Hospitals that handle a large volume of IVs often have highly trained staff capable of eyeballing ultrasound videos and using those images to help them find small blood vessels. But the number of these highly trained clinicians is very small, says Ha.

“My hope is that with this technology, a less trained person will be able to find veins more reliably” using ultrasound, he says. That could broaden the availability of portable ultrasound to rural and resource-poor areas. 

SonoSite and AI2 house just two of the many research groups putting AI to work on medical imaging and diagnostics. The U.S. Food and Drug Administration (FDA) has approved for commercial use a deep-learning algorithm that analyzes MRI images of the heart, an AI system that looks for signs of diabetic retinopathy in images of the retina, an algorithm that analyzes X-ray images for signs of wrist fracture, and software that looks for indicators of stroke in CT images of the brain, to name a few.

Notably, the FDA in 2017 also approved for commercial use smartphone-based ultrasound technology made by Butterfly. The device, which costs less than US $2,000, can be used to take sonograms for 13 different clinical applications, including blood vessels. Butterfly has announced publicly that it is developing deep-learning-based AI that will assist clinicians with image interpretation. But the company has not yet commercially launched the technology. 

At least four other portable or mobile-device-based ultrasound technologies have been approved by the FDA, including Fujifilm SonoSite's own device and the Lumify from Philips.

But the adoption of these devices has been relatively slow. As Eric Topol, director of the Scripps Research Translational Institute, told Spectrum recently, the smartphone ultrasound is a “brilliant engineering advance” that’s “hardly used at all” in the health care system. Complex challenges such as reimbursement, training, and the old habits of clinicians often hinder the uptake of new gadgets, despite engineers’ best efforts. 
