Biomedical

How to Steal DNA With Sound

Researchers devise method for snooping on DNA synthesis using acoustic recordings. But is it a real threat?

University of California Irvine researchers pose in front of an Applied Biosystems 3400 DNA Synthesizer. One researcher holds a microphone.
University of California Irvine researchers Mohammad Al Faruque, Arnav Malawade, John Chaput, and Sina Faezi capture sound emitted by the AB 3400 DNA Synthesizer.
Photo: Steven Zylius/University of California Irvine

Engineers at the University of California say they have demonstrated how easy it would be to snoop on biotech companies making synthetic DNA. 

All you need is an audio recording, they say. Place a smartphone near a DNA synthesizer, record the sound, run the recording through algorithms trained to discern the clicks and buzzes that particular machine makes, and you’ll know exactly what combination of DNA building blocks it is generating. 
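
The team has not published code with the article, but the general recipe is straightforward to sketch. The snippet below is a minimal illustration of the feature-extraction step, assuming (these are assumptions for illustration, not details from the study) that the recording is cut into fixed-length windows corresponding to synthesis cycles and that each window is summarized with MFCC audio features using the librosa library.

```python
import numpy as np
import librosa

def extract_cycle_features(wav_path, cycle_seconds=1.0, n_mfcc=20):
    """Split a recording into fixed-length windows and summarize each with averaged MFCCs."""
    y, sr = librosa.load(wav_path, sr=None)      # keep the recording's native sample rate
    hop = int(cycle_seconds * sr)                # samples per assumed synthesis cycle
    features = []
    for start in range(0, len(y) - hop + 1, hop):
        window = y[start:start + hop]
        mfcc = librosa.feature.mfcc(y=window, sr=sr, n_mfcc=n_mfcc)
        features.append(mfcc.mean(axis=1))       # average each coefficient over the window
    return np.array(features)                    # shape: (n_windows, n_mfcc)
```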

The researchers demonstrated their spying technique on the Applied Biosystems 3400 DNA Synthesizer, a widely used older model. They presented their results last week in San Diego at the 2019 Network and Distributed System Security Symposium.

The purpose of their exercise is to convince engineers that they must design bioinstrumentation with these kinds of security leaks in mind. Hacking data via online networks isn’t the only way to steal proprietary data, the researchers say.

“Acoustics for this particular machine was the problem,” says Mohammad Al Faruque, an electrical and computer engineering professor at the University of California Irvine whose team developed the algorithms. But any type of emission from a machine can potentially be analyzed, he says. A machine can emit electromagnetic fields, or have a thermal footprint, or a vibration footprint—all of which could give thieves or bioterrorists the clues they need to poach sensitive information, he says. People making machines need to take a “holistic design approach that considers these emissions,” Al Faruque says. 

In the case of the DNA synthesizer, acoustics revealed everything. Noises made by the machine differed depending on which DNA building block—the nucleotides Adenine (A), Guanine (G), Cytosine (C), or Thymine (T)—it was synthesizing. That made it easy for algorithms trained on that machine’s sound signatures to identify which nucleotides were being printed and in what order. 
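
Given such labeled sound signatures, the classification step is conceptually simple. The sketch below assumes scikit-learn and a generic random-forest model, rather than whatever models the Irvine team actually used: it trains on acoustic features labeled A, C, G, or T (collected from a machine the attacker controls) and predicts one base per recorded window.

```python
from sklearn.ensemble import RandomForestClassifier

def train_base_classifier(train_features, train_labels):
    """train_features: (n_samples, n_features) array; train_labels: 'A'/'C'/'G'/'T' per sample."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(train_features, train_labels)
    return clf

def recover_sequence(clf, recording_features):
    """Predict one base per acoustic window and join the predictions into a candidate sequence."""
    return "".join(clf.predict(recording_features))
```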

These short sequences of nucleotides, called oligonucleotides, are assembled into longer stretches of DNA and integrated into living organisms, like bacteria or viruses. The right code can turn ordinary yeast into an opioid production factory. It can make a crop resistant to disease. Using DNA synthesizers, scientists can even produce, from scratch, the genetic code for entirely new organisms. Synthetic biology companies closely guard these key DNA sequences in an effort to protect their core intellectual property.

IEEE Spectrum reached out to two such companies to ask how realistic acoustic snooping of DNA synthesizers might be. Neither viewed the technology as much of a threat.  

“To me this was an academic exercise,” says Emily Leproust, CEO of DNA maker Twist Bioscience. “In the real world this is not an issue,” she says. Twist’s DNA synthesizers, which the company developed in-house, use ink jet printing and make almost no sound, she says. Each machine can print more than a million oligonucleotides at a time, so it would be very difficult to acoustically parse an individual sequence from it, she says.

“I don’t feel this represents a threat to industrial synthetic biology at all,” adds Barry Canton, co-founder of synthetic microorganism company Ginkgo Bioworks. “Oligonucleotides are short, and so they only represent short snippets of our sequences. You would need to combine many oligonucleotides together, in the right order, to reassemble the sequences that are truly valuable. This is why lifting oligo sequences as described is not as much of a threat as it may seem. It's a little bit like trying to reassemble a document from shredded paper,” he says.

Ginkgo last year announced that it would develop a range of biosecurity measures for its industry, including algorithms that screen customers’ orders. The goal is to stop nefarious characters from ordering the genetic ingredients to make a deadly virus or some other hazardous organism.
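
Ginkgo has not described its screening algorithms in detail, but one crude illustration of the idea is to flag orders that share long exact substrings with a blocklist of sequences of concern. The sketch below is hypothetical and far simpler than any production biosecurity screen.

```python
def shares_kmer(order_seq, blocklist, k=60):
    """Return True if any length-k substring of the ordered sequence appears in a blocklisted sequence."""
    order_seq = order_seq.upper()
    kmers = {order_seq[i:i + k] for i in range(len(order_seq) - k + 1)}
    for hazard in blocklist:
        hazard = hazard.upper()
        if any(hazard[i:i + k] in kmers for i in range(len(hazard) - k + 1)):
            return True
    return False
```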

But even if a thief succeeded in stealing the codes for key DNA segments, getting them to work in an organism takes a ton of testing and expertise. “If you are a bad actor or someone from another country trying to spy, this isn’t going to be the way to do it,” Twist Bioscience’s Leproust says. “You'd be better off waiting until the company has done the experiments, and then hack their [computer] system” or physically steal the organism, she says.

That type of espionage has actually happened in the biotech industry. FBI agents several years ago caught a group of Chinese nationals digging up proprietary corn seeds grown in secret test fields owned by U.S.-based companies. The seeds contained the DNA for some of the most elite traits in agriculture—traits worth millions of dollars, representing nearly a decade of research and development. (The U.S. government in 2014 charged the Chinese nationals with conspiring to steal trade secrets. One of them pleaded guilty. The other five allegedly fled the country.)  

In that case, thieves physically stole the genetic codes after they had been laboriously integrated into the finished corn seed product. By contrast, thieves who pilfer the genetic code for a segment of DNA from a synthesizer have a ton of work ahead of them. 

Still, the potential for a heist is there, the authors of the new report say. An aspiring bioterrorist might try to steal genetic codes that would turn an organism into a producer of a lethal toxin or make a virus more potent. Or the technology could be used in reverse—to spy on suspected criminals attempting to DIY some viruses or drugs.

Perhaps we could use such algorithms to spy on ourselves. “I would welcome a microphone in my lab if it was able to listen to the routine sounds a machine is making and then tell me when there’s a deviation in those sounds, because that would be a way to get early warning of a failure or degraded performance,” says William Grover, an assistant professor of bioengineering at the University of California Riverside, who co-authored the report.
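
In principle, the monitoring Grover describes could look something like the sketch below: build a baseline spectral profile from recordings of a machine running normally, then flag recordings that drift too far from it. The details here (librosa spectrograms, a z-score threshold) are assumptions chosen for illustration, not anything proposed in the report.

```python
import numpy as np
import librosa

def spectral_profile(wav_path):
    """Average magnitude spectrum of a recording (assumes consistent recording settings)."""
    y, sr = librosa.load(wav_path, sr=None)
    return np.abs(librosa.stft(y)).mean(axis=1)  # mean over time frames

def deviates_from_baseline(baseline_profiles, new_wav, threshold=3.0):
    """Flag a recording whose spectrum sits many standard deviations from the healthy baseline."""
    baseline = np.array(baseline_profiles)                 # profiles from known-good runs
    mu, sigma = baseline.mean(axis=0), baseline.std(axis=0) + 1e-9
    z = np.abs((spectral_profile(new_wav) - mu) / sigma)
    return z.mean() > threshold
```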

Grover teaches bioinstrumentation, and says young engineers should be taught to think about security in terms of all the ways a machine can leak information, not just in the traditional sense of network security.

“I’m far more interested in pushing the industry toward countermeasures for this,” adds Philip Brisk, associate professor of computer science and engineering at the University of California Riverside and an author on the report. “Hopefully people will take notice of this and figure out ways to put some type of cloth-like material on the internals of these things to damp the acoustic and the problem will go away before it happens.”