The Ultimate Virus: How Malware Encoded In Synthesized DNA Can Compromise A Computer System
from the digital-code-is-digital-code dept
DNA is a digital code, written not as 0s and 1s (binary) but in the chemical letters A, C, G and T — a quaternary system. Nature’s digital code runs inside the machinery of the cell, which outputs the proteins that are the building blocks of living organisms. The parallels between DNA and computer code are one reason why we speak of computer viruses, since both are sequences of instructions that subvert the hardware meant to run other, more benign programs. Wired reports on new work which brings out those parallels in a rather dramatic fashion:
a group of researchers from the University of Washington has shown for the first time that it’s possible to encode malicious software into physical strands of DNA, so that when a gene sequencer analyzes it the resulting data becomes a program that corrupts gene-sequencing software and takes control of the underlying computer.
A certain amount of cheating was involved in order to obtain this undeniably impressive outcome. For example, the researchers took an open source compression utility, and then intentionally added a buffer overflow bug to it. They crafted a specific set of DNA letters such that when it was synthesized, sequenced and processed in the normal way — which included compressing the raw digital readout — it exploited the buffer overflow flaw in the compression program. That, in its turn, allowed the researchers to run arbitrary code on the computer system that was being used for the analysis. In other words, the malware encoded in the synthesized DNA had given them control of a physical system.
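As a toy illustration of the bug class involved (not the researchers' actual code — their modified utility was a C program, and the function and buffer size below are invented), here is the unchecked-copy pattern sketched in Python, where the language's bounds checking turns what would be silent memory corruption in C into a visible error:

```python
BUF_SIZE = 16  # invented size for this sketch

def copy_record(data: bytes) -> bytearray:
    """Copy sequencer output into a fixed-size buffer with no length check,
    the same pattern as a classic C buffer overflow."""
    buf = bytearray(BUF_SIZE)
    for i, byte in enumerate(data):
        # In C, i >= BUF_SIZE would silently overwrite adjacent memory,
        # letting crafted input hijack control flow. Python instead
        # raises IndexError, which is why this bug class is rarer here.
        buf[i] = byte
    return buf

copy_record(b"A" * 16)    # fits exactly
# copy_record(b"A" * 17)  # IndexError in Python; memory overwrite in C
```

The crafted DNA strand is, in effect, the `data` argument: once sequenced, it becomes attacker-controlled bytes fed to a routine that assumes a maximum length.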
While they may have added the buffer overflow exploit to the compression program themselves, the researchers pointed out that they found three similar flaws in other commonly-used DNA sequencing and analysis software, so their approach is not completely unrealistic. However, even after setting up the system to fail in this way, the researchers encountered considerable practical problems. These included keeping the DNA malware short, maintaining a certain ratio of Gs and Cs to As and Ts for reasons of DNA stability, and avoiding repeated elements, which caused the DNA strand to fold back on itself.
Clearly, then, this is more a proof of concept than a serious security threat. Indeed, the researchers themselves write in their paper (pdf):
Our key finding is that it is possible to encode a computer exploit into synthesized DNA strands.
However, in the longer term, as DNA sequencing becomes routine and widespread, there will be greater scope for novel attacks based on the approach:
If hackers did pull off the trick, the researchers say they could potentially gain access to valuable intellectual property, or possibly taint genetic analysis like criminal DNA testing. Companies could even potentially place malicious code in the DNA of genetically modified products, as a way to protect trade secrets, the researchers suggest.
If nothing else, this first DNA malware hack confirms that there is no unbridgeable gulf between the programs running in our cells, and those running on our computers. Digital code is digital code.
Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+
Comments on “The Ultimate Virus: How Malware Encoded In Synthesized DNA Can Compromise A Computer System”
Lesson Learned:
Disable your PC or smartphone’s gene sequencer for the time being.
I hope Microsoft issues a patch for XP.
Re: Lesson Learned:
On the other hand…
A recent study found that 20 per cent of sausages sampled from grocery stores across Canada contained meats that weren’t on the label. Including horse meat. Meanwhile, DNA tests showed Subway chicken sandwiches could contain just 50% chicken. The majority of the remaining DNA was from soy.
Companies are already putting web browsers and grocery list apps and whatnot in refrigerators. We may be just one manufactured scare and sales pitch away from gene sequencers too. And Russia using your fridge to attack the Baltic States.
Re: Re: Lesson Learned:
From mutant chickens to Russia fridgepocalypse. I like this thread.
Re: Re: Lesson Learned:
Better horse or soy DNA than human DNA. The last hot dog report showed a number of brands with human DNA in them. :O
Reminds me of how rouge OTA broadcasts can hack TVs. Moral of the story: all inputs are potential attack surface.
Re: Re:
Or numbers are numbers. We already knew that numbers from different number-systems are interchangeable to some degree.
This is merely a proof that a base-4 system (arguably a higher base, but that gets more complex) can translate into a base-2 system.
It also provides a caution against expanding the range of IP-able subject matter too far, if a minimal chance of random repetition is indeed still the measure…
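The base-4-to-base-2 translation the commenter describes is purely mechanical. A minimal sketch (the A/C/G/T digit ordering below is an arbitrary choice for illustration, not any sequencing standard):

```python
# Map each base to a base-4 digit, then view the same value in binary.
DIGITS = str.maketrans("ACGT", "0123")  # arbitrary assignment

def dna_to_int(seq: str) -> int:
    """Interpret a DNA string as a base-4 numeral."""
    return int(seq.translate(DIGITS), 4)

n = dna_to_int("GATTACA")
print(n, bin(n))  # same value, quaternary source, binary view
```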
Re: Do you eat the rouge ones last?
Well duh. Why would you monitor red broadcasts anyway?
Re: Re: Do you eat the rouge ones last?
+1 for being a Smarties-pants.
So if you happen to have a sequence that fits one of the exploits, you’d become Skynet?
Here’s hoping John Oliver has the right sequence 😀
Re: Re:
It’s only a matter of time until a supercar is equipped with a DNA reader instead of a key fob or fingerprint reader. It’s only a matter of time until we hear of cloud-connected supercars picking up malware. The two technologies mix.
Soon after, Jeremy Clarkson tries one out. And so begins the next reboot of Planet of the Apes.
Re: Re: Re:
How is it this doesn’t have more LOL votes?
Re: Re: Re: Re:
American readers may not be familiar with Clarkson’s co-hosts often calling him an orangutan. And so they might flag my post as abusive, possibly negating any LOL votes.
Either that or I’m just not funny.
Re: Re: Re:2 Re:
Funny, funny looking, what’s the difference? =P
Re: Re: Re:2 Re:
Because probably a lot of people marked it insightful rather than funny. I did.
I’m predisposed to believe that the fundamental issue of software vulnerabilities is due to poor “engineering tolerances”.
For instance, how many times have careless input routines (user or I/O) broken a program or created security holes?
When OOP (object-oriented programming) was pushed in earnest in the early 90s, I thought it might have been too soon (and likely too sloppy). To be clear, OOP, or something like it, would eventually be necessary. However, it seemed OOP (in place of structured programming techniques) created a situation where programmers often didn’t know or didn’t have a handle on the code that was in their own software. Maybe the timing of OOP’s dominance promoted undisciplined programming behavior and traditions that we are still suffering from now…
Or, I could be oversimplifying.
Re: Re:
It’s possible it made things slightly worse (or better)… but "Buffer overflows were understood and partially publicly documented as early as 1972, when the Computer Security Technology Planning Study laid out the technique: ‘The code performing this function does not check the source and destination addresses properly, permitting portions of the [kernel] to be overlaid by the user. This can be used to inject code into the [kernel] that will permit the user to seize control of the machine.’"
Re: Re: Re:
Exploits weren’t considered much of an issue early on because the only thing they could do was crash YOUR computer. Total damage? You rebooted your computer and MAYBE lost anything you were typing that you failed to save. No biggie.
These days, idiots are putting more and more IMPORTANT systems online that have no business being on a public net. Some shouldn’t even be hooked to a general purpose computer… some idiot employee WILL eventually run malware on it.
Anything TRULY important should be hooked to at most a dedicated computer that has no ability to run anything but the dedicated software for the system it controls. Don’t give the idiots a chance to exploit the system or someone will.
Re: Re: Re: Re:
The quoted text was about 1972, when essentially nobody had their own computer. Everything back then was cloud-based—sorry, "time-shared".
Re: Re: Re:2 Re:
‘Your’ meaning an individual or company or whatever. Depending on the time period, it could mean any or all of those. Before the late 70s, it mostly wouldn’t refer to an individual. Through most of the 80s and 90s, it would be mostly individual PCs. I was referring in a general sense to the fact that most systems weren’t tied into networks. Once nets started catching on, most networks were local. Today, morons are hooking almost everything into the global network, whether they should or not.
First I’m seeing this news but I bet someone in the media will use this study to say we must start sequencing everyone or the world will end.
Nothingburger here. The only news is that the gene sequencing software has exploits, which one expects from such highly specialized, low customer count software.
So they crafted the input from DNA? Cute but dumb. If they streamed the same GCATs into stdin they’d get the same result.
Re: Re:
“You wouldn’t download a ribonucleic acid, would you?”
Seen it already
I saw it in the historical archives where the gene sequences reprogrammed the optical emitter of the tricorder to produce a holographic image and audio message.
This is depressing because the first impression I got from this article is “hey, there may be a way to counter the possibly-inevitable mandatory DNA sequencing for every citizen in the future!”
It's possible to look at data without running data as a program
All this means is we need to adequately error trap our DNA sequence analysis software much the way we’d error trap a website or a compiler.
Sure, any given analysis program may have vulnerabilities, but I suspect that if the software is sufficiently maintained, and there are multiple applications in use, it’d be inefficient to try to protect GMO strains by coding them with malware.
You could still use the GMO as a data device to transport the malware. But that’s pure spy tradecraft.
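A minimal sketch of that kind of error trapping — reject anything outside the expected alphabet and size before it reaches downstream tools. (The length cap and function name here are invented for illustration; real pipelines would also need to handle ambiguity codes like N.)

```python
VALID_BASES = frozenset("ACGT")
MAX_READ_LEN = 10_000  # invented cap for this sketch

def validate_read(seq: str) -> str:
    """Return a normalized read, or raise ValueError on hostile input."""
    seq = seq.strip().upper()
    if len(seq) > MAX_READ_LEN:
        raise ValueError(f"read too long: {len(seq)} bases")
    bad = set(seq) - VALID_BASES
    if bad:
        raise ValueError(f"unexpected symbols: {sorted(bad)!r}")
    return seq

validate_read("acgtACGT")  # returns "ACGTACGT"
```

Validation alone wouldn't have stopped the researchers' exploit — their strand was legal DNA — but bounding lengths and alphabets up front shrinks the attack surface the downstream compressor and analysis code have to survive.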