The Only Safe Place for Manufacturing Killer Viruses
By: Patrick Kiger
WARNING: You may be a bit shocked in about 500 words from now, when I propose putting a few of the world's leading virus researchers into permanent exile in orbit — confining them to space stations guarded by particle-beam robots. But bear with me.
You may have seen this New York Times article about the ominously named National Science Advisory Board for Biosecurity's decision to reverse its previous stance and recommend that scientists be allowed to publish details of how they created a new, super-virulent version of the H5N1 bird flu virus. The natural version of H5N1 is a potent killer. But the new manmade version, in contrast, can spread directly among people — a truly nifty achievement, if you're the kind of person who gets excited about the sort of invention that inspires this cheery Gizmodo headline: "Engineered Avian Flu Could Kill Half the World's Humans."
I'm feeling a little nostalgic for the good old days, when genetically engineered deadly super-microbes existed only in science fiction. As it turns out, the potential contagion that might wipe out a good portion of our species was hatched in the Netherlands, a nation that we generally think of in terms of benign eccentricities — wooden shoes, chocolate-flavored vodka, and that scene in the 1994 movie Pulp Fiction, in which John Travolta's character extols the virtues of the "Royale with Cheese." Meet Ron Fouchier, a virologist at Rotterdam's Erasmus Medical Center, who acknowledged in a recent Science magazine interview that he and his team have created "probably one of the most dangerous viruses you can make." As the article explains:
The virus is an H5N1 avian influenza strain that has been genetically altered and is now easily transmissible between ferrets, the animals that most closely mimic the human response to flu. Scientists believe it's likely that the pathogen, if it emerged in nature or were released, would trigger an influenza pandemic, quite possibly with many millions of deaths.
As this story from the German magazine Spiegel details, Fouchier and his colleagues created the new organism not because they wanted to wipe out humanity, but in an effort to identify mutations responsible for the speed at which deadly pandemics spread. Eventually, Fouchier hopes to develop an "early warning system" that would sound the alarm about killer infectious diseases before they spread.
Armies of scientific, media and government critics have raised alarms about the work by Fouchier and other scientists who've been pursuing similar experiments. They're worried that security breaches will result in superbugs escaping from the laboratory and infecting the population, or that terrorists or malevolent regimes will get their hands on the scientists' research and use it to unleash a new version of the Black Death, an epidemic which killed off much of the population of Europe in the mid-1300s. In January, Fouchier and others who've been working on superbugs reluctantly agreed to a temporary moratorium on research that enhances pandemic pathogens' ability to infect people and animals.
But the debate has continued to rage: Should scientists be able to publish papers describing their work with super-viruses in detail? Initially, the National Science Advisory Board for Biosecurity, concerned that the info might be used by the likes of Al Qaeda, recommended that it be published only in redacted form, with key details withheld. But at the end of March, the board abruptly reversed its position, saying that it had decided that, while the papers still presented security concerns, "the data described in the...manuscripts do not appear to provide information that would immediately enable misuse of the research in ways that would endanger public health or national security."
Personally, that wiggle-word "immediately" sets off alarms in my head. Just as I'm not too reassured by Fouchier's tepid reassurances that the virus is not likely to escape from his lab, I'm not convinced that somebody out there in a lab someplace won't figure out how to connect the dots and create an epidemic. As the New York Times recently reported, some experts fear that the growing, largely unregulated community of DIY biologists makes that scenario all the more likely.
Here's my solution: If scientists really want to genetically engineer monster viruses, fine. But they should have to do it by remote control, with the actual experiments conducted by robots in an orbiting space station, while other robots stand guard with particle beam weapons to prevent North Korean astronauts from getting any notion about breaking in. All the data transmissions from the orbital lab would be encrypted, and whatever papers the scientists produced wouldn't be published on the Internet or in scientific journals. If you really have a legitimate need to know the details of their findings, you'd have to go to a secure facility someplace and have them read aloud to you. And then the FBI and the Centers for Disease Control and Prevention would keep you under continuous surveillance from that point on, just to make sure you don't do something stupid with it.
But that's just what I think. Post your opinion below. We promise not to redact it. Or at least we probably won't.