>We now have the tools to create viruses in labs. What happens when someone creates a virus that spreads extremely easily, has greater than 50% mortality, and has an incubation period of several weeks? Something like this, released by a bad guy and without the world having time to prepare, could wipe out more than half the population in a matter of months. Misguided biotech could effectively end the world as we know it
Sam is a smart guy, so I really don't want to come off as sounding like a jerk here, but this grossly underestimates the technical feasibility of creating such a virus. Computer folks routinely overestimate how much biologists actually know about the systems we study. We know jack about how the vast majority of biology works. We have the most fleeting glimpses of understanding that are regularly crushed by the complexity of dynamic systems with nested feedback loops and multiple semi-overlapping redundancies. I won't say it's impossible, but we don't even know enough to know whether those three things (high mortality, long incubation, and ease of transmission) are even possible together. While we can imagine it, there might be biological and epidemiological factors that prevent such a thing from existing.
This also commits the logical fallacy of ascribing superpowers to the bad guys cooking up viruses while assuming the good guys are sitting on their duffs letting bad things happen. H5N1 was a pretty good example of international collaboration. There were academic competitors and industrial labs working around the clock collaboratively on it in the early days before much was known. Whole vaccine divisions at pharmas were all over it. If we're instead talking about a mythical time in the future when we do understand enough biology to engineer something like this, one would have to assume the good guys possess the knowledge to develop countermeasures.
I'm not arguing that pandemics aren't something we should worry about. Europeans were almost wiped out by the plague and in modern times Africa has been decimated by HIV. These are real problems that the human race has faced and will likely face again, irrespective of lab-created stuff. Biotechnology is the primary mechanism by which we're going to be able to survive when the next one comes, wherever it comes from.
> Sam is a smart guy, so I really don't want to come off as sounding like a jerk here, but this grossly underestimates the technical feasibility of creating such a virus.
Nature already created viruses like this [1]. Granted, the mortality rate was "only" 30%, but no biotech was required. Also, people are seriously studying what would happen if H5N1 were released into the wild [2].
That being said, "Don't Panic (tm)". The likelihood of any of these scenarios is extremely, extremely low, and people have been thinking about and preparing for them for decades.
As for what is much more likely to happen with naturally occurring viruses like H7N9, natural variants of H5N1, etc., see my comment from a few days ago [3].
Disclaimer: I'm the first author of [1] and a collaborator of the two first authors of [2].
You are right that no biotech is required to find terrible viruses from history. Access to those viruses is pretty well controlled though: you can get a stern letter from the CDC if you try to order DNA that looks like smallpox or RNA that looks like the 1918 Spanish flu. It's happened to my roommate during his virology research; they do monitor these things [1].
BUT, more importantly--you don't even need fancy biotech to engineer a terrible virus. No sequencing, recombinant DNA, fancy BSL-4 labs, well-educated virologists, none of that. All you need is a captive population, something any self-respecting evil warlord should be able to get. (Certainly the North Koreans, the Taliban, etc. have access to plenty of prisoners.)
The same simple mechanism used by Fouchier (the guy who created the controversial bird/ferret superflu) can be applied to humans: serial passage. Put any moderately bad flu virus in a certain number of prisoners, then expose them via air circulation to other prisoners. Take the sickest people from the second group and expose them to another uninfected group via air circulation. After five or six passages the virus in the last group will show extraordinary virulence and transmissive capabilities because you've applied artificial selection.
To seed the virus and start an attack, you take a few of these prisoners, expose them to the worst virus, and put them on planes to your target country while they are still in the incubation/transmissive period.
Obviously, this is a nightmare scenario that I hope never happens, but the idea that terrorists or evil dictators need fancy science to engineer superbugs is false. The same methods farmers have used for centuries to grow taller corn and leafier lettuce can be applied to viruses by anybody with enough prisoners and moral depravity.
[1]: It should be noted that in-house synthesizing costs are coming down, though, and we won't be able to rely on the safeguard of companies automatically BLASTing ordered sequences against a CDC blacklist for much longer.
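To make the screening idea concrete, here is a toy sketch (Python, using Biopython) of what "BLAST the ordered sequence and flag watchlisted hits" could look like. The watchlist terms and the e-value cutoff are invented for illustration; the screening pipelines synthesis companies actually run are proprietary and far more involved.

    # Toy sketch only: hypothetical watchlist and cutoff, not the actual CDC screening process.
    from Bio.Blast import NCBIWWW, NCBIXML

    WATCHLIST = ["Variola virus", "Bacillus anthracis"]  # illustrative terms, not the real list

    def screen_order(sequence, evalue_cutoff=1e-20):
        """Return watchlist terms that appear in strong BLAST hits for the ordered sequence."""
        handle = NCBIWWW.qblast("blastn", "nt", sequence)  # network call to NCBI BLAST
        record = NCBIXML.read(handle)
        flagged = set()
        for alignment in record.alignments:
            for hsp in alignment.hsps:
                if hsp.expect <= evalue_cutoff:
                    for term in WATCHLIST:
                        if term.lower() in alignment.title.lower():
                            flagged.add(term)
        return sorted(flagged)

    # Usage (slow, hits NCBI's public servers):
    # print(screen_order(open("order.fasta").read()))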
Thanks for injecting some actual peer reviewed research into this discussion. As I mentioned elsewhere, I was mostly responding to the notion that one can engineer a virus with specific properties as opposed to relying on natural pathogens. No argument that there are no shortage of naturally occurring bugs that can kick our butts.
Can you comment on the models that are used for simulating outbreaks? Do they factor in natural evolutionary pressure and change of the virus? In the case of smallpox, you have something fairly stable, but influenza is highly variable over time. I can't imagine how one does that but it would be cool to know if it's possible.
> Do they factor in natural evolutionary pressure and change of the virus?
Not at all. Metapopulation models use some type of stochastic discrete PDEs, and agent-based models just use infection probabilities. This is an oversimplification, but just so you have an idea. You can get the gritty details in the papers above.
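For intuition, here is a stripped-down stochastic SIR compartment model (single population, made-up parameters, chain-binomial draws). The transmission and recovery rates are fixed constants, which is exactly the sense in which these models ignore evolution of the virus; roughly speaking, metapopulation models couple many such populations through travel flows, and agent-based models replace the compartments with individuals.

    # Minimal stochastic SIR sketch: made-up parameters, no mutation, no spatial structure.
    import numpy as np

    def stochastic_sir(N=100_000, I0=10, beta=0.3, gamma=0.1, days=200, seed=0):
        """Discrete-time chain-binomial SIR; the pathogen's parameters never change."""
        rng = np.random.default_rng(seed)
        S, I, R = N - I0, I0, 0
        history = []
        for _ in range(days):
            p_inf = 1.0 - np.exp(-beta * I / N)      # chance a susceptible is infected today
            p_rec = 1.0 - np.exp(-gamma)             # chance an infected recovers today
            new_inf = rng.binomial(S, p_inf)
            new_rec = rng.binomial(I, p_rec)
            S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
            history.append((S, I, R))
        return history

    peak = max(i for _, i, _ in stochastic_sir())
    print("peak simultaneous infections:", peak)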
Starting from scratch, sure (everyone always seems to think of DNA and RNA as super-Legos; I blame Hollywood), but I think it's worth being at least slightly afraid of things like lab-engineered influenzas, given that flu is a very small, well-studied, and easily mutated RNA virus. Cooking up a more deadly (and, yeah, I recognize that we haven't proven that the Unholy Grail of "high mortality, long incubation, and ease of transmission" even exists in any single wild-type bug) influenza -- even if we don't necessarily know the mechanisms of action -- isn't a big lift as far as major bioengineering projects go.
If you're not trying to create a "supervirus," but just shotgun a series of viruses to create maximum health disruption, then (as you know!) it's already within our technical grasp. Expensive, time-consuming, and probably failing 90% of the time, but accessible nonetheless. Considering how bad our hit rate is on the trivalents, I could see a plague of multiple, high-lethality strains with a wide variety of hemagglutinin and neuraminidase antigenic shifts as a plausible bioterror scenario.
It's not a doomsday scenario, since we have such good reporting and analysis infrastructure, and much of the heavy engineering on the problem (like alternate vaccine production methods) was done during the earlier bird flu outbreaks, but it would be a very nasty kind of terror attack.
You are correct, I was mostly interpreting his post to be about a purposefully engineered virus with specific properties. One could throw darts and hope to get "lucky" by recombining and tweaking existing pathogens. That individual would be wrong a lot, but with concerted effort might find something. I'm not sure that is new technology though. This could have been done at least 15 years ago.
One point that I didn't have time to make was that high mortality is not generally evolutionarily advantageous. Even if you cooked up a strain that was especially nasty, it could take considerable effort to prevent it from mutating in the wild into something less so, since the mutated virus would have the survival advantage of not killing its host. Unlike with machines, a creator can't really control what happens to a biological system in the wild as it interacts with the environment. In the scenario you describe, there is considerable uncertainty as to whether it would actually spread as engineered.
This is the crux of what I've clearly done a poor job of saying: we know so little that all these scenarios still rely on incredible amounts of luck more than technology.
His scenario is hypothetical with quick estimates of mortality and incubation period. This virus has been created in the lab and the formula is known. How is he then underestimating the technical feasibility?
>If we're instead talking about a mythical time in the future when we do understand enough biology to engineer something like this, one would have to assume the good guys possess the knowledge to develop countermeasures.
This is naive.
Understanding of weapons != Knowledge of countermeasures (since we're talking about logical fallacies)
The first one is still wrong. "Underestimate the technical feasibility" means, "They think it is less feasible (harder) than it actually is." I think you mean the opposite: they think it is easier than it actually is. I.e. you're trying to say it's harder than we think.
You can say instead: "underestimates the difficulty" or "overestimates the feasibility."
James, you are correct but I think JunkDNA's point is still being made effectively. In fact this was likely an intentional statement meant to illustrate the difficulty involved in intentionally creating a thing--if an error could be easily introduced in a handful of words, how likely is it that a malicious DNA creator will get their supervirus working exactly right?
That is very optimistic of you. Personally I think it was more likely that it was simply a typing error, as he has acknowledged. That's fine, we all make them. I had some difficulty parsing what he was trying to say, which is the only reason I brought it up.
The rest of your argument is very far fetched. One person's difficulty formulating a sentence could not be less related to biologists' collective capabilities in tailoring viruses.
The scenario might seem far-fetched today, but what if biotech made the same kind of progress over the next 50 years as computer technology has over the last 50? A human-engineered super virus might seem as unlikely to a virologist today as an iPhone would have seemed to Alan Turing.
I address this in my second paragraph. You can't assume the advances all happen on the negative side (the ability to perfectly engineer a deadly virus) without corresponding advances on the positive side (enough understanding of biology to combat new viruses).
I think you have a mistaken assumption, though, namely that advances in CREATING dangerous things will be paralleled by advances in the ability to prevent bad things.
Nuclear weapons have been around for over 50 years. We do not yet have __in place__ any ability (other than treaties and fear) to prevent a nuclear holocaust. Missile shields, "Star Wars" -- all of those are of questionable capability, and none of them are deployed.
Given that our only way of preventing nuclear winter is to agree not to launch (and go to war to prevent Bad Guys from getting them?), an option which is not available when dealing with a disease, I'm not optimistic about our future ability to prevent a superbug from wiping out humans.
Certainly, I will grant that it is possible to blindly shoot in the dark and get lucky creating something that we don't understand, but is nevertheless deadly. Absolutely that could happen, and another commenter upthread gives some very plausible scenarios for this that I had not considered. But we've had the technology for blindly shooting in the dark in the lab for probably 20 years at least.
But that's not the point I think Sam was making. I read the article as discussing a purposeful designing of a virus with specific properties. My contention is that the knowledge required to engineer something that can evade the immune system, spread easily, and cause high mortality is very likely the same mechanistic knowledge that would help you to defeat such a virus.
The ballistic technology that can deliver a warhead to a target 12,000 miles away can also deliver a constellation of remote sensing satellites into orbit.
Remote sensing is what held off nuclear holocaust during the Cold War. The ability to reliably and quickly detect and respond to nuclear first strike creates the "mutually assured destruction" strategic framework aka deterrence.
I think the analogy is broken. In the case of the nuclear arms race, it's one technology that destroys (bombs) and a different one that protects (missile defense?). In the case of genetics it's one and the same: engineering organisms.
The thing that defeats viruses is evolutionary pressure, not "engineering organisms".
At this point in time I'd still say that we only have the ability to reintroduce old disappeared pathogens. We do not actually have the ability to design new effective viruses.
For example, in a test case scientists have gone from flu sequence to midscale production (i.e., enough for first responders) of flu vaccine within something like 4 days. The backstory is great: the lead researcher arranged for FedEx to pick up the finished DNA sequences at his home at midnight (I don't remember why it couldn't be picked up at the lab, but there was a reason), and he was worried that his neighbors would think he was a drug dealer. This means one could go to large-scale production (enough for a population) within weeks to months.
Unlike the seasonal flu shot, this vaccine is tailored to the emergent strain.
Frank Herbert wrote this story 30 years ago (The White Plague, 1982). I'd be surprised if he were the first to worry about it. So we may be well along in the 50-year progression is all.
Uh, nope. What does "almost wiped out" mean to you? 30% of population dead is "almost wiped out"? It's horrible, of course, but there were still 70% who stayed alive. Well, I could say that I can jump almost three meters high with that logic. :-)
In the 1970s the world's population was about 3.5 billion and today it is double that. Sure it would be very disruptive, but there would still be plenty of creative people around to pick up the pieces.
There is a point where the breakdown in social order is going to lead to a much greater problem. If the food and energy systems we rely on to supply cities break down then you'll be looking at problems other than the outbreak causing deaths.
If society breaks down then you'll suddenly be looking at countries being unable to support anything close to that number of people.
I think I'd also like to add: where's the motivation? Basically, someone who did this would have to be what? A serial killer with a biotechnology fetish?
I'm sure a profile could be constructed of someone who would have the motivation to do it, but that would of course be a big limitation on the pool of possible people; then out of that group you'd have to limit it to the people capable of doing it, and so forth. And actually, would it really be likely that someone with the skill to do it would also be in the group motivated to do it? Twelve Monkeys aside, I actually think not.
There's a science fiction novel waiting to be written that incorporates the ideas from The White Plague and Ribofunk and the 21st century IT world.
Imagine the genetically-engineered counterpart to the CryptoLocker ransomware. Private and public antivirus research. McAfee for your white blood cells. Imagine ad-supported biotech.
The internet has created a sort of commoditization of fraud. Scam spam, adware, spyware, etc., often run by companies operating right out in public. Professionals openly discuss security from both sides of the line.
Imagine this sort of bizarre professional attitude in a low-cost engineered-virus biotech field. What would the Windows of this world be - a suite of programmable bacteria with a solid API?
>If we're instead talking about a mythical time in the future when we do understand enough biology to engineer something like this, one would have to assume the good guys possess the knowledge to develop countermeasures.
That presupposes that the "good guys" would be the victims here. How about the opposite?
What about regular "democratic" superpowers doing it to poorer countries they want to control, as they have done similar things throughout the colonial and post-colonial era?
The way dictatorships in Latin America and the Middle East got weapons, supplies, and a helping hand from the US, for example, to use against their own people or neighboring countries.
> This also commits the logical fallacy of ascribing superpowers to the bad guys cooking up viruses while assuming the good guys are sitting on their duffs letting bad things happen
It's the other way around. The "good guys" are cooking up viruses and publishing them as science (in this case, at least).
> We know jack about how the vast majority of biology works. We have the most fleeting glimpses of understanding that are regularly crushed by the complexity of dynamic systems with nested feedback loops and multiple semi-overlapping redundancies.
The author is only arguing for ramping up funding of defensive biotechnologies. It is much easier to create a new virus than to find a cure for an existing one. The immune system is simply vastly more complicated than a virus.
> This also commits the logical fallacy of ascribing superpowers to the bad guys cooking up viruses while assuming the good guys are sitting on their duffs letting bad things happen.
[...]
> If we're instead talking about a mythical time in the future when we do understand enough biology to engineer something like this, one would have to assume the good guys possess the knowledge to develop countermeasures.
------------------------
The amount you need to destroy is more or less a constant. When the efficacy of your technologies is limited, a technology that only gives you a small percentage edge over your opponent's - say iron vs. bronze - is survivable for the defender, provided they have an edge in some other area, though not necessarily pleasant. However, as the efficacy of technology increases, even a small relative edge gives you more than enough power to destroy all that you need to in order to remove your opponent forever.
Consider that armies could be separated by hundreds of years of technology in the past and still fight on a roughly equal footing. Technology did not move very fast, nor was it very powerful. Then imagine what an army of today would do to an army of a hundred, or even fifty, years ago. In the Iran-Iraq war, two armies with Cold War-level technology faced each other for eight years. The Iraqi army was, however, swept aside very quickly by a more advanced force.
The timespan in which there's rough parity in power is shortening. Even a small difference in development with respect to time can rapidly become insurmountable when you're dealing with a high rate of change and powerful technologies. What you're defending is more or less static: people, land, resources; they're not getting any more durable, while weapons are always becoming more powerful.
You only need one world-ending plague. The defender has to be on top every time; the attacker only has to surpass them once. There is no chance to adapt to what they make, or to try again, any more than biological evolution can adapt people to a bullet in the head, because any minor adaptation in that direction makes no difference compared to the sorts of forces involved.
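To put rough numbers on that asymmetry (the probabilities here are invented purely for illustration): if the defenders stop any single attempt with probability p, the chance of stopping all of n independent attempts is p^n, which collapses quickly even when p is high.

    # Illustrative arithmetic only: made-up success probability, attempts assumed independent.
    p_defend = 0.99                      # chance defenders win any single round
    for attempts in (10, 100, 500):
        survival = p_defend ** attempts  # chance defenders win every round
        print(attempts, round(survival, 3))
    # prints: 10 0.904, 100 0.366, 500 0.007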
And I don't think we should have much confidence in the idea that the defender is going to be on top every time.
The questions seem to me to be ones of whether a reluctance to destroy the world is characteristic of organised systems, and whether organised systems will always have the edge over individual effort. If we get to a level where someone can create a suitably devastating bio-weapon in their garden shed, will we also be in a position where that's effectively analogous to creating any other outdated weapon in your shed?
I don't know, computer viruses don't lead me to much hope on that point. Enormous energies are being expended to restrain the energies of a few, with no clear victory in sight. The playing field is not always slanted the way we might like.