“Designer bugs”: how the next pandemic might come from a lab


Finding the best ways to do good. Made possible by The Rockefeller Foundation.

This week, diplomats from around the world are meeting in Geneva, Switzerland, as part of an annual gathering of state parties for the Biological Weapons Convention (BWC). The BWC has an important mandate: It prohibits the 182 countries that have signed on and ratified the convention from developing, producing, and stockpiling biological weapons.

The BWC, and the biosecurity community broadly, has historically focused on existing pathogens with clear potential to be used as biological weapons, such as anthrax and the agents causing botulism and Q fever. In addition, health security experts are worried about the “next big one” — the next global pandemic. Pandemic diseases are often zoonotic, meaning they jump from animals to humans. Zoonotic diseases like Ebola, Zika, SARS, and HIV emerge when, say, the wrong pig meets up with the wrong bat — and then meets the wrong human.

The emergence of such diseases depends a great deal on spontaneous genetic mutations and circumstantial factors. So here’s a scary thought: Possible future pandemics may not depend on the chance meeting of different animal species and chance mutations, but may be deliberately designed instead. New tools from the field of synthetic biology could endow scientists with the frightening ability to design and manufacture maximally dangerous pathogens, leapfrogging natural selection.

The threat is very much on the minds of security officials. This past May, the Johns Hopkins Center for Health Security (CHS) led an exercise involving former US senators and executive branch officials on how the country would respond to an international outbreak of an engineered pathogen. In this fictional scenario, a terrorist group constructed a virus that was both deadly and highly contagious. More than a year into the made-up pandemic, the worldwide death toll was soaring past 150 million, the Dow Jones had fallen by 90 percent, and there was a mass exodus from cities amid famine and unrest.

In biotech, the story of the past several decades has been one of exponential progress. Just 75 years ago, we were not even confident that DNA was the primary material governing genetic heredity. Today, we are able to read, write, and edit genomes with increasing ease.

But biotechnologies are dual-use — they can be used for both good and ill. We fear that with even just current capabilities, an engineered pandemic could join the growing list of seismic changes made possible by biotechnological advances. Sufficiently capable actors could work to resurrect the deadliest pathogens of the past, like smallpox or Spanish flu, or modify existing pathogens such as bird flu to be more contagious and lethal. As genome engineering technologies become more powerful and ubiquitous, the tools necessary for making these modifications will become increasingly accessible.

This leads to the terrifying specter of independent actors intentionally (or unintentionally) engineering pathogens with the potential to inflict worse harm than history’s deadliest pandemics. No obvious physical or biological constraints preclude the construction of such potent biological weapons. According to biosecurity expert Piers Millett, “If you’re deliberately trying to create a pathogen that is deadly, spreads easily, and that we don’t have appropriate public health measures to mitigate, then that thing you create is amongst the most dangerous things on the planet.”

Mitigating this risk is shaping up to be one of the major challenges of the 21st century — not only because the stakes are high, but also because of the myriad obstacles standing between us and a solution.

The technologies that help us might also hurt us

Natural pandemics can be horrific and catch us completely off guard. For example, three years elapsed between the first officially documented US AIDS cases in 1981 and the identification of HIV as its cause. It took another three years to develop and approve the first drug treating HIV. While antiretroviral treatments now allow those living with HIV to manage the disease effectively (that is, if they can afford the treatment), we still lack a promising HIV vaccine.

Yet as ill-equipped as we may be to fight newly emergent natural pathogens, we are even less prepared to cope with engineered pathogens. In the coming decades, it may become possible to create pathogens that fall well outside the range of infectious agents modern medicine has learned to detect, treat, and contain.

Worse yet, malicious actors might build disease-causing microbes with features strategically tailored to thwart existing health security measures. So while advances in the field of synthetic biology will make it easier for us to invent therapeutics and other technologies that can defend us from pandemics, those very same advances may allow state and nonstate actors to design increasingly harmful pathogens.

For example, new gene-synthesis technologies loom large on the horizon, allowing for the automated production of longer DNA sequences from scratch. This will be a boon for basic and applied biomedical research — but it also will simplify the assembly of designer pathogens.


Compared to other weapons of mass destruction, engineered pathogens are less resource-intensive. Although malicious actors would currently need university-grade laboratories and resources to create them, a bigger obstacle tends to be access to information. The limits of our knowledge of biology constrain the potential of any bioengineering effort. Some information, like how to work proficiently with a specific machine or cell type, can be acquired only through months or years of supervised training. Other information, like annotated pathogen genome sequences, may be easy to access through public databases, such as those maintained by the National Center for Biotechnology Information.

If information such as pathogen genome sequences or synthetic biology protocols is available online, it becomes much easier for malicious actors to build their own pathogens. Even when such information isn’t posted publicly, hackers can steal it from the databases of biotechnology companies, universities, and government laboratories.

Preventing damage from engineered pathogens is complicated by the fact that it takes only one lapse, one resourceful terrorist group, or one rogue nation-state to wreak large-scale havoc. Even if the majority of scientists and countries follow proper protocols, a single unilateral actor could imperil human civilization.

And some wounds can be self-inflicted. Between 2004 and 2010, there were more than 700 incidents of loss or release of “select agents and toxins” (i.e., scary stuff) from US labs. In 11 instances, lab workers acquired bacterial or fungal infections. In one instance, a shipment of a harmful fungus was lost — and, according to the FBI, destroyed — in transit. In a world in which well-meaning but sometimes careless biologists are creating dangerous organisms in the lab, such accidental release events could prove even more frightening.

A global problem

Like naturally occurring pandemics, engineered pandemics will not respect national borders. A contagious pathogen released in one country will spread to others. Actions that protect against engineered pathogens are an example of a global public good: Since a deadly engineered pathogen would adversely affect countries around the world, preventing its release is a service that benefits the whole world.

A fundamental challenge of global public goods is that they tend to be underprovided: Each country would prefer to free ride on the efforts of others rather than bear the cost of providing the good itself, if it can get away with it.

This doesn’t mean that countries won’t do anything to provide global public goods; they just won’t do as much as they should. For example, a country such as the United States will consider the potential damage an engineered pathogen could wreak on its 325 million people, and it will take actions to prevent this from happening. However, the actions it takes won’t be as extensive as they would be if it were to consider the toll an engineered pathogen could take on the planet’s 7.6 billion people.

To address this dilemma, world leaders created the Biological Weapons Convention in the 1970s. The BWC has the important goal of constraining bioweapons development; in practice, it has been ineffective at verifying and enforcing compliance.

Unlike the BWC, the major nuclear and chemical weapons treaties have extensive formal verification mechanisms. The Nuclear Non-Proliferation Treaty (NPT), effective since 1970, verifies the compliance of signatories through the International Atomic Energy Agency, which has a staff of about 2,560. The Chemical Weapons Convention (CWC), effective since 1997, verifies compliance through the Organisation for the Prohibition of Chemical Weapons, which won the Nobel Peace Prize in 2013. It has a staff of 500. By contrast, the Implementation Support Unit for the BWC, the convention’s sole administrative body, currently has just four employees.

And bioweapons have specific characteristics that make verification and enforcement difficult compared to chemical and nuclear weapons.

Consider nuclear technology. Nuclear power plants require low levels of uranium enrichment (typically around 5 percent), whereas nuclear weapons require highly enriched uranium (typically above 90 percent). Highly enriched uranium requires large industrial facilities with precise centrifuges. When granted access, it is comparatively easy for inspectors to determine when a facility is being used for the production of highly enriched uranium.

Partly for these reasons, no country has ever developed nuclear weapons while being a party to the NPT. Of the nine nuclear weapons states, the US, the USSR (whose weapons are now exclusively owned by Russia), the UK, France, China, and likely Israel had nuclear weapons before the treaty entered into force. India (first test in 1974) and Pakistan (first test in 1998) never signed the NPT. North Korea withdrew from the treaty in 2003, three years before its first nuclear test in 2006.

In contrast, bioengineered organisms require fewer resources and smaller facilities to make, and it is harder to distinguish organisms being developed for legitimate scientific purposes from those being developed with malicious intent.

Historically, the BWC does not have a good track record of preventing the possession of bioweapons. The Soviet Union maintained a large bioweapons program after it signed on to the BWC in 1975. The South African apartheid regime held bioweapons in the 1980s and ’90s while being a party to the BWC.

Fearing that invasive verification by the BWC could compromise sensitive intellectual property and hurt the competitiveness of its cutting-edge biotechnology sector, the US chose to withdraw from negotiations at the BWC’s Fifth Review Conference in 2001. The US later rejoined those negotiations, but serious measures to improve the BWC’s verification and enforcement mechanisms have not been implemented, and the agreement remains largely ineffective.

Despite this concern about the invasiveness of verification, there is a growing consensus that the BWC must become more effective. The 2015 Bipartisan Report of the Blue Ribbon Study Panel on Biodefense, chaired by Joe Lieberman, the 2000 Democratic vice presidential candidate, and Tom Ridge, the first secretary of homeland security under George W. Bush, called for the vice president and the secretary of state to chair a series of meetings with relevant Cabinet members and experts to come to an agreement on verification protocols that would satisfy US concerns while adequately enforcing compliance with the treaty. The study led to the introduction of the National Biodefense Strategy Act of 2016, which is still awaiting a vote.

In September 2018, the Trump administration released a National Biodefense Strategy, though this document contained little specific information on how the US would strengthen the BWC and didn’t mention Cabinet-level meetings chaired by the vice president, as was recommended by the blue ribbon panel.


Some have questioned the seriousness of the threat posed by bioweapons. For example, in his recent book, Harvard University professor Steven Pinker suggests that “Bioterrorism may be [a] phantom menace.” He claims that terrorists wouldn’t weaponize pandemic pathogens, since their goal is typically “not damage but theater.” Others have suggested that even if terrorists wanted to engineer a pathogen as a weapon, they’d lack the requisite biological knowledge and practical know-how to get the job done.

While it is true (and quite fortunate) that these factors reduce at least the present risk of a biological attack, it is cold comfort. In the coming decades, it will only become easier for nonstate actors to acquire and deploy powerful biotechnologies for ill. And beyond terrorists, state actors also pose serious risks.

For example, Japan launched devastating bioattacks against China during World War II. Japanese Unit 731 dropped bombs filled with swarms of plague-infested fleas on Chinese cities, likely killing hundreds of thousands of civilians. The unit’s commander, Shiro Ishii, found plague to be a potent weapon because it could present itself as a natural epidemic and kill large numbers of people through person-to-person transmission.

In addition, the US had a bioweapons program from 1943 to 1969 that, among other things, made propaganda videos bragging about testing biological weapons on human subjects. The Soviet Union’s covert bioweapons program that it maintained after signing on to the BWC had more employees at its peak in the 1980s than Facebook currently has.

We don’t know what we don’t know — but here’s what we can do

Many questions remain unanswered when it comes to the potentially catastrophic risks posed by engineered pathogens. For example, what is the full spectrum of microbes that cause human disease? And which types of microbes would most likely be used as bioweapons? Research centers such as the Center for Health Security at Hopkins, the Future of Humanity Institute, and the Nuclear Threat Initiative are working hard to answer such questions.

But just because we don’t have answers to all the questions — and don’t even know all the questions to begin with — doesn’t mean there aren’t things we can do to mitigate our risks.

Thinking and acting globally

For starters, we should develop a process to address advancements in biotechnology in the BWC. Currently, the BWC lacks a dedicated forum where the treaty implications of new developments in biotechnology can be discussed. Other international agreements like the CWC have dedicated scientific advisory boards to track and respond to new science and technological changes. The BWC has no such board.

There’s some movement on this issue — the Johns Hopkins Center for Health Security hosted an event in Geneva earlier this week to discuss how the BWC can evolve to address rapid advances in biotechnology. Still, it is crucial to establish a permanent institutional capacity within the BWC to address biotechnological change.

This all connects to another priority: give the BWC’s Implementation Support Unit more resources. The four-person implementation support unit, the convention’s sole administrative body, has immense responsibilities that far exceed its current resources. These responsibilities include supporting and assisting nations as they implement the treaty, administering a database of assistance requests, facilitating communication between the parties, and much more.

But the resources remain minuscule, especially compared to other international treaties. The annual cost allocated to BWC meetings and its implementation support unit is less than 4.5 percent of the cost allocated to the CWC. This inadequate budget sends a grim signal about how seriously the world is currently taking the growing risks from bioweapons.

Another global priority should be finding ways to regulate dual-use gene synthesis technologies. To facilitate their research, biologists regularly order short, custom pieces of DNA from companies that specialize in their manufacture. In 2009, the International Gene Synthesis Consortium proposed guidelines for how gene synthesis companies should screen customers’ orders for potentially dangerous chunks of DNA, such as those found in harmful viruses or toxin genes. Most companies voluntarily follow these guidelines, and they represent 80 percent of the global market.

However, even companies currently applying recommended screening procedures only test whether ordered sequences match those of known pathogens. An engineered pathogen with a novel genome could potentially slip past this filter.
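To illustrate why, here is a minimal, hypothetical sketch (in Python) of list-based screening: an order is flagged if it shares any short subsequence (a k-mer) with a list of sequences of concern. The window size, database, and DNA strings below are all invented for illustration; real screening pipelines are far more sophisticated than this.

```python
# Hypothetical sketch of list-based DNA order screening (not any real
# company's pipeline). An order is flagged when it shares a k-mer
# (length-k window) with a list of sequences of concern.

K = 12  # window size; purely illustrative


def kmers(seq, k=K):
    """Set of all overlapping length-k windows in seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}


# Toy "sequence of concern" -- an arbitrary made-up string,
# not real pathogen DNA.
CONCERN_DB = kmers("ATGGCGTACGTTAGCCGTATTGGCCATAG")


def screen_order(order_seq):
    """True (flagged) if the order shares any window with the list."""
    return not kmers(order_seq).isdisjoint(CONCERN_DB)


print(screen_order("ATGGCGTACGTTAGCC"))   # overlaps the toy list: flagged
print(screen_order("GCGCGCGCGCGCGCGC"))   # shares no window: not flagged
```

Because the check is purely a lookup against known sequences, an engineered genome that shares no windows with the list passes unflagged, which is exactly the gap such screening leaves open.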

Presently, the gene synthesis market is expanding internationally and synthesis costs are falling. It is urgent that governments both independently and multilaterally act to mandate proper screening of sequences and customers. As Kevin Esvelt of MIT writes, “adequately screening all synthesized DNA could eliminate the most serious foreseeable hazards of biotech misuse by nonstate actors.”

Dealing with biorisk on the ground and in the lab

Beyond developing new global standards and practices, we need to adopt more flexible countermeasures against the threat of bioengineered pathogens. As noted in a recent CHS report, “One of the biggest challenges in outbreak response, particularly for emerging infectious diseases, is the availability of reliable diagnostic assays that can quickly and accurately determine infection status.”

Diagnostics based on cutting-edge genome sequencing methods could provide detailed information about all the viruses and bacteria present in a blood sample, including even completely novel pathogens. Meanwhile, as genome sequencing technology becomes less expensive, it could be more widely applied in clinics to provide unprecedented real-time insights into genetic diseases and cancer progression.

We also need to invest more in developing antivirals that hit a wider range of targets. Such broad-spectrum drugs may stand a better chance of slowing the proliferation of an engineered bug than treatments specific to single known pathogens.

And we should also develop “platform” technologies that allow rapid vaccine development. Currently, the process of designing, testing, and manufacturing a vaccine to prevent the spread of a new pathogen takes years. Ideally, we could immunize all at-risk individuals within months of identifying the pathogen. Accelerating vaccine development will require us to innovate new and likely unconventional technologies, such as vectored immunoprophylaxis or nucleic acid vaccines.

Even as we pursue and accelerate such research, we should also be mindful of the possibility of self-inflicted wounds. To avert a terrible accident, the international biomedical community should establish firmer cultural guardrails around pathogen research.

Currently, career advancement, financial gain, and raw curiosity motivate biologists at all levels to push the envelope, and we all stand to gain from their efforts. However, these same incentives can sometimes lead researchers to take substantial and perhaps unjustified risks, such as evolving dangerous strains of influenza to be more contagious or publishing instructions for cultivating a close cousin of the smallpox virus. It’s important for biologists to do their part to promote a culture in which this adventurous intellectual spirit is tempered by caution and humility.

Encouragingly, synthetic biology luminaries like Esvelt and George Church of Harvard University are doing just that, pioneering technological safeguards to mitigate accidental release risks and advocating policies and norms that would make 21st-century biology a less perilous pursuit. As the tools of synthetic biology spread to other disciplines, their example is one that others should follow.

Underlying the prescriptions above is the need to approach the problem with the sense of urgency it warrants. As our biotechnological capabilities grow, so too will the threat of engineered pathogens. An engineered pandemic won’t announce itself with a towering mushroom cloud, but the suffering of the individuals it touches will be no less real.

R. Daniel Bressler is a PhD candidate in the sustainable development program at Columbia University. His research is at the intersection of dual-use technologies, environmental change, and the capacity for collective action in the international system to deal with these issues. Find him on Twitter @DannyBressler1.

Chris Bakerlee is a PhD candidate in molecules, cells, and organisms at Harvard University, where he uses genetic engineering to study how evolution works. Find him on Twitter @cwbakerlee.


