5 Questions: David Relman on risks of creating new pathogens

Biosecurity expert David Relman, MD, asserts that a better approach is needed for assessing the risks and benefits of research involving the creation of new and dangerous infectious agents.


David Relman, MD, a professor of infectious diseases and of microbiology and immunology, and co-director of the Center for International Security and Cooperation, is a member of the Cambridge Working Group. The coalition of scientists argues that the creation of potential pandemic pathogens in labs should be curtailed until the benefits and risks of such work, as well as how it’s conducted, can be better assessed.

Coincidentally, when the group formed earlier this summer, the public was just beginning to learn about back-to-back mistakes involving dangerous pathogens — anthrax, H5N1 flu virus and smallpox — in government labs. In a statement, the Cambridge group says the incidents “remind us of the fallibility of even the most secure laboratories, reinforcing the urgent need for a thorough reassessment of biosafety.”

Yet another group, calling itself Scientists for Science, was created in the wake of the Cambridge statement to promote the usefulness of experiments on dangerous pathogens and to argue against limiting them.

But as Relman, a biosecurity expert and the Thomas C. and Joan M. Merigan Professor, points out, the two sides are not too far apart in their views. He recently responded to some questions from Paul Costello, chief communications officer at the School of Medicine, about the debate. [Listen to Costello's 1:2:1 podcast with Relman.]

Q: How did this whole issue begin to germinate?

Relman: It was really the re‑ignition of an issue that has percolated for many, many decades in biology. If you were to step back and think about some of the earlier incarnations of this discussion, it would take you back to the 1970s — to the early debate about the possible risks associated with recombinant DNA technology.

In recent years, it’s become a more frequent topic of discussion, I think, because of the ongoing revolution in the biological sciences and the heightened risks and benefits that arise from our newfound capabilities in the laboratory. Scientists today routinely achieve goals that one could only dream about back in the 1970s.

Q: What concerns you the most about the creation of these pathogens? What’s the greatest fear you have?

Relman: My greatest fear is that someone will create a highly contagious and highly pathogenic infectious agent that does not currently exist in nature, publish its genetic blueprint, and then either allow it to escape the laboratory by accident or enable a malevolent person or persons to synthesize the agent and release it deliberately. Although these may be unlikely scenarios, they could have catastrophic consequences, which is why I and others feel that we need to sensitize everyone to these possibilities and decide how to manage these risks ahead of time.

I want to be clear: I am not opposed to laboratory work on dangerous pathogens, especially if they are known to exist in nature. Rather, I am opposed to high-risk experiments and, in particular, those that seek to create novel, dangerous pathogens that cannot be justified by well-founded expectations of near-term, critical benefits for public health — benefits that clearly outweigh the risks, and benefits that cannot be achieved through other means.  

Unfortunately, there is a growing list of errors that have occurred through careless or simply imperfect human behavior. These errors take on much greater consequence when the work involves transmissible agents.

Q: I heard you say on NPR’s “Morning Edition” that you wanted a larger public debate of this issue. What do you mean by that? How would you like to see the public engaged?

Relman: In the 1970s — in 1975, in particular — there was the famous Asilomar Conference, which took place down on the Monterey Peninsula near Pacific Grove. There was a deliberate effort by the scientists at that meeting, who were the pioneers of recombinant DNA technology, to include outsiders in the early discussions about the appropriateness of the experiments and the management of risk.

They invited the press. They invited some lawyers. They invited clergy and a few others that they thought might represent other segments of society, because the work posed risks for us all. The same can be said of biological work being done today. Don't we all deserve some place at the table?

The real challenge now is to decide who can represent the key stakeholders in society and how you gather them together. 

I think it’s very important that we include nonscientists, but we then need to make sure that they understand the science well enough to be able to participate in a discussion that includes technical issues.

Q: The Scientists for Science contend that, in contrast to recombinant DNA research at the time of Asilomar, studies on dangerous pathogens are already subject to extensive regulations. How do you respond to that?

Relman: What they say is true. Furthermore, I think the vast majority of those who have spoken about risks in science fully appreciate the much larger benefits that are usually a part of the science that we all do. 

I began my career working on pathogens, on disease‑causing bacteria. I fully believe that that kind of work, in general, is absolutely essential. We don’t have any disagreement on that. The place where we may disagree is on whether we are willing to acknowledge that there may be experiments — probably few and far between — that perhaps ought not to be undertaken because of an unusual degree of risk. Just because a scientist can think up an experiment doesn’t mean it should be performed.

Q: You said in the NPR interview, “Every time that one of these experiments comes up, it just ups the ante a bit. It creates additional levels of risk that force the question, Do we accept all of this?” Can you expand on that?

Relman: Yes. I’m concerned because it almost feels as though over the past, say, five years, we’ve been on what’s really a slippery slope toward tacit acceptance of any experiment that any scientist might propose as of interest.

We tend to fall back on the assumption that all science that a well‑meaning, or seemingly well‑meaning, scientist undertakes is done for the benefit of the knowledge that might ensue. What I worry about, though, is the increasing number of experiments that are undertaken not because someone has thought about the discrete benefits for society, but because the experiment just seemed cool or neat, or reflected an I-wonder-if-it-can-be-done kind of approach.

When you’re dealing with disease‑causing organisms — especially when you deliberately manipulate them to see if they can become even more virulent or transmissible — you have the obligation to say, “An experiment shouldn’t simply be done because it’s interesting or cool or I wonder if it can be done.”

You have to be extremely mindful of the risks that you've accepted for yourself and for everyone else in doing that experiment. You have to stop and say, "What are the really concrete benefits that we will all realize in the next year or so, not just at some ill-defined time in the future, which I can then try to weigh against the risks that this experiment also brings?"

About Stanford Medicine

Stanford Medicine is an integrated academic health system comprising the Stanford School of Medicine and adult and pediatric health care delivery systems. Together, they harness the full potential of biomedicine through collaborative research, education and clinical care for patients. For more information, please visit med.stanford.edu.
