Policy Forum
Apr 2006

Infectious Disease Research and Dual-Use Risk

Maureen Kelley, PhD
Virtual Mentor. 2006;8(4):230-234. doi: 10.1001/virtualmentor.2006.8.4.pfor1-0604.


The anthrax attacks in the United States in late 2001 served as a wake-up call for national security experts and reminded the scientific and medical communities that infectious disease outbreaks are not only caused by Mother Nature. One response to these attacks was to increase funding for research into infectious diseases, including select agents. Select agents are those biological agents and toxins that pose a potential threat to the public if deliberately or accidentally released [1]. Examples of select agents include the avian influenza virus, yellow fever virus, the pox viruses, the Marburg and Ebola viruses, the Japanese encephalitis virus, the anthrax bacillus, and the botulinum neurotoxins [2]. Yet conducting clinical and scientific research on select agents creates a serious ethical dilemma both for the researchers who use the agents and for the public; this situation is referred to as the dual-use risk.

The dual-use dilemma is this: in an effort to respond to existing and emerging infectious diseases, the same scientific information or products intended for good can also fall into the wrong hands and be used to threaten a population in an act of bioterrorism. All of the biological agents mentioned above exist in nature and either pose active infectious threats to large populations or could threaten wider populations in future natural outbreaks. Consequently, aggressive vaccine and treatment research is necessary to prevent and prepare for possible outbreaks. In the US, the Bioterrorism Preparedness and Response Act of 2002 was enacted to facilitate the development of new countermeasures to potential bioterrorist threats by accelerating development programs and stockpiling effective countermeasures [3]. By definition, however, such research entails the use of live samples of an agent, inactive samples of an agent, or infected research animals. When reverse genetic engineering is used, genomic data for select organisms may be generated and shared among investigators. Sometimes researchers will inadvertently create a more virulent strain of an organism while searching for mechanisms to disarm it or to create less virulent strains. Dual-use research may also make a nonpathogen virulent, create a strain that is resistant to antibiotics or antivirals, or develop a strain that is able to evade diagnosis [4]. Dual-use research thus presents several important problems in research ethics and public health policy.

Responsibility and Dual Use

At a fundamental level, dual use raises an old moral question in a new way. Should a person be held morally responsible for outcomes that are not intended and may be largely outside of her control? There are forward-looking and backward-looking versions of this question. The backward-looking version arises when a negative outcome occurs and we wish to hold someone responsible, to lay blame, and to punish. The forward-looking version asks whether a person is morally bound to take preemptive precautions to avoid unwanted future outcomes. Recent public discussions about responsibility and dual-use research in the context of the bioterrorism threat resemble earlier moral debates in the history of science.

Both versions of the responsibility question were posed to Albert Einstein, Niels Bohr, and the scientists of the Manhattan Project, first during the early work on atomic energy and then in the wake of the atomic bombings of Hiroshima and Nagasaki. Einstein and Bohr, in particular, struggled publicly and privately to sort out this question of personal responsibility. As Einstein said in an address at a Nobel Anniversary dinner, "Today, the physicists who participated in forging the most formidable and dangerous weapon of all time are harassed by an equal feeling of responsibility, not to say guilt" [5]. In the context of infectious disease treatment and vaccine development, if a researcher's intentions are good—to contribute to scientific progress or, in our case, to save lives in the event of an infectious disease outbreak—then how can we hold the researcher responsible for unintended and unforeseen malevolent use of the same scientific discovery?

We might ask whether the risk is truly unforeseen. We have some empirical evidence that the risk is probable, though the degree of probability is hard to estimate. We can look to the anthrax attacks in the US in 2001 and the sarin nerve gas attacks in Japan in 1995 as evidence of what is possible. And there are reports of terrorist groups' attempts to acquire the scientific expertise needed to carry out future attacks [6]. Research journal articles from the 1950s and 1960s that describe methods for isolating, culturing, identifying, and producing bacteria, including Bacillus anthracis, were found in former terrorist camps in Afghanistan. Documents were also discovered there that outlined plans for recruiting individuals with PhD-level expertise and attending scientific symposia and conferences. The Japanese terrorist group Aum Shinrikyo, responsible for the 1995 attack on the Tokyo subway, reportedly achieved a much more sophisticated level of weapons development by pursuing an independent bioweapons program in its own laboratories [6].

Although a large-scale biological attack has not yet occurred, reports like these raise valid concerns. If marginally skilled terrorists can accomplish bioweapons development with 50-year-old publications, what could expert terrorists accomplish using current findings and procedures? Given the possibility of the deliberate release of a select agent, one can reason from the public health perspective and argue that good intentions will not mitigate forward-looking responsibility for the consequences of malevolent applications of biodefense research. From the vantage point of ethical foresight, it would be irresponsible for the clinical and scientific communities not to anticipate this dual-use risk and seek preventive, protective measures that will minimize it.

Managing the Dual-Use Dilemma

The responses of the scientific community and of US federal agencies to dual-use research are still evolving, but measures to date reveal a two-pronged strategy: (1) educating the scientific community, and (2) increasing security. To lay the groundwork for the first prong, the US National Academy of Sciences published a detailed report that included recommendations for training molecular biologists and other researchers in the life sciences and educating laboratory staff about the risks of dual use [4]. A new advisory body—the National Science Advisory Board for Biosecurity (NSABB)—has also been formed to oversee and guide the biosecurity response and the education effort. The board is charged with developing a code of conduct that will help researchers in the life sciences to take preventive measures to minimize the risk that research organisms will be stolen or diverted for malevolent purposes [7].

Many universities and research centers are launching educational awareness initiatives aimed at clinical and scientific researchers in infectious diseases who work with select agents [8]. Better security measures will also be essential. This means more rigorous background checks for staff, graduate students, and faculty, and tighter security for laboratories, stored samples, and research data. Most recently, it has meant the expansion of the oversight role of Institutional Biosafety Committees, or IBCs. The details of IBC responsibilities in dual-use research are still being clarified, but IBCs will likely be asked to review research protocols on select agents, much in the way that institutional review boards oversee human subjects research. In the case of dual-use research, IBCs will be responsible for maintaining standards of biosafety and minimizing dual-use risk from within each research institution or university.

A Threat to Scientific Openness

The most controversial intervention in the name of biosecurity pertains to the publication of research data and methods. Until now, the editorial boards of medical and scientific journals have taken responsibility for judging, on a case-by-case basis, whether particular studies should be edited or withdrawn because the data or methods might aid terrorists. Within the last year, several controversial studies have caught the attention of the public, the scientific community, and NSABB.

In one case, the authors synthesized full-length poliovirus cDNA and then transcribed the artificial viral cDNA into viral RNA, thus making viral replication possible. The virus was then used to infect healthy cells to demonstrate that the artificial virus had the biochemical and pathogenic characteristics of poliovirus [9]. The publication of the study raised serious concerns that scientists were offering terrorists potential blueprints for creating the same or similar viruses. Related concerns have been voiced about the online publication of genomic sequences. Genome analysis and nucleotide sequencing are important tools in the study of pathogenic microorganisms and in the development of diagnostic tools and vaccines [10].

Data sharing is central to the practice of science, but should the public have unrestricted access to information that might be used for malevolent purposes? What is the best way to monitor access without hampering free interchange and dialogue within the scientific community? Significantly restricting access to information in response to biosafety concerns could potentially have the dangerous effect of decreasing the transparency of scientific research to the wider public—an important feature of any citizen-supported institution in a liberal society.

Publication is the catalyst for many of the problems of security and responsibility discussed earlier. In cases where published data are already available, or easier methods are already widely known, no benefit would come from censoring the scientific data. The tough case is novel information. Yet even when novel methods are introduced, publication remains the primary means by which scientists share methods and results in the development of preventive measures and countermeasures.

Furthermore, excessive censorship and bureaucratic constraints on research and publication may have a chilling effect on select agent research. One way to balance intellectual freedom of publication against dual-use risk is to rely on existing institutional checks and balances. Editors can be trusted to exercise self-restraint in the publication of data and methods, with the aid of more detailed, scientifically and ethically informed guidelines on dual-use risks. IBCs can serve as resources for scientists at the earlier stages of research, when they should be thinking ahead about potential dual-use concerns in publication. A paper raising such concerns could then be flagged when submitted to editors, and consideration given to publishing partial data or methods, with complete methods available to other investigators and research institutions upon request.

These questions currently frame a heated debate within the scientific community, federal research oversight agencies, and, increasingly, the public media. As we work through the details of managing dual use, it is important to keep in mind the instrumental and symbolic value of transparency in research. Curtailing a terrorist's access to scientific research also curtails access by the general public and other scientists. Lack of transparency is, in that sense, a barrier to scientific progress. In light of the uneasy history of science, maintaining meaningful transparency in politically charged areas of research such as biodefense is a prerequisite for building public trust. And public trust and awareness are central components of emergency preparedness.

The dual-use dilemma has yet to be resolved. The hope is that rigorous public dialogue and discussions within the scientific and medical communities will facilitate the development of educational guidelines, professional agreements, and institutional oversight that will provide reasonable safeguards against dual-use risk without unduly constraining the practice of infectious disease research aimed at the public good.


  1. See the Centers for Disease Control and Prevention, Select Agent Program for the latest information on select agents. Available at: http://www.cdc.gov/od/sap/. Accessed March 16, 2006.

  2. The official list of select agents is determined by the US Department of Health and Human Services (HHS) and the US Department of Agriculture (USDA) and can be found in US federal regulations: Agricultural Bioterrorism Protection Act of 2002, 7 CFR Part 331 (2005); Possession, use, and transfer of select agents and toxins, 9 CFR Part 121 (2005); and HHS final rule, Office of Inspector General, 42 CFR Part 73 (2005).

  3. Bioterrorism Preparedness and Response Act, 42 USC 247d-6(h)(4) (2002).

  4. National Academy of Sciences. Committee on Research Standards and Practices to Prevent the Destructive Application of Biotechnology. Biotechnology Research in an Age of Terrorism: Confronting the "Dual-use" Dilemma. Washington, DC: National Academies Press; 2004. Available at: http://fermat.nap.edu/books/0309089778/html. Accessed March 16, 2006.

  5. Einstein A. Ideas and Opinions. New York, NY: Three Rivers Press; 1954:115.

  6. Petro JB, Relman DA. Understanding threats to scientific openness. Science. 2003;302(5652):1898.

  7. For further information about NSABB, see: http://www.biosecurityboard.gov/.

  8. For a publicly available training module on dual use, designed for graduate students and junior investigators, see: Southeast Regional Center of Excellence for Biodefense and Emerging Infections. The dual-use dilemma in biological research. In: Core I. Policy, Ethics, and Law Core of the Southeast Regional Center of Excellence for Biodefense and Emerging Infections. Available at: http://www.serceb.org/modules/serceb_cores/index.php?id=3. Accessed March 2, 2006.

  9. Cello J, Paul AV, Wimmer E. Chemical synthesis of poliovirus cDNA: generation of infectious virus in the absence of natural template. Science. 2002;297(5583):1016-1018.
  10. Clayton RA, White O, Fraser CM. Findings emerging from complete microbial genome sequences. Curr Opin Microbiol. 1998;1(5):562-566.



The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.