Dual-use science, technological innovation

Technologies can also be dual use when they yield benefits that were secondary to their original development. An obvious example is the internet: The packet switching that underlies the internet was originally created as a means to communicate between military installations in the event of nuclear war. It has since been released into the civilian domain, allowing you to read this article.

This distinction between military and civilian uses doesn’t always mirror a distinction between good and bad uses. Some military uses, such as those that underpinned the internet, are good. And some civilian uses can be bad: Recent controversies over the militarization of police, in which technologies and tactics meant for war spread into the civilian sphere, show that proliferation in the other direction can be just as contentious.

Dual use in this sense is about control. Both military and civilian uses can be valuable, as long as a country maintains authority over its technologies. That shared value also makes dual use a way to justify expenditures, giving governments an incentive to invest in technology with multiple applications.

For good or for evil
In the January meeting at the NAS, however, the key distinction was between beneficent and malevolent uses. Today this is the most common way to think about dual-use science and technology.

Dual use, in this sense, is a distinctly ethical concept. It is, at its core, about what kinds of uses are considered legitimate or valuable, and what kinds are destructive. For example, some research on viruses allows us to better understand potential pandemic-causing pathogens. That work opens the door to possible countermeasures and helps public health officials prepare. There is, however, a risk that the same research could, through an act of terror or a lab accident, cause harm.

Since 2007, the U.S. National Science Advisory Board for Biosecurity has provided advice on regulating “dual-use research of concern”: any life sciences research that could be misapplied to pose a threat to public health and safety, agricultural crops and other plants, animals, the environment or materiel.

The challenging ethical question is finding an acceptable trade-off between the benefits created by legitimate uses of dual-use research and the potential harms of misuse.

The recent NAS meeting discussed the spread of dual-use research’s findings and methods, and who, if anyone, should be responsible for controlling its dispersal. Options that were considered included:

  • subjecting biology research to security classifications, even in part;
  • relying on scientists to responsibly control their own communications;
  • export controls, of the type used by the Australia Group to address the military/civilian dual use of chemicals.

Participants reached no firm conclusions, and tackling these ongoing issues will fall to the Trump administration.

The other side of the equation, whether we should do some dual-use research in the first place, has also been considered. On Jan. 9, the outgoing Obama administration released its final guidance for “gain-of-function research” that may result in the creation of novel, virulent strains of infectious diseases – which may also be dual use. The guidance recommends, among other things, that to proceed, the experiments at issue must be the only way to answer a particular scientific question and must produce greater benefits than risks. The devil, of course, is in the details, and each government agency that conducts life sciences research will have to decide how best to implement the guidance.

For offense or for defense
There’s a third, little-discussed meaning of “dual use” that distinguishes between offensive and defensive uses of biotechnology. A classic example of this kind of dual use is “Project Clear Vision.” From 1997 to 2000, American researchers set out to recreate Soviet bomblets used to disperse biological weapons. This kind of research treads a fine line between a defensive project – the U.S. maintains Project Clear Vision’s goal was to protect Americans against an attack – and an offensive project that might violate the Biological Weapons Convention.

What is offensive and what is defensive is to some degree in the eye of the beholder. The Kalashnikov assault rifle was designed in 1947 to defend the Soviet Union, but has since become the weapon of choice in conflicts the world over – to the point that its creator expressed regret for his invention. Regardless of intent, whether the weapon is used offensively or defensively in these conflicts depends largely on which end of the barrel one is on.

Regulating science
When scientists and policy experts wrangle over how to deal with dual-use technologies, they tend to focus on the division between applications for good or evil. This is important: We don’t want to hinder science without a valid reason, because it provides substantial benefits to human health and welfare.

However, there are fears that the lens of dual use could stifle progress by driving scientists away from potentially controversial research: Proponents of gain-of-function research have argued that graduate students or postdoctoral fellows could choose other research areas in order to avoid the policy debate. To date, however, only 18 American studies have been put on hold – as a result of safety concerns, much less dual-use concerns – and all were permitted to resume once the policies set out by the White House on Jan. 9 were implemented. As a proportion of scientific research, this is vanishingly small.

Arguably, in a society that views science as an essential part of national security, dual-use research is almost certain to appear. This is certainly the case in the U.S., where the work of neuroscientists is increasingly funded by the military and where the economic competitiveness that emerges from biotech is considered a national security priority.

Making decisions about the security implications of science and technology can be complicated. That’s why scientists and policymakers need clarity on the dual-use distinction to help them weigh the options.

Nicholas G. Evans is Assistant Professor of Philosophy, University of Massachusetts Lowell. Aerin Commins is Ph.D. Student, Global Studies Program, University of Massachusetts Lowell. This article is published courtesy of The Conversation (under Creative Commons-Attribution / No derivative).