AI, Bioterrorism and the Urgent Need for Australian Action

By Greg Sadler

Published 27 November 2024

Today, you’d have to be a top-notch scientist to create a pathogen. Experts worry that, within a few years, AI will put that capability into the hands of tens of thousands of people. Without a new approach to regulation, the risk of bioterrorism and lab leaks will soar.

The US acted a year ago to reduce that risk. With the return of President Trump and his commitment to repeal important executive orders, it’s time for Australia to take action.

The key measure, adopted in an executive order signed by President Biden, is to control not the AI itself but the supply of the synthetic genetic material that would be needed to turn a pathogen design into a physical agent.

Biosafety regulation of Australian laboratories needs tightening, too.

When the genome for variola, the virus that causes smallpox, was published in 1994, the capacity to use that information malevolently had not yet evolved. But it soon did. By 2002, ‘mail-order’ DNA could be used to synthesise poliovirus. In 2018, researchers manufactured horsepox using mail-order DNA. Today, the market for synthetic DNA is large and growing.

Both generative AI, such as chatbots, and narrow AI designed for the pharmaceutical industry are on track to make it possible for many more people to develop pathogens. In one study, researchers ran a pharmaceutical AI system designed to find new treatments in reverse, asking it to find toxic molecules instead. It generated 40,000 potentially lethal molecules in six hours. The lead author remarked on how easy this had been, suggesting that someone with basic skills and access to public data could replicate the study in a weekend.

In another study, a chatbot recommended four potential pandemic pathogens, explained how they could be made from synthetic DNA ordered online and provided the names of DNA synthesis companies unlikely to screen orders. The chatbot’s safeguards didn’t prevent it from sharing dangerous knowledge.

President Biden was alert to risks at the intersection of AI and biotechnology. His executive order on AI safety attracted attention in tech circles, but it also took action on biosafety. Section 4.4 directed government departments to create a framework for screening synthetic DNA, to ensure that suppliers do not produce sequences that could threaten US national security.

Before Biden’s executive order, experts estimated that about 20 percent of manufactured DNA evaded safety screening. Now, DNA manufacturers shipping to the US are obliged to screen orders and to know their customers.