The Big Promises and Potentially Bigger Consequences of Neurotechnology

In comparison to DARPA’s decades of interest in the brain, China’s focus on neurotechnology is relatively recent but advancing rapidly. In 2016, the Chinese government launched the China Brain Project, a 15-year scheme intended to bring China level with and eventually ahead of the US and EU in neuroscience research. In April, Tianjin University and state-owned giant China Electronics Corporation announced they are collaborating on the second generation of ‘Brain Talker’, a chip designed specifically for use in BCIs. Experts have described China’s efforts in this area as an example of civil–military fusion, in which technological advances serve multiple agendas.

Australia is also funding research into neurotechnology for military applications. For example, at the Army Robotics Expo in Brisbane in August, researchers from the University of Technology Sydney demonstrated a vehicle that could be remotely controlled via brainwaves. The project was developed with $1.2 million in funding from the Department of Defence.

Beyond governments, the private-sector neurotechnology industry is also picking up steam; 2021 is already a record year for funding of BCI projects. Estimates put the industry at US$10.7 billion globally in 2020, and it’s expected to reach US$26 billion by 2026.

In April, Elon Musk’s Neuralink demonstrated a monkey playing Pong using only brainwaves. Gaming company Valve is teaming up with partners to develop a BCI for virtual-reality gaming. After receiving pushback on its controversial trials of neurotechnology on children in schools, BrainCo is now marketing a mood-altering headband.

In Australia, university researchers have worked with biotech company Synchron to develop the Stentrode, a BCI that is implanted via the jugular vein and allows patients with limb paralysis to operate digital devices. It is now undergoing human clinical trials in Australia and the US.

The combination of big money, big promises and, potentially, big consequences should have us all paying attention. The potential benefits from neurotechnology are immense, but they are matched by enormous ethical, legal, social, economic and security concerns.

In 2020, researchers conducted a meta-review of the academic literature on the ethics of BCIs. They identified eight specific ethical concerns: user safety; humanity and personhood; autonomy; stigma and normality; privacy and security (including cybersecurity and the risk of hacking); research ethics and informed consent; responsibility and regulation; and justice. Of these, autonomy, and responsibility and regulation, received the most attention in the existing literature. The researchers also argued that the potential psychological impacts of BCIs on users need to be considered.

While Chile is the first and so far only country to legislate on neurotechnology, international bodies such as the OECD are also examining the issue seriously. In 2019, the OECD Council adopted a recommendation on responsible innovation in neurotechnology, which aimed to set the first international standard for ethical research and development of neurotechnology. Next month, the OECD and the Council of Europe will hold a roundtable of international experts to discuss whether neurotechnologies need new kinds of human rights.

In Australia, the interdisciplinary Australian Neuroethics Network has called for a nationally coordinated approach to the ethics of neurotechnology and has proposed a neuroethics framework.

These are the dawning days of neurotechnology. Many of the crucial breakthroughs to come may not yet be so much as a twinkle in a scientist’s eye. That makes now the ideal moment for all stakeholders—governments, regulators, industry and civil society—to be thinking deeply about the role neurotechnology should play in the future, and where the limits should be.

Elise Thomas is an open-source intelligence analyst at the Institute for Strategic Dialogue. This article is published courtesy of the Australian Strategic Policy Institute (ASPI).