Democracies Must Regulate Digital Agents of Influence

By Justin Bassi

Published 7 April 2023

As Australia becomes the latest country to ban the Chinese-owned content platform TikTok from government devices, it would be a mistake to limit the public policy debate to traditional state-on-state espionage or major power rivalry.

Such platforms, and the advent of the eerily relatable artificial intelligence tool ChatGPT, are society-changing technologies that cannot be dismissed as benign or treated as a public good beyond the reach of any regulatory or governance process.

ChatGPT and GPT-4, released in recent months by the US company OpenAI, create a sense of intimacy and identification with the user that, as the technology improves, will enable them to affect our thinking to a degree orders of magnitude greater than today's social media.

The name ‘chatbots’ hardly does them justice. ‘Synthetic relationships’ is the description used by some concerned technology commentators.

TikTok, meanwhile, is not just another app. It is influenced by an authoritarian political system while, in turn, having enormous influence in shaping public opinion by controlling what users see and hear.

Although chatbots and TikTok are distinct issues, they converge on one thorny question: how should liberal-democratic governments involve themselves in the use of technology so that citizens are protected from information manipulation, and hence from influence over their beliefs, on an unprecedented scale?

The answer is that it’s time for democratic governments to step in more heavily to protect our citizens, institutions and way of life. We cannot leave a handful of tech titans and authoritarian regimes to rule the space unchallenged, as AI has the potential not only to be a source of information, but to establish a monopoly on truth that goes beyond human knowledge and becomes a new form of faith.

To date, Western governments have largely leaned towards a hands-off approach. Born partly of the ideological struggles of the Cold War, this stance rested on the belief that governments should stay out of the way lest they stifle innovation.

Meanwhile, authoritarian regimes in China, Russia and elsewhere have grasped the value of technology as something they can control and weaponize, including to win information battles as part of broader political warfare. As Russian President Vladimir Putin said of AI in 2017: ‘Whoever becomes the leader in this sphere will become the ruler of the world.’