How Digital Identities Challenge Traditional Espionage
It used to be so simple. An intelligence officer could fly to a country, change passports and, with a false identity, emerge as a completely different person. But those days are long since over. Biometrics and facial-recognition technologies can easily detect people travelling on false identities. And even if you can travel on false documents, a simple Google search will expose the absence of an online profile and digital legend.
Solving this problem is a generational challenge for defense and intelligence agencies. As technology rapidly develops—including encryption, smart cities and generative AI—security agencies and defense communities have a golden window to research, develop and build new identity technologies. Failure to develop the right technology now will change espionage forever.
There are two related capability challenges: creating genuine-looking false profiles for offensive intelligence operations and, conversely, developing the ability to detect artificially generated profiles and identify who is behind them. This is a classic poacher-turned-gamekeeper challenge that domestic and foreign intelligence agencies must navigate: they need to build the capabilities to catch the bad guys while devising ways to defeat those same investigative technologies themselves.
In a world where Google, Meta, commercial data brokers and even your bank know a lot more about you than the government does, creating a false identity is hard. Governments still have one big advantage, however: they remain the authority on, and the ultimate backstop for, authenticating a person's identity, typically by validating birth certificates and passports.
Australia, like many countries, has embraced technology as a partial solution to the identity-authentication challenge. It adopted an enhanced tax file number system in 1988 following the defeat of the then government's attempt to introduce a national identity card. Backed by voiceprint recognition for authenticating identity, the tax file number is now used for data matching across much of the Australian government.
Fraudsters, adulterers, government agents and police forces are perhaps the only people with an overwhelming need for convincing deepfake technologies. An arms race is developing. As companies build more security and authentication into generative-AI tools (such as embedded watermarks) and deepfake-detection tools improve, it becomes harder to use generative tools nefariously. Even if you can spoof official forms of identification (as Russian sleeper agents were recently shown to have done via South American countries) and generate a realistic digital twin, your online profile, or lack thereof, will still give you away.