Governments Continue Losing Efforts to Gain Backdoor Access to Secure Communications
The Clipper Chip was based on the idea of a “golden key,” namely, a way for those with good intentions – intelligence services, police – to access encrypted data, while keeping people with bad intentions – criminals, terrorists – out.
Clipper Chip devices never gained traction outside the U.S. government, in part because their underlying encryption algorithm was classified and couldn’t be publicly peer-reviewed. However, in the years since, governments around the world have continued to embrace the golden key concept as they grapple with the constant stream of technology developments reshaping how people access and share information.
Following Edward Snowden’s disclosures about global surveillance of digital communications in 2013, Google and Apple took steps to make it virtually impossible for anyone but an authorized user to access data on a smartphone. Even a court order was ineffective, much to the chagrin of law enforcement. In Apple’s case, the company’s approach to privacy and security was tested in 2016 when it refused to build a mechanism to help the FBI break into an encrypted iPhone used by one of the perpetrators of the San Bernardino terrorist attack.
At its core, encryption is very complicated math. And while the golden key concept continues to hold allure for governments, it is mathematically difficult to achieve with an acceptable degree of trust. Even if it were viable, implementing it in practice would make the internet less safe. Security experts agree that any backdoor, even one hidden or controlled by a trusted entity, is vulnerable to hacking.
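The underlying problem can be shown with a toy cipher. The sketch below is a deliberately simplified one-time-pad-style XOR scheme in Python (illustrative only, not a real cryptosystem); the names `xor_cipher` and `escrowed_copy` are invented for this example. The math makes no distinction about who holds a key: an escrowed “golden key” decrypts exactly as well for an attacker who steals it as for the authority it was meant for.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    # Applying the same operation twice restores the plaintext.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # the user's secret key
ciphertext = xor_cipher(message, key)

# The legitimate user decrypts with the key:
assert xor_cipher(ciphertext, key) == message

# A "golden key" scheme means a copy of the key exists somewhere else.
# Whoever obtains that copy -- agency, insider or attacker -- decrypts
# just as easily; the math cannot tell good intentions from bad ones.
escrowed_copy = key
assert xor_cipher(ciphertext, escrowed_copy) == message
```

In other words, a backdoor is not a special channel reserved for authorized parties; it is simply another copy of the secret, and every additional copy is another thing that can be stolen.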
Competing Justifications and Tech Realities
Governments around the world continue to wrestle with the proliferation of strong encryption in messaging tools, social media and virtual private networks.
For example, rather than embrace a technical golden key, a recent proposal in France would have given the government the ability to add a hidden “ghost” participant to any encrypted chat for surveillance purposes. However, legislators removed this from the final proposal after civil liberties and cybersecurity experts warned that such an approach would undermine basic cybersecurity practices and trust in secure systems.
In 2025, the U.K. government secretly ordered Apple to add a backdoor to its encryption services worldwide. Rather than comply, Apple removed the ability for its iPhone and iCloud customers in the U.K. to use its Advanced Data Protection encryption features. In this case, Apple chose to defend its users’ security in the face of government mandates, which ironically now means that users in the U.K. may be less secure.
In the United States, provisions removed from the 2020 EARN IT bill would have forced companies to scan online messages and photos to guard against child exploitation by creating a golden-key-type hidden backdoor. Opponents viewed this as a stealth way of bypassing end-to-end encryption. The bill did not advance to a full vote when it was last reintroduced in the 2023-2024 legislative session.
Scanning for child sexual abuse material is an especially contentious issue when encryption is involved: Apple received significant public backlash over its plans to scan user devices for such material, which critics argued violated Apple’s own privacy stance, yet victims of child abuse have sued the company for not better protecting children.
Even privacy-centric Switzerland and the European Union are exploring ways of dealing with digital surveillance and privacy in an encrypted world.
The Laws of Math and Physics, Not Politics
Governments usually claim that weakening encryption is necessary to fight crime and protect the nation, and there is a valid concern there. When that argument fails to win the day, however, they often shift to arguing that backdoors are needed to protect children from exploitation.
From a cybersecurity perspective, it is nearly impossible to create a backdoor to a communications product that is only accessible for certain purposes or under certain conditions. If a passageway exists, it’s only a matter of time before it is exploited for nefarious purposes. In other words, creating what is essentially a software vulnerability to help the good guys will inevitably end up helping the bad guys, too.
Often overlooked in this debate is that if encryption is weakened to improve surveillance for governmental purposes, it will drive criminals and terrorists further underground. Using different or homegrown technologies, they will still be able to exchange information in ways that governments can’t readily access. But everyone else’s digital security will be needlessly diminished.
This lack of online privacy and security is especially dangerous for journalists, activists, domestic violence survivors and other at-risk communities around the world.
Encryption obeys the laws of math and physics, not politics. Once invented, it can’t be un-invented, even if it frustrates governments. Along those lines, if governments are struggling with strong encryption now, how will they contend with a world in which everyone uses significantly more complex techniques like quantum cryptography?
Governments remain in an unenviable position regarding strong encryption. Ironically, one of the countermeasures the U.S. government recommended in response to China’s hacking of global telephone systems in the Salt Typhoon attacks was to use strong encryption in messaging apps such as Signal or iMessage.
Reconciling that advice with their ongoing quest to weaken or restrict strong encryption for surveillance purposes will be difficult.
Richard Forno is Principal Lecturer, CSEE & Assistant Director, UMBC Cybersecurity Institute, University of Maryland, Baltimore County. This article is published courtesy of The Conversation.