As Tools for Hybrid Threats, Apps Like Telegram Must Be Accountable

Democracies must take proactive steps to mitigate these risks. First, they should enforce stringent regulations that require transparency in how messaging platforms operate and how they manage user data. This includes clear guidelines on privacy, content moderation, data storage, and co-operation with law enforcement, ensuring that platforms cannot be easily exploited by malicious actors. 

Second, governments and civil society should promote the use of alternative messaging platforms that prioritize transparency and accountability. Platforms like Signal, which offer end-to-end encryption and operate with a commitment to user privacy without the financial entanglements seen in Telegram, can serve as safer alternatives. For government officials and those working in sensitive sectors in particular, using encrypted messaging tools based in their own country or another trusted nation can reduce these risks.

Finally, enhancing media literacy and public awareness about the risks of disinformation is crucial. Educating users on how to identify and counteract disinformation campaigns can help build resilience against these types of hybrid threats. This approach should be coupled with efforts to develop and promote technologies that can detect and mitigate the spread of false information on digital platforms. 

As messaging platforms become increasingly central to both communication and conflict, the lessons from Telegram’s rise and its connections to Russian interests underscore the importance of transparency, regulation, and the promotion of secure, accountable platforms.  

More stringent regulations for digital platforms can be implemented to ensure their transparency in content regulation policy, ownership, legal regimes and data storage. The world can learn from, among others, Germany’s Network Enforcement Act (NetzDG), which requires social media platforms to remove illegal content promptly or face significant fines. 

The European Union’s Digital Services Act (DSA) compels large online platforms to assess and mitigate risks related to the dissemination of illegal content, disinformation, and other harmful activities. In the United States, by contrast, Section 230 of the Communications Decency Act shields platforms from liability for content their users post while allowing good-faith moderation — a protection now widely debated as calls for platform accountability grow. In Australia, regulation includes the Online Safety Act 2021 and the Digital Platform Regulators Forum.

While the global community has arguably been too slow to hold messaging and social media platforms accountable, these steps towards greater government action reflect a positive shift towards better oversight. The Telegram episode serves as a reminder that democracies must guard against multifarious hybrid threats and implement measures tailored to their unique security concerns.

The Strategist is running a short series of articles in the lead up to ASPI’s Sydney Dialogue on September 2 and 3. The event will cover key topics in critical, emerging and cyber technologies, including hybrid threats, disinformation, electoral interference, artificial intelligence, clean technologies and more.

Fitriani is a senior analyst at ASPI. This article is published courtesy of the Australian Strategic Policy Institute (ASPI).