TikTok and WeChat: Curating and Controlling Global Information Flows

Both Tencent and ByteDance, the companies that own and operate WeChat and TikTok, respectively, are subject to China’s security, intelligence, counter-espionage and cybersecurity laws. Internal Chinese Communist Party (CCP) committees at both companies are in place to ensure that the party’s political goals are pursued alongside the companies’ commercial goals. ByteDance CEO Zhang Yiming has stated on the record that he will ensure his products serve to promote the CCP’s propaganda agenda.6

While most major international social media platforms have traditionally taken a cautious and public approach to content moderation, TikTok is the first globally popular social media network to take a heavy-handed and covert approach. Possessing and deploying the capability to covertly control information flows across geographical regions, topics and languages positions TikTok as a powerful political actor with a global reach.

What’s the solution?
The global expansion of Chinese social media networks continues to pose unique challenges to policymakers around the world. Thus far, governments have tended to hold most major international social media networks and Chinese social media networks to different standards. It’s imperative that states move to a policy position in which all social media and internet companies are held to the same set of standards, regardless of their country of origin or ownership.

This report recommends (see the recommendations below) that governments implement transparent user-data privacy and user-data protection frameworks that apply to all social media networks. Companies that refuse to comply with such frameworks shouldn’t be allowed to operate. Independent audits of social media algorithms should be conducted, and social media companies should be transparent about the guidelines that human moderators use and what impact their decisions have on their algorithms. Governments should require that all social media platforms investigate and disclose information operations being conducted on their platforms by state and non-state actors. Disclosures should include publicly releasing datasets linked to those information campaigns.

Finally, all of these recommended actions would benefit from multilateral collaboration that includes participation from governments, the private sector and civil society actors. For example, independent audits of algorithms could be shared by multiple governments that are seeking the same outcomes of accountability and transparency; governments, social media companies and research institutes could share data on information operations; all stakeholders could share lessons learned on data frameworks.

Conclusion
The Chinese state has demonstrated a propensity for controlling and shaping the information environment of the Chinese diaspora, including via WeChat. The meteoric growth of TikTok has now put the CCP in a position from which it can shape the information environment on a largely non-Chinese-speaking platform, with the help of the world’s highest-valued start-up and its advanced, opaque, AI-powered algorithm.

Chinese party-state leverage over these companies is considerable, is exercised internally via CCP committees and is enforced by a suite of cybersecurity and intelligence laws.171 As Chinese companies, Tencent and ByteDance are not only required to participate in intelligence work, but they’re also legally mandated to promote CCP propaganda.

The work of China’s censorship and propaganda apparatus is pushed down to media and technology companies such as Tencent and ByteDance.172 As Chinese companies, they’re obligated to comply with strict government regulations on what content is allowed to be published on their platforms, and both invest heavily in automated content-filtering systems and human curation.

The demands that the PRC’s surveillance and propaganda apparatus places on these technology companies are such that, at least in the case of WeChat, Tencent is prepared to surveil the foreign users of its app in order to better train the censorship algorithms used on Chinese citizens within the PRC.

The censorship and surveillance detailed in this report most probably represent only a fraction of the total activity taking place on these social media platforms. Even as the apps compete on user growth, ad sales and investment, they also pose a challenge to liberal democratic ideals such as freedom of political expression and free speech.

As the underlying technology used in these apps continues to advance, the ability of these companies to monitor dissent and shape narratives globally will grow exponentially.

….

Recommendations
1. To the extent that the censorship practices outlined in this report represent breaches of current law in liberal democracies around the world, governments should launch legal investigations.

2. In their efforts to train AI algorithms that help to curate, filter and moderate content and enable targeted advertising, companies have let users’ data privacy fall by the wayside. Governments should introduce transparent user-data privacy and user-data protection frameworks that apply to all social media and internet companies, regardless of their country of origin and ownership.173 If companies refuse to comply with such frameworks, they shouldn’t be granted licences to operate.

3. Governments should mandate that all social media platforms publicly disclose, in detail, all the content they censor and make it an offence to censor content where that censorship hasn’t been publicly disclosed to users.

4. Independent audits of the algorithms of all social media companies should be conducted. Those assessments should include transparency about the guidelines that human moderators use and about the impact their decisions have on the algorithms.

5. Governments should require that all social media platforms investigate and disclose information operations (also known as ‘coordinated inauthentic behaviour’) being conducted on their platforms by state and non-state actors. Disclosures should include publicly releasing datasets linked to those information campaigns.

6. Finally, all of the above recommended actions would benefit from multilateral collaboration that includes participation from governments, the private sector and civil society actors. For example, independent audits of algorithms could be shared by multiple governments that are seeking the same outcomes of accountability and transparency; governments, social media companies and research institutes could share data on information operations; all stakeholders could share lessons learned on data frameworks.

1 Lai Lin Thomala, ‘Number of active WeChat messenger accounts Q2 2011-Q2 2020’, Statista, 20 August 2020, online.

2 Ronald Deibert, ‘WeChat users outside China face surveillance while training censorship algorithms’, Washington Post, 8 May 2020, online.

3 Paul Mozur, ‘Forget TikTok. China’s powerhouse app is WeChat, and its power is sweeping’, New York Times, 4 September 2020, online.

4 Alex Sherman, ‘TikTok reveals detailed user numbers for the first time’, CNBC, 24 August 2020, online.

5 Alex Hern, ‘Revealed: how TikTok censors videos that do not please Beijing’, The Guardian, 20 August 2019, online.

6 David Bandurski, ‘Tech shame in the new era’, China Media Project, 11 April 2018, online.

….

171 See Appendix 1 for Tencent and ByteDance CCP connections.

172 Lotus Ruan, ‘Internet censorship: how China does it’, The Strategist, 9 October 2017, online.

Fergus Ryan is an analyst at the Australian Strategic Policy Institute (ASPI). Audrey Fritz is a researcher at ASPI. Daria Impiombato is a research intern at ASPI.