The American TikTok Deal Doesn’t Address the Platform’s Potential for Manipulation, Only Who Profits
Social media is also used as a tool for influence by hostile groups, corporations and governments, and concerns about ownership are often a proxy for deeper anxieties about the platforms themselves.
As users, we know little about how our feeds work, what’s shaping them, what they might look like if they were built differently and how they are affecting us.
There is a rational basis to be mistrustful, and this cuts both ways. It’s not just the fear that we could be manipulated without realizing it; it’s also the temptation to see our opponents as manipulated, as if every disagreement might be the product of someone rigging the system.
Manipulated Anxieties
Fear of TikTok as an influence machine continues to play a substantial role in politics, as “Washington has said that TikTok’s ownership by ByteDance makes it beholden to the Chinese government.”
U.S. Vice President JD Vance remarked that the executive order would “ensure that the algorithm is not being used as a propaganda tool by a foreign government … the American businesspeople … will make the determination about what’s actually happening with TikTok.”
Meanwhile, Trump, ostensibly joking, said he’d make TikTok “100 per cent MAGA” before adding “everyone’s going to be treated fairly.” And Israeli Prime Minister Benjamin Netanyahu told an audience of content creators that “weapons change over time … the most important one is social media,” stressing the importance of divesting TikTok to U.S. owners.
One implication of these comments is that divestment doesn’t change the threat of manipulation — it just changes who’s doing the manipulating. Divestment is framed as resisting foreign propaganda, but at the same time domestic manipulation is legitimized as politics as usual.
Collective Dependence
This is a squandered opportunity for the U.S. By treating TikTok as a weapon to be seized, leaders have passed up the chance to model a more enduring form of soft power: building open, transparent, trustworthy information systems that others would want to emulate. Instead, what is gained is a temporary and possibly illusory sharp power advantage, at the expense of an enduring source of legitimacy.
The bigger problem with normalizing social media as a weapon is that the weapon itself is, to borrow a fear familiar to Trump, riggable. We know that social media can be manipulated, and yet we rely on it more and more as a source of news. And even if we ourselves don’t, we are influenced indirectly by those who do.
This collective dependence makes the platforms more powerful and their vulnerabilities more dangerous.
Protecting the Public Sphere
Canada has already had its own TikTok moment: the Online News Act (Bill C-18), which required platforms to pay news outlets for sharing their content. The act was intended to strengthen Canadian journalism, but in response, Meta banned news on its platforms (Facebook and Instagram) in Canada in August 2023, leading to an 85 per cent drop in engagement with Canadian news. Instead of strengthening Canadian journalism, Bill C-18 risks making it more fragile.
If we’re serious about protecting the public sphere from manipulation, what matters is the outsized power the platforms have, and the extent to which that power can be bought, sold or stolen. This power includes the surveillance power to know what we will like, the algorithmic power to curate our information diet and the power to set the platform incentives, rules and features that determine who gains influence.
Bargaining with this power, as Canada tried with Bill C-18 — and as the U.S. is now doing with China and TikTok — only concedes to it. If we want to protect democratic information systems, we need to focus on reducing the vulnerabilities in our relationship with media platforms and support domestic journalism that can compete for influence.
The biggest challenge is to make platforms less riggable, and thus less weaponizable, if only for the reason that motivated the TikTok ban: we don’t want our adversaries, foreign or domestic, to have power over us.
Andrew Buzzell is Postdoctoral Fellow, Rotman Institute of Philosophy, Western University. This article is published courtesy of The Conversation.