American TikTok Deal Doesn't Address Platform's Potential For Manipulation, Only Who Profits

On Sept. 25, the Donald Trump administration in the United States again extended the deadline of the TikTok ban-or-divest law, possibly for the last time. The latest extension to the law, which was passed in 2024 under the Joe Biden administration, includes a deal to transfer TikTok to American owners as a condition for avoiding a ban.

Author

  • Andrew Buzzell

    Postdoctoral Fellow, Rotman Institute of Philosophy, Western University

This raises the question of whether the warnings about the app as a tool of Chinese influence were valid, and whether American ownership will help.

Canada should be watching closely, because anxieties about foreign manipulation and social media exist north of the border, too. These range from bans on TikTok and concerns about Beijing-linked surveillance to efforts like Bill C-18 aimed at safeguarding domestic news sources.

What happens in the Canadian information environment has always been shaped by the U.S., a dependence that is even more precarious now that American politics has turned hostile to Canada.

TikTok concerns

TikTok is not the only digital media platform susceptible to worries about hostile influence. All major platforms introduce the same vulnerabilities. If the policy objective is to enhance the security of democracy, then a focus on TikTok is too narrow and divestment as a solution accomplishes little (especially because it appears China will retain control of the algorithm).

Worries about TikTok come down to two big fears. The first is that it functions as a spying machine, feeding data to the Chinese government. The spying concern isn't just about espionage against sensitive infrastructure and activities; it's also personal: the software itself might be unsafe and could be used to track individuals.

As a result, many countries have banned the app on government devices, and securing data along national borders may well address this.

The second fear, more vivid in the public and political imagination, is that TikTok functions as an influence machine. Its algorithm can be tweaked to push propaganda, sway opinion, censor views or even meddle in elections.

Such worries reached a fever pitch in America in 2023, when Osama bin Laden's "Letter to America" suddenly went viral on TikTok. Lawmakers seized on this as evidence that TikTok could amplify extremist content, reinforcing fears that the platform can be weaponized.

These worries aren't merely speculative. Investigations have shown that topics sensitive to China, such as Tiananmen Square and Tibet, are harder to find or conspicuously absent on TikTok compared to other platforms.

Social media is also used as a tool for influence by hostile groups, corporations and governments, and concerns about ownership are often a proxy for deeper anxieties about the platforms themselves.

As users, we know little about how our feeds work, what's shaping them, what they might look like if they were built differently and how they are affecting us.

There is a rational basis for being mistrustful, and this cuts both ways. It's not just the fear that we could be manipulated without realizing it; it's also the temptation to see our opponents as manipulated, too, as if every disagreement might be the product of someone rigging the system.

Users know little about how TikTok feeds work, what's shaping them or what they might look like if they were built differently. (Solen Feyissa/Unsplash), CC BY

Manipulated anxieties

Fear of TikTok as an influence machine continues to play a substantial role in politics: "Washington has said that TikTok's ownership by ByteDance makes it beholden to the Chinese government."

U.S. Vice President JD Vance remarked that the executive order would "ensure that the algorithm is not being used as a propaganda tool by a foreign government … the American businesspeople … will make the determination about what's actually happening with TikTok."

Meanwhile, Trump ostensibly joked that he'd make TikTok "100 per cent MAGA" before adding "everyone's going to be treated fairly." And Israeli Prime Minister Benjamin Netanyahu told an audience of content creators that "weapons change over time … the most important one is social media," stressing the importance of divesting TikTok to U.S. owners.

One implication of these comments is that divestment doesn't change the threat of manipulation; it just changes who's doing the manipulating. Divestment is framed as resisting foreign propaganda, while at the same time domestic manipulation is legitimized as politics as usual.

Collective dependence

This is a squandered opportunity for the U.S. By treating TikTok as a weapon to be seized, leaders have passed up the chance to model a more enduring form of soft power: building open, transparent, trustworthy information systems that others would want to emulate. Instead, what is gained is a temporary and possibly illusory sharp power advantage, at the expense of an enduring source of legitimacy.

The bigger problem is that social media, once normalized as a weapon, is, to borrow a fear familiar to Trump, riggable. We know that social media can be manipulated, and yet we rely on it more and more as a source of news. And even if we ourselves don't, we are influenced indirectly by those who do.

This collective dependence makes the platforms more powerful and their vulnerabilities more dangerous.

Social media platforms have become a primary source of information. (Shawn/Unsplash), CC BY

Protecting the public sphere

Canada has already had its own TikTok moment: the Online News Act (C-18), which required platforms to pay news outlets for sharing their content. This was intended to strengthen Canadian journalism, but in response, Meta banned news on its platforms (Facebook, Instagram) in Canada in August 2023, leading to an 85 per cent drop in engagement. Instead of strengthening Canadian journalism, Bill C-18 risks making it more fragile.

If we're serious about protecting the public sphere from manipulation, what matters is the outsized power the platforms have, and the extent to which that power can be bought, sold or stolen. This power includes the surveillance power to know what we will like, the algorithmic power to curate our information diet, and control over platform incentives, rules and features that affect who gains influence.

Bargaining with this power, as Canada tried with Bill C-18 and as the U.S. is now doing with China and TikTok, only concedes to it. If we want to protect democratic information systems, we need to focus on reducing the vulnerabilities in our relationship with media platforms and on supporting domestic journalism that can compete for influence.

The biggest challenge is to make platforms less riggable, and thus less weaponizable, if only for the reason that motivated the TikTok ban: we don't want our adversaries, foreign or domestic, to have power over us.

The Conversation

Andrew Buzzell does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
