Perspective

YouTube’s Algorithms Might Radicalize People – but the Real Problem Is We’ve No Idea How They Work

Published 21 January 2020

Does YouTube create extremists? It’s hard to argue that YouTube doesn’t play a role in radicalization, Chico Camargo writes. “In fact, maximizing watchtime is the whole point of YouTube’s algorithms, and this encourages video creators to fight for attention in any way possible.” Society must insist on algorithm auditing, even though it is a difficult and costly process. “But it’s important, because the alternative is worse. If algorithms go unchecked and unregulated, we could see a gradual creep of conspiracy theorists and extremists into our media, and our attention controlled by whoever can produce the most profitable content.”

Does YouTube create extremists? A recent study sparked debate among scientists by arguing that the algorithms powering the site don’t help radicalize people by recommending ever more extreme videos, as has been suggested in recent years.

Chico Q. Camargo writes in The Conversation that the paper claimed YouTube’s algorithm favors mainstream media channels over independent content, concluding that radicalization has more to do with the people who create harmful content than with the site’s algorithm.

Camargo writes:

Specialists in the field were quick to respond to the study, with some criticizing the paper’s methods and others arguing the algorithm was one of several important factors and that data science alone won’t give us the answer.

The problem with this discussion is that we can’t really answer the question of what role YouTube’s algorithm plays in radicalizing people because we don’t understand how it works. And this is just a symptom of a much broader problem. These algorithms play an increasing role in our daily lives but lack any kind of transparency.

It’s hard to argue that YouTube doesn’t play a role in radicalization, Camargo writes.

In fact, maximizing watchtime is the whole point of YouTube’s algorithms, and this encourages video creators to fight for attention in any way possible. The company’s sheer lack of transparency about exactly how this works makes it nearly impossible to fight radicalization on the site. After all, without transparency, it is hard to know what can be changed to improve the situation.
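To make the objective Camargo describes concrete, here is a minimal sketch of a recommender that ranks candidate videos purely by predicted watch time. Everything in it – the `Video` record, the `predicted_watch_time` scores, the `rank_by_watch_time` function – is hypothetical and for illustration only; YouTube’s actual system is proprietary and far more complex, which is precisely the article’s point.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_time: float  # hypothetical model score, in minutes

def rank_by_watch_time(candidates: list[Video]) -> list[Video]:
    """Toy ranker: order candidates purely by predicted watch time.

    A system optimized this way rewards whatever keeps viewers
    watching, regardless of the content's accuracy or extremity.
    """
    return sorted(candidates, key=lambda v: v.predicted_watch_time, reverse=True)

if __name__ == "__main__":
    feed = [
        Video("Calm explainer", 4.2),
        Video("Outrage-bait clip", 11.7),
        Video("Conspiracy deep-dive", 9.8),
    ]
    for video in rank_by_watch_time(feed):
        print(f"{video.predicted_watch_time:5.1f} min  {video.title}")
```

Under this single-objective assumption, the toy feed surfaces the outrage-bait clip first, illustrating how an attention-maximizing objective can favor provocative content without anyone designing it to do so.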

Camargo concludes:

Introducing counterfactual explanations or using algorithm auditing is a difficult, costly process. But it’s important, because the alternative is worse. If algorithms go unchecked and unregulated, we could see a gradual creep of conspiracy theorists and extremists into our media, and our attention controlled by whoever can produce the most profitable content.
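The kind of audit Camargo calls for can, in principle, be approximated from the outside. Below is a minimal sketch of a black-box audit under stated assumptions: it repeatedly follows a recommender’s suggestions and measures whether the trajectory drifts toward more extreme content. The catalog, the `extremity` scores, and the `recommend` function (a stand-in for any opaque system) are all invented for illustration; a real audit would need access to, or careful sampling of, the live platform.

```python
import random

# Hypothetical catalog: each video gets an "extremity" score in [0, 1].
CATALOG = {f"video_{i}": i / 99 for i in range(100)}

def recommend(current: str) -> str:
    """Stand-in for an opaque recommender, assumed (for illustration)
    to be weighted toward content at least as extreme as the current video."""
    base = CATALOG[current]
    candidates = [v for v, score in CATALOG.items() if score >= base - 0.05]
    weights = [1 + CATALOG[v] for v in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

def audit(start: str, steps: int = 20, trials: int = 200) -> float:
    """Mean change in extremity after following `steps` recommendations,
    averaged over `trials` simulated viewing sessions."""
    drift = 0.0
    for _ in range(trials):
        video = start
        for _ in range(steps):
            video = recommend(video)
        drift += CATALOG[video] - CATALOG[start]
    return drift / trials

if __name__ == "__main__":
    print(f"Mean extremity drift from a mild starting point: {audit('video_10'):+.3f}")
```

A positive drift in this toy setup would indicate the simulated recommender pulls viewers toward more extreme content over time – the kind of measurable, repeatable evidence an audit could produce, if platforms were transparent enough to be tested this way.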