
Content moderation: The YouTube rabbit hole gets nuanced


CHENNAI: Perhaps you have an image in your mind of people who get brainwashed by YouTube. You might picture your cousin who loves to watch videos of cuddly animals. Then out of the blue, YouTube’s algorithm plops a terrorist recruitment video at the top of the app and continues to suggest ever more extreme videos until he’s persuaded to take up arms.

A new analysis adds nuance to our understanding of YouTube’s role in spreading beliefs that are far outside the mainstream. A group of academics found that YouTube rarely suggests videos that might feature conspiracy theories, extreme bigotry or quack science to people who have shown little interest in such material. And those people are unlikely to follow such computerised recommendations when they are offered. The kittens-to-terrorist pipeline is extremely uncommon.

That doesn’t mean YouTube is not a force in radicalisation. The paper also found that research volunteers who already held bigoted views or followed YouTube channels that frequently feature fringe beliefs were far more likely to seek out or be recommended more videos along the same lines.

The findings suggest that policymakers, internet executives and the public should focus less on the potential risk of an unwitting person being led into extremist ideology on YouTube, and more on the ways that YouTube may help validate and harden the views of people already inclined to such beliefs.

“We’ve understated the way that social media facilitates demand meeting supply of extreme viewpoints,” said Brendan Nyhan, one of the paper’s co-authors and a Dartmouth College professor who studies misperceptions about politics and health care. “Even a few people with extreme views can create grave harm in the world.”

People watch more than one billion hours of YouTube videos daily. There are perennial concerns that the Google-owned site may amplify extremist voices, silence legitimate expression or both, similar to the worries that surround Facebook.

This is just one piece of research. But what’s intriguing is that it challenges the binary notion that either YouTube’s algorithm risks turning any of us into monsters or that kooky things on the internet do little harm. Neither may be true. Digging into the details, about 0.6 percent of research participants were responsible for about 80 percent of the total watch time for YouTube channels that were classified as “extremist,” such as those of the far-right figures David Duke and Mike Cernovich. (YouTube banned Duke’s channel in 2020.)

Most of those people found the videos not by accident but by following web links, clicking on videos from YouTube channels that they subscribed to, or following YouTube’s recommendations. About one in four videos that YouTube recommended to people watching an extreme YouTube channel was another video like it. Only 108 times during the research — about 0.02 percent of all video visits the researchers observed — did someone watching a relatively conventional YouTube channel follow a computerised suggestion to an outside-the-mainstream channel that they were not already subscribed to.

Should YouTube make it more difficult, for example, for people to link to fringe videos — something it has considered? Should the site make it harder for people who subscribe to extremist channels to automatically see those videos or be recommended similar ones? Or is the status quo fine? This research reminds us to continually wrestle with the complicated ways that social media can both be a mirror of the nastiness in our world and reinforce it, and to resist easy explanations. There are none.

Shira Ovide is a tech writer with The New York Times. ©2022 The New York Times

