Advertisers including Walt Disney, Nestle, McDonald’s and Fortnite-maker Epic Games have all reportedly pulled their ads from YouTube following a video posted on Sunday by blogger Matt Watson. The video details how the comments function was used to identify videos in which young children appear in situations that could be read as sexually suggestive, such as posing in front of a mirror or doing gymnastics.
The video, which at the time of writing has amassed two million views, went on to demonstrate how YouTube’s much-discussed recommendation algorithm could serve up more and more of this type of content. Though the problem of child exploitation on the platform had been discussed for some time, Watson was able to show how a user could “find this wormhole, from a fresh, never-before-used YouTube account, via innocuous videos within about five clicks”.
“Paedophiles are trading social media contacts; they’re trading links to actual child porn in YouTube comments; they’re trading unlisted videos in secret, and YouTube’s algorithm through some glitch in its programming is facilitating their ability to do this,” Watson said.
YouTube’s particular challenge here is quite different from the 2017 episode in which many brands pulled their ad spend following revelations that major companies’ ads were appearing next to extremist or racist content.
The flagged videos themselves are not necessarily problematic. It is instead a problem of the comments section, where users are sharing off-site links, time-stamps of particular moments in the videos, or sexual messages.
A YouTube spokesperson responded to the allegations: “Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
Ad spend on the flagged videos came to less than $8,000 over the past 60 days, the spokesperson said.
“We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors,” the spokesperson added.
Sourced from Bloomberg, The Guardian, Matt Watson, WARC