Algorithmic exploitation: traumatic videos aimed at kids are being hidden on YouTube

By Adam Tinworth

08/11/2017 | Do you spend much time looking at YouTube channels targeted at kids? If you're not a parent, probably not. Even if you are a parent, you might not pay as much attention as you'd like to. After all, 10 minutes of YouTube on a tablet is a great way to keep little ones distracted while you cook dinner.

You should probably be paying more attention. I know I should - based on an article by James Bridle, I've just deleted the YouTube Kids app from my daughter's iPad.

Why? YouTube is full of content that is exploitative at best and traumatic at worst, designed to game the algorithm and generate money through views - much of it automated.

James, who spoke at NEXT back in 2012, and whose autonomous car art project we've featured here before, started delving deeply into the phenomenon of "word salad" video titles - where YouTube videos seem to have little but a collection of seemingly disconnected phrases as their title - and what he found was far from pleasant:

Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.

What. The. Hell.

Exploiting the algorithm

How did we get here? Once a trend gets popular, low-quality, high-volume video producers start cashing in on it:

A second way of increasing hits on videos is through keyword/hashtag association, which is a whole dark art unto itself. When some trend, such as Surprise Egg videos, reaches critical mass, content producers pile onto it, creating thousands and thousands more of these videos in every possible iteration. This is the origin of all the weird names in the list above: branded content and nursery rhyme titles and “surprise egg” all stuffed into the same word salad to capture search results, sidebar placement, and “up next” autoplay rankings.

It really is an astonishingly dark picture: algorithms generate video concepts designed to perform well in YouTube's algorithmic environment - and then human performers actually make the videos:

This is content production in the age of algorithmic discovery — even if you’re a human, you have to end up impersonating the machine.

But then, those of a malicious bent start exploiting it to make videos that are deeply disturbing - like Peppa Pig torture videos. That's the sort of thing that can deeply traumatise a five-year-old left alone with an iPad.

Now, of course, there's a parental responsibility here - but the situation betrays a deeper, darker truth of an age where algorithmic curation has all but supplanted human curation. Bridle again:

Exploitation is encoded into the systems we are building, making it harder to see, harder to think and explain, harder to counter and defend against. Not in a future of AI overlords and robots in the factories, but right here, now, on your screen, in your living room and in your pocket.

Digital sucks. Make it better.