Twitch's 'Clips' Feature Being Used By Child Predators, Analysis Shows

On Friday morning, Bloomberg released the results of its recent analysis of 1,100 short video segments on Twitch known as “Clips,” which found that more than 80 contained sexualized content involving minors.

Based on the results, the publication believes the livestream social platform’s TikTok-like feature has become popular with child predators.

Clips allow users to capture seconds-long moments from other users’ broadcasts on the Twitch platform.

From there, users can share them in their own broadcasts or post them to other social media platforms like TikTok, in order to gain followers by showcasing their personal interests and tastes in a short-form aesthetic.

Like every other leading social-media platform, Twitch is looking to compete in the "TikTokification" of social media.
It plans to do this by expanding its clips feature, which originally launched in 2016. This year, the company plans to test a designated discovery feed for users' short-form videos while algorithmically suggesting clips that align with users' interests.

Unfortunately, the format is already being utilized by child predators, with older users capturing clips of minors and sharing them with like-minded users.

The Canadian Centre for Child Protection reviewed Bloomberg's findings and identified 34 clips showing children, primarily boys aged 5 to 12, exposing themselves to the camera at the request of live viewers.

Those 34 clips, along with 49 other sexually explicit clips of children, had been viewed over 10,000 times.

“There’s a broader victimization that occurs once the initial livestream and grooming incident has happened because of the possibility of further distribution of this material,” said the Centre's director Stephen Sauer.

In response to Bloomberg's findings, Twitch deleted the prohibited clips, saying that the company is taking the matter “extremely seriously.”

People familiar with Twitch’s safety protocols, however, informed Bloomberg that the clips feature is currently the least moderated area on the platform, leaving it up to users to report upsetting material.

And while deleting clips of prohibited content removes them from the platform, users can download clips and upload them to other parts of the internet, allowing the abuse to continue.

The Amazon-owned company is not alone in hosting child abuse material.

According to various reports, the number of photos and videos containing child sexual abuse found online is increasing every year. Data from the Internet Watch Foundation (IWF), a UK child safety nonprofit, shows that between 2020 and 2021, URLs containing child sexual abuse imagery rose 64%, with the majority of photos appearing on image-hosting websites where people can upload content to share with their followers.

But because Twitch specializes in live streaming, the majority of its content rolls out in real time, making abusive material more difficult to stop.
