
Meta’s Instagram, which not long ago came under fire for its unhealthy effects on large numbers of teens, has now been found to have enabled and even promoted a “vast” network of pedophiles who sell child sexual abuse content and promote criminal sexual abuse.
A joint investigation by The Wall Street Journal and academics at Stanford University and the University of Massachusetts Amherst found that
Instagram’s recommendation algorithms enabled people to search “explicit hashtags such as #pedowhore and #preteensex” and then connected them to accounts that use the terms to
advertise child-sex materials for sale.
Test accounts set up by the researchers that viewed a single such account “were immediately hit with ‘suggested for you’ recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites,” the Journal reports. “Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children.”
“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to
people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” the Journal reported. “Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel
at linking those who share niche interests.”
The research team identified 405 sellers of what it described as “self-generated” sex material (accounts supposedly run by children themselves) that used hashtags associated with underage sex; 112 of those accounts had a combined 22,000 unique followers.
Technical and legal hurdles make the full scale of the pedophile network on Instagram “hard for anyone outside Meta to measure precisely,” according to the report.
Instagram has failed to strike an effective balance between its recommendation systems and the safety features meant to find and remove abusive content, David Thiel, chief technologist at the Stanford Internet Observatory, told the Journal. “You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t,” he said.
The research also found that Meta failed to act on a
number of reports of child sexual abuse.
In response, Meta said it has set up an internal task force “to investigate these claims and immediately address them.”
“Child exploitation is a horrific crime,” and Meta is “continuously exploring ways to actively defend against this behavior” as well as supporting law enforcement efforts to
apprehend those who engage in these crimes, the company said in a statement.
Meta has provided updated guidance to its content reviewers to more easily identify and remove “predatory
accounts,” according to the statement. Meta blamed a software error, since corrected, for having prevented the company from processing some complaints about child sexual abuse.
Meta also said that its policy-enforcement teams dismantled 27 abusive networks between 2020 and 2022 and, as of Q4 2022, had removed more than 34 million pieces of child-sex material from Instagram and Facebook, with more than 98% of it detected before being reported by users. The company said it disabled more than 490,000 accounts for violating child-safety policies in January 2023 alone.
The researchers also found child-sex-abuse and material-selling activity on non-Meta platforms, though to a more limited degree. For example, they reported finding 128 accounts offering to sell such material on Twitter, less than a third of the number found on Instagram. Such content “does not appear to proliferate” on TikTok and also does not appear to be actively promoted on the mainly direct-messaging platform Snapchat, they said.
Instagram came under scrutiny from the press and Congress in 2021, after it emerged that internal Meta research had found that using the platform was harming the mental health of significant numbers of teens.
The company instituted additional safety settings for teens and stronger parental controls,
and “paused” its plan to launch an Instagram version for children under 13.