
New tests conducted by The Wall Street
Journal and Northeastern University professor Laura Edelson find that Instagram is recommending sexual content in the form of Reels to 13-year-old users who appear interested in “racy
content.”
Over the course of seven months in 2024, the publication ran tests by creating new Instagram accounts registered as 13-year-olds and found that they received consistent recommendations for sexual videos within minutes of first logging in — evidence that the app has continued to push adult content onto minors even after Meta said in January that it had begun restricting "sensitive content" from teens' in-app experience.
The short-form videos the test accounts were seeing included women dancing sensually, pantomiming sex acts, or flashing their
genitalia, while others promised nudes to commenters and information about anal sex, the report says.
After the test accounts watched these videos and skipped past Reels without sexual material, they began receiving recommendations for explicit videos within minutes.
According to the report, within 20 minutes of the accounts watching sexually explicit videos, their Reels recommendations feed was filled with creators showcasing sexual content.
Tests run on TikTok and Snapchat did not produce the same results for The Journal or Edelson. Neither platform recommended sexual content to the teen accounts, even after the accounts actively searched for such material and followed creators known to produce and post it.
Based on undisclosed documents The Journal obtained from Meta employees, similar issues have occurred in the past, with young teenagers on Instagram being shown harmful material. Meta's own safety staff ran comparable tests and reached similar conclusions, but the company disputed the validity of those findings.
“This was an artificial experiment that does not match the reality of how teens use Instagram,”
company spokesperson Andy Stone told the publication, adding that Meta has “established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have
meaningfully reduced these numbers in the past months.”
In January, Meta implemented a host of new privacy updates aimed at better protecting teen Facebook and Instagram users, including more restrictive content control settings, expanded content restrictions related to self-harm, and prompts linked to privacy settings. Notably, The Journal's tests ran after these updates rolled out.
Over the past few months, Instagram has also introduced new tools to help protect younger users from falling victim to various forms of
intimate image abuse, while also trying to boost awareness surrounding potential sextortion scams across the platform.
These include a nudity protection feature in DMs, automated safety tips, and measures to shield teens from accounts potentially engaged in sextortion.
In response to these updates, however, concerned parents and advocates called Meta's announcement a PR stunt and pointed to the company's repeated failure to protect children from predators and sexual exploitation on its apps.