A recent report in The Wall Street Journal found Instagram recommended Reels containing “risque footage of children as well as overtly sexual adult videos” to test accounts strictly following young influencers such as preteen and teen gymnasts and cheerleaders, while serving ads for major brands alongside the short-form videos.
In testing Meta’s algorithm, the Journal set up a number of accounts simulating those of younger Instagram users.
Not only were these test accounts served sexually explicit content, but the child users they followed were themselves followed by accounts owned by adult men.
The report also found that Instagram Reels displayed ads for big brands like Walmart, Disney, Pizza Hut, Bumble, Match Group and the Journal itself alongside the same sexualized content being served to underage users via the platform’s algorithms.
The Canadian Centre for Child Protection achieved similar results with its own tests.
As of Tuesday, Bumble, Match Group, Hims and Disney had either suspended advertising on Instagram or pressed the company to address the issue.
“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” said Meta’s Samantha Stetson in a statement. “We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low.”
“Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” Stetson continued, adding that the Journal’s test results are “based on a manufactured experience that does not represent what billions of people around the world see.”
The tech giant also told its clients that it “would pay for brand-safety auditing services to determine how often a company’s ads appear beside content it considers unacceptable” but did not provide any further details.