Meta Platforms, TikTok, YouTube and Snapchat parent Snap have all known, but intentionally disregarded, the addictive and destructive effects their social platforms have on teens and children, alleges a lawsuit filed in federal court in Oakland, California.
Last October, a judicial panel consolidated more than two dozen complaints filed across the country on behalf of teens and young adults into a single class action, but an unredacted version of the consolidated complaint was only made public this past weekend.
Among the many accusations is that the companies intentionally use algorithms to drive addictive behavior, leading to anxiety, depression, sleeplessness, eating disorders and, in some cases, suicide.
“These never-before-seen documents show that social media companies treat the crisis in youth mental health as a public relations issue rather than an urgent societal problem brought on by their products,” the three plaintiffs’ lawyers leading the lawsuit, Lexi Hazam, Previn Warren and Chris Seeger, said in a statement. “This includes burying internal research documenting these harms, blocking safety measures because they decrease ‘engagement,’ and defunding teams focused on protecting youth mental health.”
Specifically, the suits claim that the companies encourage addictive behavior and bypass parental controls, fail to verify users’ ages, inadequately safeguard against harmful content and intentionally amplify that content.
One charge is that Meta defunded its team dedicated to addressing the mental health implications of its Facebook and Instagram platforms despite internal warnings to CEO Mark Zuckerberg. The suit quotes a message sent directly to Zuckerberg stating that Meta was “not on track to succeed for our core well-being topics (problematic use, bullying & harassment, connections, and SSI), and are at increased regulatory risk and external criticism. These affect everyone, especially Youth and Creators; if not addressed, these will follow us into the Metaverse.” The consolidated complaint also cites Meta's own internal studies showing its platforms' risks to young people, which were made public by a whistleblower.
One Meta employee wrote, “No one wakes up thinking they want to maximize the number of times they open Instagram that day, but that’s exactly what our product teams are trying to do,” according to the filing.
A Meta spokesperson told Bloomberg that the company has actually increased its funding for efforts to safeguard users’ well-being, “shown by the over 30 tools we offer to support teens and families. Today, there are hundreds of employees working across the company to build features to this effect.”
The social platforms have all made similar claims over the years in response to criticisms and lawsuits.
They have also, to date, been able to rely on Section 230 of the Communications Decency Act to shield them from liability for material posted by third parties. But the possible limits of that law are now under review by the Supreme Court in a case brought against Google.
This lawsuit is not the only one making similar complaints against the social platforms. For example, in January, the Seattle school system filed a lawsuit against Meta, Google/YouTube parent Alphabet, Snap Inc. and ByteDance.