Meta's 'Addictive' Design Harms Young Users, Lawsuits Claim

Meta this week was hit with several lawsuits claiming that it designed Facebook and Instagram in a way that posed a risk to the health of young users.

The cases, filed in nine states, all essentially claim that Meta designed Facebook and Instagram to be addictive, and that the services served potentially harmful content to teens and children.

“Meta knows that its product is contributing to teen depression, anxiety, even suicide and self-harm,” 19-year-old Alexis Spence and her family allege in a lawsuit filed in U.S. District Court for the Northern District of California by attorneys with the Seattle-based Social Media Victims Law Center.

“Why doesn’t it change these harmful product features and stop utilizing algorithms in connection, at least, with teen accounts? Because Meta’s priority is growth and competition concerns,” Spence's complaint continues.

Other, similar lawsuits were filed by a different law firm, Beasley Allen, in federal courts in eight other states: Colorado, Delaware, Florida, Georgia, Illinois, Missouri, Tennessee, and Texas.

All of the cases draw on former Facebook employee and whistleblower Frances Haugen's accusations that the company chose profits over safety by designing its services in ways that can harm users -- such as by promoting material associated with eating disorders to teen girls.

These new cases come on top of several lawsuits accusing Snapchat, Meta and other social media companies of contributing to suicides of young users.

Spence alleges in her complaint that she began using Instagram when she was 11, and that her social media use “coincided with a steady, but severe, decline in her mental health.”

Around four years ago, she was hospitalized for an eating disorder, according to the complaint.

“Alexis’s addiction and resulting mental health disorders were the proximate result of the unreasonably dangerous Instagram product Meta made accessible to her and that she used,” the complaint alleges.

Plaintiffs in the other cases also allege that they were injured by allegedly addictive design features of Facebook or Instagram.

It's not clear whether these cases will get very far in court.

Meta will likely argue that Section 230 of the Communications Decency Act protects web companies from liability for content created by users, as well as for decisions about how to recommend content to users.

In 2019, a federal appellate court in New York sided with Facebook in a lawsuit brought by victims of terrorist attacks, who argued that Facebook's recommendation algorithms helped introduce terrorists to each other.

The 2nd Circuit Court of Appeals ruled in that case that Section 230 immunized Facebook from liability for content created by users, as well as for its use of algorithms to recommend that content to other users.

Since then, a different federal appellate court, the 9th Circuit, ruled that Section 230 didn't protect Snapchat from liability in a lawsuit over its speed filter overlay, which lets users post photos showing how fast they're driving.

The appellate judges said Section 230 didn't apply in that case because the allegations didn't center on third-party content.

Instead, according to the 9th Circuit judges, the lawsuit's main claim “faults Snap solely for Snapchat’s architecture, contending that the app’s Speed Filter and reward system worked together to encourage users to drive at dangerous speeds.”

Santa Clara University law professor Eric Goldman, an expert in Section 230, says that law could protect Meta even after the ruling involving Snapchat.

“The key part of that ruling was that speed filter would create liability, even if it was never used to publish content,” Goldman says of the decision involving Snapchat.

The lawsuits against Meta, by contrast, are more closely tied to allegedly harmful content posted by third parties.

Goldman adds that even if a judge allows the claims against Meta to proceed, the company has defenses -- including that the First Amendment protects the company's decisions about what material to publish and how to promote that content.

“I don't think in the end the plaintiffs will win these claims,” he says.

Earlier this year, a trial judge threw out a lawsuit brought against Netflix by family members of a teen who committed suicide after watching the show 13 Reasons Why, which originally included a now-removed suicide scene.

The complaint in that case alleged that Netflix didn't warn “that viewing the show could itself cause suicide,” and that Netflix used “its trove of individualized data about its users to specifically target vulnerable children and manipulate them into watching content that was deeply harmful to them.”

U.S. District Court Judge Yvonne Gonzalez Rogers tossed the case for several reasons, including that California's anti-SLAPP (strategic lawsuits against public participation) law provides for the fast dismissal of complaints that target the exercise of free speech about matters of public policy -- including youth suicide and depression.

The family has appealed that ruling to the 9th Circuit.