Meta Platforms must face accusations that it violated Washington, D.C.'s consumer protection law by allegedly designing Facebook and Instagram in ways that would addict young users, a judge ruled this week.
The decision, issued by District of Columbia Superior Court Judge Neil Kravitz, stemmed from a lawsuit brought in October by D.C. Attorney General Brian Schwalb. He alleged that Meta developed and implemented features “that induce children’s extensive, compulsive, and harmful social media use,” and that the company misled the public by claiming its services are safe for young people.
“For years, Meta has publicly claimed that its top priority is well-being and that its platforms are safe and age-appropriate platforms for children. However, Meta has known these claims are misleading and continually chooses to maximize profits without limits over the health and safety of children,” Schwalb alleged.
Schwalb took issue with several Meta features he alleged were addictive, including algorithmic recommendations, videos that disappear after 24 hours (supposedly creating a “sense of urgency in children”), push notifications and automatic scrolling.
Meta urged Kravitz to throw out the lawsuit at an early stage of the proceedings. The company argued that its design decisions are protected by Section 230 of the Communications Decency Act, which immunizes companies from liability for material posted by users, and the First Amendment, which prohibits the government from suppressing lawful speech.
Meta specifically contended that Section 230 protects companies not only from lawsuits over user-posted content itself, but also from liability for how they organize and display that content.
Kravitz rejected Meta's argument, writing that Section 230 only immunizes social media companies from liability “for harms arising from particular third party content published on their platforms,” but not from liability regarding the organization and display of that material.
Kravitz also refused to dismiss the lawsuit on free speech grounds, writing that the case centers on Meta's design features, and not “the subject matter or viewpoint” of the content on the platform.
The lawsuit is one of numerous pending cases that take aim at social media platforms over the way they deliver content to teens. Tech companies typically argue in those cases that they are protected by both Section 230 and the First Amendment.
Judges across the country have come to different conclusions in those matters.
For instance, U.S. District Court Judge Yvonne Gonzalez Rogers in the Northern District of California ruled last year that Section 230 protected Meta, Google, TikTok and Snapchat from some claims in a lawsuit alleging that the companies designed their services to be addictive and then served teens potentially harmful material -- such as filtered photos that promote unrealistic beauty standards.
Rogers specifically said Section 230 immunized the social media platforms from liability for claims regarding recommendation algorithms, but she allowed other claims to move forward -- including claims that the platforms should have labeled photos edited with filters.
Also, earlier this week a federal judge in Utah blocked a law that would have required platforms to disable push notifications and refrain from automatically playing content on minors' accounts, among other restrictions. U.S. District Court Judge Robert Shelby said in that matter that the law likely violates the First Amendment.