D.C.'s AG Sides Against Facebook In Battle Over 'Hate Speech'

The Attorney General of Washington, D.C. is siding against Facebook in a lawsuit brought by the group Muslim Advocates, which alleges that Facebook ran afoul of a local consumer protection law by failing to keep its promises to suppress “hate speech.”

Facebook recently urged District of Columbia Superior Court Judge Anthony Epstein to dismiss Muslim Advocates' lawsuit for numerous reasons, including that Section 230 of the Communications Decency Act protects the company from lawsuits over how it treats users' posts.

“All of plaintiff's claims challenge Facebook's alleged failure to remove third-party content from its platform, and therefore are barred,” the company wrote.

On Monday, Washington, D.C. Attorney General Karl Racine urged Epstein to reject that argument.

Racine contends that Section 230 doesn't apply in situations where a web company is accused of engaging in “affirmative deception.”

“Immunizing defendants from liability merely because their misleading statements addressed their content moderation policies would grant Facebook and other social media platforms unprecedented, unchecked power to deceive District consumers about the dangers of their goods and services,” he writes.

The battle dates to April, when Muslim Advocates alleged in a lawsuit that Facebook officials “consistently misrepresented the company’s actual practices when it comes to enforcing Facebook’s own standards and policies to keep Facebook free of hate speech and other harmful content.”

“Facebook has been used, among other things, to orchestrate the Rohingya genocide in Myanmar, mass murders of Muslims in India, and riots and murders in Sri Lanka that targeted Muslims for death,” the lawsuit alleges.

“Armed, anti-Muslim protests in the United States have been organized on Facebook event pages. The Christchurch, New Zealand, mosque massacres were live-streamed on Facebook.... If Facebook’s executives had enforced their own Community Standards and policies as they promised, a significant amount of the anti-Muslim hate and real-world damage could have been avoided,” the group added.

In September, Facebook urged Epstein to dismiss the lawsuit for a host of reasons. In addition to arguing that it was protected by Section 230, Facebook denied misrepresenting its content policies. The company said in its legal papers that public statements regarding the removal of objectionable content came with qualifications -- including that the company must be aware of the content, and make a judgment call about whether the material violates its standards.

Racine isn't the only attorney general to argue that social media platforms might run afoul of local consumer protection laws by allegedly misrepresenting their content policies.

Texas Attorney General Ken Paxton is taking a similar position in a separate battle with Twitter.

Shortly after Twitter suspended former President Donald Trump, Paxton launched an investigation of social media companies' content moderation policies, and demanded that Twitter (and other large platforms) turn over a trove of documents.

Twitter recently asked a federal appellate court to halt Paxton's investigation, arguing that the probe into editorial policies violates the First Amendment. (Even without Section 230, the First Amendment separately protects companies' decisions about editorial content.)

Paxton countered that his investigation doesn't run afoul of free speech principles, because he's investigating whether Twitter violated a state consumer protection law by misrepresenting its editorial policies.

“Even if Twitter has a First Amendment right to choose discriminatory content-moderation policies, the Constitution does not empower it to mislead consumers about those policies,” Paxton wrote in papers filed earlier this year with the 9th Circuit Court of Appeals.

The Reporters Committee for Freedom of the Press recently urged the 9th Circuit to reject that argument, writing that Paxton shouldn't be able to use Texas's consumer protection law as the basis for an investigation into Twitter's editorial practices.

“Were the government able to deploy consumer protection laws in this way, it would invariably seek to favor viewpoints perceived as supportive and disfavor viewpoints perceived as critical,” that group wrote in a friend-of-the-court brief filed in July.

Judges have sided with tech companies in other lawsuits brought by people who alleged the companies misrepresented their content-moderation policies. For instance, judges in California sided against Canadian journalist Meghan Murphy in her legal battle with Twitter.

Murphy, who was permanently banned from Twitter for allegedly violating its “hateful conduct policy” by referring to a transgender woman as a man, alleged in a class-action complaint that the ban amounted to a breach of contract between Twitter and its users.

“Twitter’s repeated representations that it would uphold the free speech rights of its users and not censor user speech were material to the decision of millions of users, like Murphy, to join,” she alleged.

In that case, the 1st District California Court of Appeal ruled that Twitter was protected by Section 230 of the Communications Decency Act.

