Tech Group Backs Meta In Fight Over Scam Ads

The tech industry organization NetChoice is backing Meta's request for an immediate appeal in a dispute over its potential liability for scam ads.

The group argued in a proposed friend-of-the-court brief filed Wednesday that an appellate court should decide the key issue in the dispute -- whether "aspirational" statements in companies' terms of service give users grounds to sue.

NetChoice's brief comes two months after U.S. District Court Judge Jeffrey White in the Northern District of California ruled that Facebook users who allegedly lost money after responding to scam ads could proceed with a lawsuit against the platform.

The ruling stemmed from a lawsuit brought in 2021 by Facebook users including Oregon resident Christopher Calise, who alleged in a class-action complaint that he was bilked out of around $49 after attempting to purchase a car-engine assembly kit that was advertised on the site.

White based his decision on language in a former version of Meta's terms of service and its "community standards" section.

The 2021 terms of service provided that Meta would "take appropriate action" regarding harmful content, and the "community standards" section said the company would remove fraudulent content, according to documents provided to the court by the plaintiffs' counsel.

White characterized those statements as "unambiguous and well-defined promises from Meta to users," and allowed the plaintiffs to proceed with claims that Meta broke its contract with them.

Meta is now seeking to appeal that ruling to the 9th Circuit Court of Appeals, arguing that appellate judges should decide whether statements in the terms of service and community standards created a legally enforceable obligation to take action regarding "purported scam advertisements."

NetChoice -- which counts large tech companies including Meta, Amazon and Google as members -- also wants an appellate court to intervene in the case.

"If every aspirational statement about maintaining a 'safe environment' or taking 'appropriate action' against harmful content can be construed as an enforceable promise, platforms face an impossible choice: either create vague, meaningless policies that provide no guidance to users, or face endless breach of contract litigation whenever their content moderation decisions fall short of perfection," NetChoice writes.

"Platforms will thus be discouraged from making stronger commitments to their users at the risk of increasing potential liability," it adds.

The organization adds that other judges have reached different decisions in lawsuits by users who claimed tech companies violated their terms of service by failing to take down material. Among other examples, NetChoice noted that in 2009, a federal judge threw out a lawsuit accusing Google of displaying fraudulent ringtone ads.

Earlier in the proceedings, White dismissed the lawsuit against Meta on the grounds that Section 230 of the Communications Decency Act protected Meta from liability for ads placed by third parties.

That law broadly protects web companies from lawsuits over posts by outside individuals and companies. Calise appealed to the 9th Circuit, which ruled Section 230 didn't protect Meta from claims that the company broke its contract with users by allegedly violating its terms of service.
