Meta Platforms must face discrimination claims stemming from ad targeting, two separate appellate courts recently ruled.
On Friday, the 9th Circuit Court of Appeals confirmed a prior ruling that allowed Rosemarie Vargas and other Facebook users to proceed
with claims that the company wrongly allowed advertisers to prevent housing ads from being shown to users based on their race, sex, age or other comparable factors.
Meta had argued it was
protected by Section 230 of the Communications Decency Act -- which shields companies from liability over content posted by third parties.
The appellate panel rejected that argument in June,
writing that if the allegations in the complaint were proven true, Facebook would have been a “co-developer” of the ads.
After that opinion came out, Meta urged the 9th Circuit to
reconsider, arguing that the ruling conflicts with prior decisions about the scope of Section 230.
“Plaintiffs sued Facebook for providing neutral tools that unidentified third-party
advertisers allegedly chose to misuse,” the company wrote in its request for a new hearing. “The nature of those claims should have made this case easy under this court’s Section 230
case law, which holds that an interactive computer service provider cannot be held liable when it gives users of the service neutral tools they can use to communicate information.”
The
9th Circuit on Friday issued a slightly revised opinion, but the changes didn't affect the core holding that Meta
Platforms may have violated anti-discrimination laws.
Vargas's suit was one of several alleging that Facebook's ad-targeting options violated federal fair housing or employment laws. By the
time she filed the complaint, Facebook had already agreed to
revise its targeting options to prohibit advertisers of housing, employment or credit offers from targeting ads based on age, gender, ZIP code and ethnic affinity.
The 9th Circuit's move comes
just weeks after a California state appellate court separately ruled that Facebook's ad-targeting system may have violated state
civil rights laws. The decision in that matter stemmed from a lawsuit brought by Samantha Liapes, who alleged that the platform's ad-targeting algorithms didn't serve some life insurance ads to women
and older people.
The California Court of Appeal rejected Meta's argument that it was protected by Section 230, essentially ruling
that the allegations, if proven true, would show Facebook was acting as a content developer.
“According to the complaint, Facebook uses its internal data and analysis to determine what
specific people will receive ads,” the appellate judges wrote. “The algorithm relies heavily on age and gender to determine which users will actually receive any given ad. This occurs even
if an advertiser did not expressly exclude certain genders or older people. The algorithm then sends or excludes users from viewing ads based on protected characteristics such as age and gender.
Because the algorithm ascertains data about a user and then targets ads based on the users' characteristics, the algorithm renders Facebook more akin to a content developer.”
Santa
Clara University law professor Eric Goldman, who last week called attention to that ruling, wrote that it implies
“any gender- or age-based ad targeting for any product or service (and targeting based on any other protected characteristics)” could violate California's civil rights laws.
“This conclusion would have devastating effects on the entire internet ecosystem,” he wrote.
It's not yet clear whether Meta plans to seek further appeals in either case.