
The Department of Housing and
Urban Development Thursday sued Facebook over its ad-targeting system, which the government contends violates civil rights laws by blocking ads from being shown to people based on factors such as race
and gender.
“Because of the way respondent designed its advertising platform, ads for housing and housing-related services are shown to large audiences that are severely biased ... such
as audiences of tens of thousands of users that are nearly all men or nearly all women,” HUD alleges in an administrative law complaint.
The government's move comes one week after
Facebook agreed to settle civil rights lawsuits brought by the ACLU,
National Fair Housing Alliance and other organizations.
The settlement calls for Facebook to prohibit advertisers of housing, employment or credit offers from targeting ads based on age,
gender, ZIP code and ethnic affinity -- often used as a proxy for race. The deal also requires Facebook to pay around $5 million, and to create a tool that will allow people to search for all housing
ads in the U.S.
A company spokesperson said Facebook was surprised by HUD's decision to bring charges, given that the company had been working with HUD and had already taken “significant steps” to address discrimination in advertising.
“While we were eager to find a solution, HUD insisted on access to sensitive information -- like user data -- without adequate
safeguards,” the spokesperson said.
HUD's complaint includes allegations about the targeting options that Facebook offers advertisers as well as Facebook's ad delivery system.
Specifically, HUD alleges that Facebook's targeting system allows advertisers to block ads from being shown to people based on characteristics protected by civil rights laws.
“Respondent has provided a toggle button that enables advertisers to exclude men or women from seeing an ad, a search-box to exclude people who do not speak a specific language from seeing an
ad, and a map tool to exclude people who live in a specified area from seeing an ad by drawing a red line around that area,” the complaint alleges. “Respondent has offered advertisers
hundreds of thousands of attributes from which to choose, for example to exclude 'women in the workforce,' 'moms of grade school kids,' 'foreigners.'”
HUD also says in its
complaint that Facebook's ad-delivery tool operates in ways that discriminate even when advertisers would prefer to be inclusive.
“Even if an advertiser tries to target an audience that
broadly spans protected class groups, Respondent’s ad delivery system will not show the ad to a diverse audience if the system considers users with particular characteristics most likely to
engage with the ad,” the complaint alleges.
The legal battle over Facebook's targeting options dates to November 2016, soon after ProPublica reported that Facebook enabled advertisers to prevent their ads from being shown to users who belong to certain "ethnic affinity" groups -- including people the social networking company believes have a black, Asian-American or Hispanic ethnic affinity.
After ProPublica's initial report, Facebook updated its ad guidelines to strengthen prohibitions against discrimination based on race, ethnicity, color,
national origin, religion, age, sex, sexual orientation, gender identity, family status, disability, or medical or genetic condition.
The company also said it would require advertisers
offering housing and employment ads to certify compliance with anti-discrimination laws.
Last year, Facebook eliminated 5,000 ad-targeting options, including ones that enabled discrimination
based on ethnicity or religion. Among other segments, Facebook removed advertisers' ability to block ads from being seen by users interested in things like “Passover,”
“Evangelicalism,” “Native American culture,” “Islamic culture,” and “Buddhism.”
Advocacy group Free Press praised news of the lawsuit, while calling for a broader investigation of online platforms.
“Online advertisers and
major tech companies have been getting away with practices that deny people their basic civil rights for far too long,” Free Press Policy Counsel Gaurav Laroia stated Thursday.
“The algorithms that power the microtargeting of ads on these sites have been used to deny people opportunities in education, housing, jobs and lending -- all areas where our civil-rights laws
were designed to prevent discrimination.”
It's not clear whether online platforms like Facebook are liable for discriminatory ads, given that Section 230 of the Communications Decency Act shields platforms from liability for material posted by users.
“Courts have never definitively answered questions about Section 230 protection for Facebook's ad delivery system,” says
Internet law expert Eric Goldman, a professor at Santa Clara University.
Before Facebook agreed to settle the civil rights lawsuits, the company argued that the claims should be dismissed on the grounds that it was protected by Section 230. But the matter was resolved before any judge ruled on that argument.