Commentary

Meta Addresses Algorithmic Discrimination With New Ad Tech

It looks like one of many lawsuits filed against Meta Platforms is resulting in a positive change––what U.S. Attorney Damian Williams is calling a “groundbreaking resolution” that “sets a new standard for addressing discrimination through machine learning.”

Over the past year, Meta has worked with the Justice Department to develop new technology––a system known as the Variance Reduction System (VRS)––to help distribute ads in a more equitable way. Meta is now launching the VRS in the U.S. for housing ads––which will help the company comply with the Fair Housing Act––with plans to expand later this year to employment and credit ads.

According to the U.S. Justice Department, Meta’s development and launch of the VRS marks a “key milestone” in the June 2022 DOJ settlement with the tech giant, which resolved a lawsuit alleging that Meta’s ad-delivery system violated the Fair Housing Act.

On Monday, both parties informed the court that they had reached an agreement on the system’s compliance targets. Still, Meta will remain subject to court oversight and regular review of its compliance with the settlement through June 2026, to make sure its ad delivery doesn’t slip back into discriminatory targeting practices.

“This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

As for how the VRS works: the system measures the actual audience reached by each ad displayed across Meta’s various platforms––Facebook, WhatsApp, Instagram, Messenger––and adjusts delivery to ensure a broader spread of exposure.

“After the ad has been shown to a large enough group of people, the VRS measures aggregate demographic distribution of those who have seen the ad to understand how that audience compares with the demographic distribution of the eligible target audience selected by the advertiser,” explains Meta.

By measuring overall ad exposure and comparing it against aggregate audience estimates derived from U.S. Census statistics on race and ethnicity, the VRS is meant to ensure that Meta’s ad-delivery AI does not limit housing ads to certain ethnic or socioeconomic groups.
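Meta’s white paper doesn’t publish the system’s exact math, but the core loop it describes––measure the demographic mix of people who actually saw an ad, compare it to the mix of the eligible audience, and nudge delivery toward under-exposed groups––is easy to illustrate. The Python sketch below is a minimal, hypothetical rendering of that idea, not Meta’s implementation; the group names, counts, and clamp values are all invented for illustration.

```python
# A minimal sketch of the variance-reduction idea described above.
# NOT Meta's actual implementation: all names, counts, and clamp
# bounds here are hypothetical illustrations.

def delivery_boosts(seen_counts: dict[str, int],
                    eligible_counts: dict[str, int]) -> dict[str, float]:
    """Return a multiplicative delivery weight per demographic group.

    Groups under-represented among actual viewers (relative to their
    share of the eligible audience) get a weight > 1; over-represented
    groups get a weight < 1.
    """
    seen_total = sum(seen_counts.values())
    eligible_total = sum(eligible_counts.values())
    boosts = {}
    for group, eligible in eligible_counts.items():
        eligible_share = eligible / eligible_total
        seen_share = seen_counts.get(group, 0) / seen_total
        # Ratio of "should have seen" to "did see"; clamp to avoid
        # extreme swings in delivery from one measurement to the next.
        ratio = eligible_share / max(seen_share, 1e-9)
        boosts[group] = min(max(ratio, 0.5), 2.0)
    return boosts

# Hypothetical aggregate counts after an ad has reached a large audience.
seen = {"group_a": 7000, "group_b": 2000, "group_c": 1000}
eligible = {"group_a": 5000, "group_b": 3000, "group_c": 2000}
print(delivery_boosts(seen, eligible))
# group_a is over-exposed (weight < 1); groups b and c get boosted.
```

Note that, per Meta’s own description above, the real system works only on aggregate demographic distributions measured after an ad has reached a large enough audience––not on any individual viewer’s race or ethnicity, which Meta does not collect.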

It’s good (and necessary) that Meta is addressing its unjust, predatory, and sinister ad-targeting processes. But it’s worth remembering how these problems came to light, so they can be avoided in the future.

In 2017, an investigation by watchdog ProPublica found that advertisers were able to create Facebook ads that excluded people based on race, ethnicity, familial status, national origin, and disability.

For example, ProPublica was able to buy an ad in Facebook’s housing category that excluded Black, Hispanic, and Asian-American users from seeing it.

Facebook removed these ad-targeting options for housing, employment, and credit ads in 2019. The VRS is Meta’s latest step in that effort.

Still, all AI models are likely marred by bias to some degree. Because of this, the effects of Meta’s VRS will need to be measured closely with each subsequent update.

“The field of fairness in machine learning is a dynamic and evolving one,” Meta wrote in its white paper on the VRS. “Much of this work is unprecedented in the advertising industry and represents a significant technological advancement for how machine learning is responsibly used to deliver personalized ads.”
