Meta, the parent company of Facebook, and the United States government have agreed to settle a lawsuit that accused the social network of engaging in discriminatory advertising for housing, the US Department of Justice (DoJ) announced on June 21.
The settlement, which must be approved by a judge before it is final, requires Meta to stop using a discriminatory algorithm for housing ads and instead develop a system that will “address racial and other disparities caused by its use of personalization algorithms in its ad delivery system.”
In a statement on the same day, Meta announced that it will replace its Special Ad Audiences tool for housing ads, as well as for credit and employment ads. The Silicon Valley giant plans to tackle the issue with machine learning, building a system that will “ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.” Put simply, the new system is supposed to ensure that the demographic makeup of the people who actually see an ad matches that of the audience targeted by, and eligible to see, the ad. Meta will use age, gender, and estimated race or ethnicity to measure how far the actual audience drifts from the eligible one.
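To make the idea concrete, here is a minimal, hypothetical sketch of how such a disparity might be measured. Meta has not published its implementation details; the data format, function names, and choice of metric (total variation distance over demographic buckets) below are illustrative assumptions, not the company’s actual method:

```python
from collections import Counter

def distribution(users, attribute):
    """Share of users in each bucket of a demographic attribute
    (e.g. age band, gender, estimated race/ethnicity)."""
    counts = Counter(user[attribute] for user in users)
    total = sum(counts.values())
    return {bucket: n / total for bucket, n in counts.items()}

def disparity(eligible, delivered, attribute):
    """Total variation distance between the eligible audience's
    demographic mix and the delivered audience's mix:
    0 means the two distributions match exactly, 1 means they
    are completely disjoint."""
    p = distribution(eligible, attribute)
    q = distribution(delivered, attribute)
    buckets = set(p) | set(q)
    return 0.5 * sum(abs(p.get(b, 0) - q.get(b, 0)) for b in buckets)

# Hypothetical example: users represented as dicts of demographic attributes.
eligible = [{"age_band": "18-24"}] * 40 + [{"age_band": "25-54"}] * 60
delivered = [{"age_band": "18-24"}] * 10 + [{"age_band": "25-54"}] * 90

print(disparity(eligible, delivered, "age_band"))  # ~0.3: delivery skews older
```

A system like the one Meta describes would, in effect, keep adjusting ad delivery until a disparity measure of this kind falls below an agreed threshold for each attribute.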
As per the settlement, the company has to build the system into its platform and prove to the government that it works as intended by December 31, 2022.
Why does this matter?
The American Fair Housing Act is a federal law, enacted in 1968, that prohibits discrimination in the purchase, sale, rental, or financing of housing (private or public) based on race, colour, religion, sex, national origin, familial status, or disability. The fact that Meta had allegedly been violating this law for years, all the while mostly flying under the media’s radar, should be noted by governments and rights campaigners all over the world. India, where housing discrimination has been a long-running issue, also needs to look into similar ad algorithms.
What are the terms of the settlement?
The key terms of the parties’ settlement agreement are listed below:
- “By December 31, 2022, Meta must stop using an advertising tool for housing ads known as “Special Ad Audience” (previously called “Lookalike Audience”), which relies on an algorithm that, according to the United States, discriminates on the basis of race, sex, and other FHA-protected characteristics in identifying which Facebook users will be eligible to receive an ad,” the settlement states.
- Meta has until the end of this year to develop, and get government approval for, a new system for housing ads that will address the observed discriminatory disparities between advertisers’ targeted audiences and the group of users to whom Facebook’s personalization algorithms actually deliver the ads.
- If the United States concludes that Meta’s changes to its ad delivery system do not adequately address the discriminatory disparities, the settlement agreement will terminate and the DoJ will litigate its case against Meta in federal court.
- “If the new system is implemented, then the parties will select an independent, third-party reviewer to investigate and verify on an ongoing basis whether the new system is meeting the compliance standards agreed to by the parties,” the agreement notes. Under the settlement terms, Meta must provide the reviewer with all relevant information to verify compliance. However, the court will have the ultimate authority to resolve disputes over the reviewer’s findings.
- “Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics.” Under the agreement, Meta must notify the federal government if it intends to add any new targeting options, and the court will have the final say in any disputes between the parties about those proposed options.
- “Meta must pay to the United States a civil penalty of $115,054, the maximum penalty available under the Fair Housing Act,” the settlement concludes.
Why was Meta brought to court?
The algorithmic housing discrimination case traces back to 2019, when the US Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act, though accusations about the company’s practices go back years before that. The National Fair Housing Alliance and other groups had also sued the company in federal court, accusing Meta of enabling and encouraging advertisers to target their housing ads by relying on race, colour, religion, sex, disability, familial status, and national origin to decide which Facebook users would be eligible or ineligible to receive housing ads in certain regions.
Investigative news website ProPublica first documented in 2016 how Facebook allowed advertisers to place housing ads that excluded users based on their race, apparently in violation of the Fair Housing Act. A follow-up report in 2017 found that, despite promises, the company continued to allow similar exclusions. Facebook, in response, said it would temporarily block advertisers from using options that excluded users by race.
The groups behind that suit said they had conducted similar independent investigations and reached the same conclusion; those findings formed the basis of their case.
Also read:
- Updated Privacy Policy Reveals How Meta Processes User Data
- Meta Rescinds Request For Oversight Of Russia-Ukraine Policy
- Meta Says It Will Publish Data On Targeted Ads From June
