SAN FRANCISCO. Meta agreed on Tuesday to change its ad technology and pay a $115,054 penalty to settle Justice Department allegations that the company’s ad systems discriminated against Facebook users by restricting who could see housing ads on the platform based on their race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and deploy a new computer-based method that regularly checks whether the people who are targeted by and eligible to receive housing ads actually see those ads. The new method, called the “variance reduction system,” relies on machine learning to help ensure that housing ads reach the protected classes of people that advertisers intend to serve.
“Meta will, for the first time, change its ad delivery system to address algorithmic discrimination,” Damian Williams, the U.S. attorney for the Southern District of New York, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will move forward with the litigation.”
Facebook, which became a business colossus by collecting data on its users and letting advertisers target ads based on an audience’s characteristics, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems let marketers choose who sees their ads by using thousands of characteristics, which also lets advertisers exclude people who fall under a number of protected categories, such as race, gender and age.
The Justice Department filed both its lawsuit against Meta and the proposed settlement on Tuesday. In the suit, the agency said it had concluded that “Facebook could achieve its interests in maximizing its revenue and providing relevant ads to users through less discriminatory means.”
While the settlement applies specifically to housing ads, Meta said it also planned to use its new system to check the targeting of ads related to employment and credit. The company has previously faced blowback for bias against women in job ads and for excluding certain groups of people from seeing credit card ads.
Biased ad targeting has been an especially contentious issue in housing advertising. In 2016, Facebook’s potential for ad discrimination was revealed in an investigation by ProPublica, which showed that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.
In 2018, Ben Carson, who was then the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to a “diverse audience,” even when an advertiser wanted the ad to be seen broadly.
“Facebook is discriminating against people based upon who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
The Justice Department’s lawsuit and settlement are based partly on HUD’s 2019 investigation and discrimination charge against Facebook.
In its own tests of the issue, the U.S. Attorney’s Office for the Southern District of New York found that Meta’s ad systems steered housing ads away from certain categories of people, even when advertisers were not aiming to do so. The ads were steered “disproportionately to white users and away from Black users, and vice versa,” according to the Justice Department’s complaint.
The complaint alleged that many housing ads in neighborhoods where the majority of residents were white were also directed primarily to white users, while housing ads in predominantly Black neighborhoods were shown mainly to Black users. As a result, the complaint said, Facebook’s algorithms “actually and predictably reinforce or perpetuate segregated housing patterns because of race.”
In recent years, civil rights groups have also been pushing back against the vast and complicated advertising systems that underpin some of the largest internet platforms. The groups have argued that those systems have inherent biases built into them, and that tech companies like Meta, Google and others should do more to weed out those biases.
The area of study known as “algorithmic fairness” has been a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm bell on such biases for years.
In the years since, Facebook has pared back the types of categories that marketers could choose from when purchasing housing ads, cutting the number down to hundreds and eliminating options to target ads based on race, age and ZIP code.
Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it was “enormously important” that “fair housing laws be aggressively enforced.”
“Housing ads had become tools for unlawful behavior, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea they were either being targeted for, or denied, housing ads based on their race and other characteristics.”
Meta’s new ad technology, which is still in development, will occasionally check who is being served ads for housing, employment and credit, and make sure those audiences match up with the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift the ads to be served more equitably among a broader and more varied audience.
“We’re going to be occasionally taking a snapshot of marketers’ audiences, seeing who they target, and removing as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and a deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”
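The description above suggests a feedback loop: periodically measure the demographics of who actually saw an ad, compare that against the advertiser’s eligible audience, and nudge delivery to shrink any gap. Meta has not published the internals of the variance reduction system, so the sketch below is purely illustrative; the function names, group labels and proportional-adjustment step are all assumptions, not Meta’s actual method.

```python
# Illustrative sketch only: Meta's real "variance reduction system" is
# proprietary and unpublished. All names and logic here are assumptions.

def measure_variance(delivered, eligible):
    """Per-group gap between the share of impressions actually delivered
    and that group's share of the advertiser's eligible audience."""
    total_d = sum(delivered.values())
    total_e = sum(eligible.values())
    return {g: delivered.get(g, 0) / total_d - eligible[g] / total_e
            for g in eligible}

def adjust_weights(weights, gaps, step=0.5):
    """Nudge per-group delivery weights against the measured gaps, so
    over-served groups are dampened and under-served groups are boosted."""
    return {g: max(w * (1 - step * gaps.get(g, 0.0)), 0.0)
            for g, w in weights.items()}

# Hypothetical snapshot: the eligible audience is split evenly between
# two groups, but delivery has skewed 80/20 toward group_a.
eligible  = {"group_a": 500, "group_b": 500}   # advertiser's targeted audience
delivered = {"group_a": 800, "group_b": 200}   # impressions actually shown
weights   = {"group_a": 1.0, "group_b": 1.0}   # current delivery weights

gaps = measure_variance(delivered, eligible)   # group_a over-served by ~0.3
weights = adjust_weights(weights, gaps)        # dampen group_a, boost group_b
print(weights)
```

Repeating this measure-and-adjust cycle over successive snapshots is one plausible way the delivered audience could be pulled back toward the eligible one, which matches the periodic “snapshot” behavior Mr. Austin describes.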
Meta said it will work with HUD in the coming months to incorporate the technology into Meta’s ad targeting systems, and has agreed to a third-party audit of the new system’s effectiveness.
The company also said it would no longer use a feature called “special ad audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. Meta said the tool was an early effort to fight against biases, and that its new methods would be more effective.
The $115,054 penalty that Meta agreed to pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.
“The public should know the latest abuse by Facebook was worth the same amount of money Meta makes in about 20 seconds,” said Jason Kint, the chief executive of Digital Content Next, an association for premium publishers.
As part of the settlement, Meta did not admit to any wrongdoing.