Incident 93: HUD charges Facebook with enabling housing discrimination

Description: In March 2019 the U.S. Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act by allowing real estate sellers to target advertisements in a discriminatory manner.

Suggested citation format

Dadkhahnikoo, Neama. (2018-08-13) Incident Number 93. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
93
Report Count
4
Incident Date
2018-08-13
Editors
Sean McGregor, Khoa Lam

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In March 2019, the U.S. Department of Housing and Urban Development (HUD) charged Facebook with violating the Fair Housing Act. HUD claims the platform's ad-targeting options enabled advertisers to illegally restrict the housing options presented to marginalized groups. In a similar case brought by a coalition of civil rights organizations, Facebook reached a settlement and agreed to several changes to its platform: real estate sellers can no longer target ads by age, gender, or zip code, and Facebook created a housing portal that allows users to view all available housing listings. HUD alleges that, despite these changes, Facebook's AI and machine learning tools create proxy classifications that continue to enable advertisers to discriminate against protected groups.

Short Description

In March 2019 the U.S. Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act by allowing real estate sellers to target advertisements in a discriminatory manner.

Severity

Moderate

Harm Distribution Basis

Race, Age, Sex

Harm Type

Harm to civil liberties

AI System Description

Facebook's algorithms, which provide user classifications to advertisers to enable ad targeting.

System Developer

Facebook

Sector of Deployment

Arts, entertainment and recreation

Relevant AI functions

Cognition

AI Techniques

machine learning

AI Applications

data analytics, classification

Location

United States

Named Entities

Facebook, Department of Housing and Urban Development, HUD, National Fair Housing Alliance, Ben Carson, American Civil Liberties Union

Technology Purveyor

Facebook

Beginning Date

2019-03-01

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Laws Implicated

Fair Housing Act

Data Inputs

user Facebook activity, user social network data

Incident Reports

The Department of Housing and Urban Development (HUD) on Thursday charged Facebook with encouraging and enabling housing discrimination through its targeted advertising practices.

HUD is charging Facebook with violating the Fair Housing Act, federal legislation that prohibits discrimination against people seeking to buy or rent a home.

"Facebook is discriminating against people based upon who they are and where they live,” HUD Secretary Ben Carson said in a statement. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

The charge follows a months-long investigation by HUD into whether Facebook illegally allows real estate sellers to restrict their advertisements by characteristics such as race.

Facebook earlier this month agreed to enact sweeping reforms to its ad-targeting system as part of a settlement with civil rights groups that had brought similar complaints. The rights groups, including one dedicated to housing, alleged the tech giant allowed advertisers to discriminate against marginalized groups.

As part of that settlement, Facebook will no longer allow advertisers to target or exclude housing ads by age, gender or zip code, and it also removed hundreds of targeting options for anyone advertising housing, credit or employment opportunities.

The company as part of the settlement also said it will create a new portal to allow users to search for and view housing ads in the U.S. regardless of who the advertisers hoped to target.

Facebook said it was surprised by HUD's decision, and said it had enacted changes to its platform to undercut misuse by advertisers.

"We're surprised by HUD's decision, as we've been working with them to address their concerns and have taken significant steps to prevent ads discrimination," a Facebook spokesperson said. "Last year we eliminated thousands of targeting options that could potentially be misused, and just last week we reached historic agreements with the National Fair Housing Alliance, [American Civil Liberties Union] ACLU, and others that change the way housing, credit and employment ads can be run on Facebook."

"While we were eager to find a solution, HUD insisted on access to sensitive information — like user data — without adequate safeguards," the spokesperson added. "We're disappointed by today’s developments, but we’ll continue working with civil rights experts on these issues.”

According to HUD, Facebook allowed advertisers to exclude people from seeing housing advertisements based on interests that "closely align with the Fair Housing Act's protected classes," including users whom Facebook classified as non-American-born, non-Christian, interested in accessibility, or interested in Hispanic culture, among other groups.

Facebook last August removed thousands of targeting options that advertisers could use to exclude audiences based on ethnicity, gender and religion, the company noted.

HUD claims that Facebook allowed advertisers to exclude people based on their neighborhood "by drawing a red line around those neighborhoods on a map."

Finally, HUD's charge asserts that Facebook's machine learning and artificial intelligence (AI) tools "classify and group users to project each user's likely response to a given ad," potentially creating groupings defined by their protected class.

Facebook says HUD has not found evidence that its AI systems discriminate against people.

The debate over Facebook's targeted ad practices was originally sparked when ProPublica reported in 2016 that Facebook was allowing advertisers to exclude certain users based on "ethnic affinity," which critics have argued is used as a proxy for race and ethnicity.

Since then, housing, employment and civil rights groups have raised concerns that Facebook was side-stepping civil rights laws by allowing advertisers to target or exclude certain groups closely aligned with protected classes of people.

"Even as we confront new technologies, the fair housing laws enacted over half a century ago remain clear—discrimination in housing-related advertising is against the law," HUD General Counsel Paul Compton said. "Just because a process to deliver advertising is opaque and complex doesn’t mean that it’s [sic] exempts Facebook and others from our scrutiny and the law of the land."

HUD charges Facebook with enabling housing discrimination

The federal government is suing Facebook over allegations of housing discrimination in the social network's advertising platform.

The U.S. Department of Housing and Urban Development on Thursday announced that it is charging Facebook with violations of the Fair Housing Act stemming from an ad-serving program that determines which users get to see housing advertisements based on race, gender, family status and other characteristics protected under the statute.

"Facebook is discriminating against people based upon who they are and where they live," HUD secretary Ben Carson said in a statement. "Using a computer to limit a person's housing choices can be just as discriminatory as slamming a door in someone's face."

"Even as we confront new technologies," said HUD general counsel Paul Compton, "discrimination in housing-related advertising is against the law."

The charges come close on the heels of a settlement Facebook reached with civil rights groups in which it agreed to modify its advertising program to limit the ability of lenders, creditors and employers to reach users on the basis of race, gender and other protected categories.

"We're surprised by HUD's decision, as we've been working with them to address their concerns and have taken significant steps to prevent ads discrimination," Facebook spokeswoman Elisabeth Diana said in an emailed statement.

Facebook says that it had been collaborating with HUD to address the issue of housing discrimination through its ad platform, but that federal authorities had overstepped in their demands for access to the company's network of users.

"While we were eager to find a solution, HUD insisted on access to sensitive information -- like user data -- without adequate safeguards," Facebook said. "We're disappointed by today's developments, but we'll continue working with civil rights experts on these issues."

Announcing changes to its ad platform under the settlement earlier this month, Facebook chief operating officer Sheryl Sandberg explained that housing, job and credit ads would no longer be eligible for some of Facebook's microtargeting categories. Accordingly, "any detailed targeting option describing or appearing to relate to protected classes" would be off-limits, as would targeting by age, gender and ZIP code, Sandberg wrote.

"Getting this right is deeply important to me and all of us at Facebook because inclusivity is a core value for our company," Sandberg said.

HUD had earlier been probing Facebook's policies for targeting housing ads, and last August had filed a complaint against Facebook alleging discrimination on the basis of "race, color, religion, sex, familial status, national origin and disability."

Now that talks between the two parties have broken down, HUD is alleging a host of violations of the Fair Housing Act, claiming that Facebook allowed advertisers to structure their campaigns so that their ads would not be seen by users identified as non-Christian, foreign-born, or interested in accessibility, among other characteristics covered by the law.

HUD Is Suing Facebook For Housing Discrimination

Last week I watched a webinar (additional cost) that is available on the NAFCU Online Training Center titled Red Flags for Fair Lending. The webinar was presented live in March of this year, on the same day that the United States Department of Housing and Urban Development (HUD) charged Facebook with violating the Fair Housing Act. Despite having little advance notice of the charge, the presenters did a great job of explaining what was at issue in the charge against Facebook and what that might mean for credit unions and other financial institutions attempting to comply with the Fair Housing Act's requirements. The Fair Housing Act prohibits discriminating against any person with respect to the terms and conditions of the sale or rental of a dwelling, or in providing any services in conjunction with a sale or rental, on the basis of race, color, religion, sex, familial status, national origin, or disability. When a dwelling is involved, NCUA's nondiscrimination in lending rule prohibits discrimination against any person in the lending context on similar bases.

The HUD Charge of Discrimination against Facebook followed a Housing Discrimination Complaint HUD filed against Facebook in August 2018. In the Housing Discrimination Complaint, HUD alleged that 

Facebook unlawfully discriminates by enabling advertisers to restrict which Facebook users receive housing-related ads based on race, color, religion, sex, familial status, national origin and disability. Facebook mines extensive user data and classifies its users based on protected characteristics. Facebook's ad targeting tools then invite advertisers to express unlawful preferences by suggesting discriminatory options, and Facebook effectuates the delivery of housing-related ads to certain users and not others based on those users' actual or imputed protected traits.

Both the Housing Discrimination Complaint and Charge of Discrimination contained specific allegations about how Facebook's advertising platform worked. As alleged in the Charge of Discrimination, Facebook determined which users would ultimately end up seeing an advertisement based on a two-part process.

Part one of the Facebook advertising process involved advertisers selecting "attributes that the users who will be shown the ad must have and attributes that users who will be shown the ad must not have." This first part of the process also enabled advertisers to include or exclude specific individuals, as well as users who might share commonalities with those specifically identified individuals. The latter group - Facebook users who might share commonalities with the specifically identified individuals - would be identified by Facebook by considering "sex and close proxies for the other protected classes."

In the second part of the process, Facebook determined which of its users would actually see an advertisement based "in large part on the inferences and predictions it draws about each user's likelihood to respond to an ad based on the data it has about that user, the data it has about other users whom it considers to resemble that user, and the data it has about 'friends' and other associates of that user."
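
To make the two-part process described above concrete, the following is a minimal, hypothetical sketch in Python. Every name in it (User, stage_one_targeting, stage_two_delivery, and the attribute labels) is invented for illustration and does not come from Facebook's actual systems; the point is only to show how a neutral-looking advertiser filter (part one) can still produce a skewed audience if the delivery model's response predictions (part two) correlate with protected traits.

```python
# Hypothetical two-stage ad pipeline, invented for illustration only;
# names and logic do not reflect Facebook's actual implementation.
from dataclasses import dataclass

@dataclass
class User:
    user_id: int
    attributes: set            # interest/demographic labels
    predicted_response: float  # output of a learned relevance model

def stage_one_targeting(users, required, excluded):
    """Advertiser-chosen eligibility filter: keep users who have every
    required attribute and none of the excluded ones."""
    return [u for u in users
            if required <= u.attributes and not (excluded & u.attributes)]

def stage_two_delivery(eligible, budget):
    """Platform-side delivery: rank eligible users by predicted response
    and show the ad only to the top slice the budget covers. If the
    prediction correlates with protected traits, this stage alone can
    skew who actually sees the ad, even under a neutral filter."""
    ranked = sorted(eligible, key=lambda u: u.predicted_response, reverse=True)
    return ranked[:budget]

users = [User(1, {"homeowner", "parenting"}, 0.91),
         User(2, {"homeowner"}, 0.40),
         User(3, {"renting", "accessibility"}, 0.85)]
eligible = stage_one_targeting(users, required={"homeowner"}, excluded=set())
shown = stage_two_delivery(eligible, budget=1)
print([u.user_id for u in shown])  # [1]: user 2 was eligible but never saw the ad
```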

The risk to credit unions pointed out by the NAFCU webinar presenters is that the ability to target members and other customers based on their address, age, gender, or interests may raise the risk of excluding members or potential members who are part of a protected class. Thus, the challenge is to develop a holistic marketing plan in light of the risk of exclusion.

The March 2019 webinar also covered more issues related to inclusion and exclusion, pricing practices, underwriting practices, and fair lending programs. If you are looking for more guidance about fair lending issues, NAFCU is also running a Fair Lending 101 webinar (additional cost) on June 4th.

HUD v. Facebook

The Department of Justice announced today that it has obtained a settlement agreement resolving allegations that Meta Platforms Inc., formerly known as Facebook Inc., has engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed agreement resolves a lawsuit filed today in the U.S. District Court for the Southern District of New York alleging that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status and national origin. The settlement will not take effect until approved by the court.

Among other things, the complaint alleges that Meta uses algorithms in determining which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the FHA. This is the department’s first case challenging algorithmic bias under the Fair Housing Act.

Under the settlement, Meta will stop using an advertising tool for housing ads (known as the “Special Ad Audience” tool) that, according to the department’s complaint, relies on a discriminatory algorithm. Meta also will develop a new system to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. That system will be subject to Department of Justice approval and court oversight.

This settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”  

“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” said U.S. Attorney Damian Williams for the Southern District of New York. “Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”  

“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, the Principal Deputy Assistant Secretary for Fair Housing and Equal Opportunity at the Department of Housing and Urban Development (HUD). “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”

United States’ Lawsuit

The United States’ complaint challenges three key aspects of Meta’s ad targeting and delivery system. Specifically, the department alleges that:

  • Meta enabled and encouraged advertisers to target their housing ads by relying on race, color, religion, sex, disability, familial status and national origin to decide which Facebook users will be eligible and ineligible to receive housing ads.
  • Meta created an ad targeting tool known as “Lookalike Audience” or “Special Ad Audience.” The tool uses a machine-learning algorithm to find Facebook users who share similarities with groups of individuals selected by an advertiser using several options provided by Facebook. Facebook has allowed its algorithm to consider FHA-protected characteristics — including race, religion and sex — in finding Facebook users who “look like” the advertiser’s source audience and thus are eligible to receive housing ads. (A simplified sketch of this kind of similarity matching appears after this list.)
  • Meta’s ad delivery system uses machine-learning algorithms that rely in part on FHA-protected characteristics — such as race, national origin and sex — to help determine which subset of an advertiser’s targeted audience will actually receive a housing ad.
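
The following is a rough, hypothetical sketch of the similarity matching a lookalike-style tool might perform; the function names, feature vectors, and data are all invented, and this is not Meta's code. What it illustrates is that if protected characteristics, or close proxies such as a zip-code income decile, are encoded in the user feature vectors, they participate directly in deciding who "looks like" the seed audience.

```python
# Hypothetical lookalike-audience sketch: expand a seed audience to the
# candidate users most similar to it. Invented for illustration only.
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalike(seed_vectors, candidates, k):
    """Score each candidate by its best similarity to any seed user and
    return the top k. Whatever traits the vectors encode -- including
    protected ones or proxies for them -- drive who qualifies."""
    scored = [(uid, max(cosine(vec, s) for s in seed_vectors))
              for uid, vec in candidates.items()]
    scored.sort(key=lambda t: t[1], reverse=True)
    return [uid for uid, _ in scored[:k]]

# Features: [owns_home, clicks_on_listings, zip_code_income_decile (a proxy)]
seed = [[1.0, 0.9, 0.8], [1.0, 0.7, 0.9]]
pool = {"u1": [1.0, 0.8, 0.9], "u2": [0.0, 0.2, 0.3], "u3": [1.0, 0.6, 0.7]}
print(lookalike(seed, pool, k=2))  # ['u1', 'u3']
```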

The complaint alleges that Meta has used these three aspects of its advertising system to target and deliver housing-related ads to some Facebook users while excluding other users based on FHA-protected characteristics.

The department’s lawsuit alleges both disparate treatment and disparate impact discrimination. The complaint alleges that Meta is liable for disparate treatment because it intentionally classifies users on the basis of FHA-protected characteristics and designs algorithms that rely on users’ FHA-protected characteristics. The department further alleges that Meta is liable for disparate impact discrimination because the operation of its algorithms affects Facebook users differently on the basis of their membership in protected classes.
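
One concrete way to see what "affects Facebook users differently" means in practice is to compare ad delivery rates across groups. The sketch below is a hypothetical example of such a measurement; the group labels, counts, and the 0.8 threshold (the "four-fifths" rule of thumb sometimes used in discrimination analysis) are illustrative and not drawn from the case record.

```python
# Hypothetical disparate-impact check over ad delivery logs.
# Groups, counts, and threshold are illustrative only.
from collections import Counter

def delivery_rates(impressions, eligible):
    """Fraction of eligible users in each group who actually saw the ad."""
    shown = Counter(impressions)  # group label -> number of users shown
    return {g: shown[g] / n for g, n in eligible.items()}

def disparity_ratio(rates):
    """Lowest group delivery rate divided by the highest. Values below
    roughly 0.8 are often flagged for review under the four-fifths rule."""
    return min(rates.values()) / max(rates.values())

eligible = {"group_a": 1000, "group_b": 1000}        # users targeted per group
impressions = ["group_a"] * 620 + ["group_b"] * 310  # one label per ad shown
rates = delivery_rates(impressions, eligible)
print(rates)                   # {'group_a': 0.62, 'group_b': 0.31}
print(disparity_ratio(rates))  # 0.5, well below the 0.8 rule of thumb
```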

Settlement Agreement

These are the key features of the parties’ settlement agreement:

  • By Dec. 31, 2022, Meta must stop using an advertising tool for housing ads known as “Special Ad Audience” (previously called “Lookalike Audience”), which relies on an algorithm that, according to the United States, discriminates on the basis of race, sex and other FHA-protected characteristics in identifying which Facebook users will be eligible to receive an ad.
  • Meta has until December 2022 to develop a new system for housing ads to address disparities for race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads. If the United States concludes that this new system sufficiently addresses the discriminatory disparities that Meta’s algorithms introduce, then Meta will fully implement the new system by Dec. 31, 2022. (A simple way to quantify such disparities is sketched after this list.)
  • If the United States concludes that Meta’s changes to its ad delivery system do not adequately address the discriminatory disparities, the settlement agreement will terminate and the United States will litigate its case against Meta in federal court.
  • The parties will select an independent, third-party reviewer to investigate and verify on an ongoing basis whether the new system is meeting the compliance standards agreed to by the parties. Under the agreement, Meta must provide the reviewer with any information necessary to verify compliance with those standards. The court will have ultimate authority to resolve disputes over the information that Meta must disclose.
  • Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics. Under the agreement, Meta must notify the United States if Meta intends to add any targeting options. The court will have authority to resolve any disputes between the parties about proposed new targeting options.
  • Meta must pay to the United States a civil penalty of $115,054, the maximum penalty available under the Fair Housing Act.
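
The disparities referred to in the second bullet above are gaps between the make-up of the audience an advertiser targeted and the make-up of the audience the delivery algorithms actually reached. A minimal, hypothetical way to quantify that gap is the total variation distance between the two audience compositions; the sketch below is illustrative only, and the group labels and counts are invented.

```python
# Hypothetical sketch: compare the demographic composition of a targeted
# audience with the audience actually reached. Data is invented.

def composition(counts):
    """Normalize raw per-group counts into shares that sum to 1."""
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

def total_variation(p, q):
    """Total variation distance between two compositions: 0.0 means
    identical group shares, 1.0 means completely disjoint audiences."""
    groups = set(p) | set(q)
    return 0.5 * sum(abs(p.get(g, 0.0) - q.get(g, 0.0)) for g in groups)

targeted = composition({"group_a": 5000, "group_b": 5000})
delivered = composition({"group_a": 4200, "group_b": 1800})
print(total_variation(targeted, delivered))  # 0.2 -> delivery skewed toward group_a
```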

The Justice Department’s lawsuit is based in part on an investigation and charge of discrimination by HUD, which found that all three aspects of Meta’s ad delivery system violated the Fair Housing Act. When Facebook elected to have the HUD charge heard in federal court, HUD referred the matter to the Justice Department for litigation.

This case is being handled jointly by the Justice Department’s Civil Rights Division and the U.S. Attorney’s Office for the Southern District of New York.

Assistant Attorney General Kristen Clarke and U.S. Attorney Damian Williams thanked the Department of Housing and Urban Development for its efforts in the investigation.

The Fair Housing Act prohibits discrimination in housing on the basis of race, color, religion, sex, familial status, national origin and disability. More information about the Civil Rights Division and the laws it enforces is available at www.justice.gov/crt. More information about the U.S. Attorney’s Office for the Southern District of New York is available at www.justice.gov/usao-sdny. Individuals who believe they have been victims of housing discrimination may submit a report online at www.civilrights.justice.gov, or may contact the Department of Housing and Urban Development at 1-800-669-9777 or through its website at www.hud.gov.

Justice Department Secures Groundbreaking Settlement Agreement with Meta Platforms, Formerly Known as Facebook, to Resolve Allegations of Discriminatory Advertising