The Breakdown

Does Facebook Still Sell Discriminatory Ads?

We found discriminatory ads can still appear, despite Facebook's efforts

In May, a Wisconsin health care agency, Tenderness Health Care, posted a job ad on Facebook looking for personal care workers. According to Facebook’s “Why am I seeing this ad” pop-up, when the agency purchased the ad, it asked Facebook not to show it to anyone over 54 years of age. And it asked Facebook to show the ad specifically to people who have an “African American multicultural affinity.” Facebook, apparently, complied.

The problem? Federal law prohibits employers from discriminating on the basis of age and race, including in advertising open jobs. 

When The Markup brought the ad to Facebook’s attention, the company took it down, according to Tom Channick, a Facebook spokesperson. (Tenderness’s CEO, Julio Fernandez, didn’t respond to The Markup’s requests for comment.) 

This is not the first time Facebook has been accused of allowing discrimination on its platform. 

Civil rights groups have accused Facebook of violating federal antidiscrimination laws and filed at least five lawsuits against the company, which culminated in a landmark settlement last year. In it, Facebook agreed to change its practices that enabled advertisers to exclude people of certain genders, ages, and “multicultural affinities” from seeing housing, job, and financial services ads. (Facebook does not collect data on users’ race but did until recently have a “multicultural affinity” category, which it did not clearly define.)

The company is still defending another lawsuit, filed by federal housing authorities.

The bulk of the legal issues centered on the way much of online advertising, including Facebook’s, works. Unlike many print ads and early internet ads, which appeared next to related content, Facebook’s advertising targets people.

Facebook’s detailed data trove on its users allows companies advertising everything from jobs to jeans to choose from a sprawling menu of demographic characteristics and interests when choosing who sees their ads—and that has earned the company billions of dollars year after year. 

Facebook has defended that practice, arguing that companies might legitimately use age-based targeting to tailor their messages or to reach underrepresented groups. The company has also said that its written policies ban discrimination.

And targeting can sometimes be innocuous. A 21-plus hip-hop venue in Atlanta might have good reasons to not want to spend money advertising to 18-year-olds, people living in New York, or people whose online behavior reveals no interest in hip-hop. 

But there are circumstances where targeting an advertisement based on a person’s characteristics is illegal. Federal laws prohibit discrimination in housing, employment, and financial services; in the case of housing, the law explicitly covers advertisements. These regulations were imposed after decades of landlords and neighborhoods barring minority and immigrant renters and homeowners; employers discriminating based on race, gender, and sexuality; and creditors selling minorities bad loan rates and financial products.

[Screenshot of a Facebook ad for Tenderness Health Care. Caption: This ad was shown specifically to people who have “African American multicultural affinity” and are under 55 years old. Credit: Facebook Ad Library]

“Helping prevent discrimination in employment opportunity ads is an area where we lead the industry,” the Facebook spokesperson, Channick, said in an email.

So why was The Markup still able to find a discriminatory job ad? Channick wouldn’t say. 

The company uses automated controls to limit when and how advertisers can target ads based on age and gender, in line with its civil settlement.

Channick didn’t respond to a question about whether Facebook’s automated controls always work.

Facebook had already forbidden the use of “multicultural” categories for targeting job ads. But the categories remained available for other kinds of ads, where they were used by civil and voting rights groups, Black-focused health advocacy groups, politicians on both sides of the aisle, and commercial advertisers.

A week after The Markup contacted Facebook, Facebook announced it was eliminating the multicultural affinity categories altogether after years of internal debate.

Aside from allowing advertisers to target specific audiences, some critics say Facebook discriminates all on its own in delivering those ads. That’s what the Department of Housing and Urban Development argues in its lawsuit.

Facebook’s Algorithm Has Its Own Biases

Once an advertiser picks its target audience, Facebook’s algorithm makes its own decisions about which specific users within that audience will actually be shown the ad.

The goal of that algorithm, Facebook has said, is to try to predict which people might be interested in the ad, based on their Facebook activity and the demographics it can pull from their personal pages—including age, gender, and other factors.
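To make the critics’ concern concrete, here is a deliberately simplified sketch of how an engagement-prediction model can skew delivery. This is not Facebook’s actual system; the function, feature names, and weights below are all invented for illustration.

```python
# Hypothetical sketch of engagement-based ad delivery.
# NOT Facebook's actual system; all names and weights are invented.

def score_ad(user, weights):
    """Predict how likely `user` is to engage with an ad.

    The model needs no explicit "show this to women" rule: if past
    engagement data is skewed, a demographic feature picks up that
    skew automatically.
    """
    return sum(weights.get((feature, value), 0.0)
               for feature, value in user.items())

# Weights "learned" from historical clicks can encode a stereotype,
# e.g., if women clicked past secretary-job ads more often.
learned_weights = {
    ("gender", "female"): 0.8,          # demographic signal
    ("interest", "office_admin"): 0.5,  # behavioral signal
}

audience = [
    {"gender": "female", "interest": "office_admin"},
    {"gender": "male", "interest": "office_admin"},
]

# Within the advertiser's target audience, the ad goes to whoever
# scores highest; here the gender weight alone decides the winner.
ranked = sorted(audience, key=lambda u: score_ad(u, learned_weights),
                reverse=True)
print(ranked[0])  # {'gender': 'female', 'interest': 'office_admin'}
```

In this toy version, two users with identical interests get different scores purely because of the gender feature, which is the pattern the researchers below describe.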

“Facebook is not giving the user what the user wants—Facebook is giving the user what it thinks a demographic stereotype wants,” wrote the Lawyers’ Committee for Civil Rights Under Law in a court filing.

In 2019, researchers at Northeastern University and the digital civil rights advocacy group Upturn ran dozens of their own ads, all targeted to the same broad audience, and found that Facebook showed the ads seeking to hire secretaries mostly to women.

It’s not clear whether Facebook programmed its algorithms to show the secretary job ad to any given woman because she herself did something to express an interest in secretarial work, or simply because she was a woman.

A team at Carnegie Mellon University recently analyzed real-life ads for things like jobs, housing, and credit that were included (sometimes by mistake) in Facebook’s publicly available archive of political ads. The ads were posted both before and after Facebook’s policy change.

“For pretty much all the types of credit ads that we’ve analyzed,” said Sara Kingsley, a Ph.D. student who led that research, “men tend to be a greater percentage” of the people shown the ad.

Housing and job ads, she said, went disproportionately to women. But individual ads can vary widely.

Things that Facebook may consider innocuous data points may correlate so strongly with age or gender or race that “the algorithm sees the circumstances of older black women, which are the result of systemic discrimination and inequity, and misinterprets those circumstances as preferences,” David Brody, an attorney at the Lawyers’ Committee for Civil Rights Under Law, said in an email.

“So the next time the algorithm encounters a user matching the qualities of an older black woman, it is going to impute those so-called preferences on to that user and in the process reinforce the pattern of systemic discrimination,” he said.
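Brody’s point can be made concrete with a toy example. The data and feature names below are invented; the sketch only demonstrates the mechanism he describes, in which a model that never sees a protected attribute still reproduces its skew through a correlated “innocuous” feature.

```python
# Toy illustration of proxy discrimination; all data is invented.
from collections import Counter

# Each record: (protected_attribute, proxy_feature, clicked_ad).
# The proxy ("liked a certain page", say) happens to correlate
# perfectly with the protected attribute in this tiny history.
history = [
    ("older_black_woman", "has_proxy_interest", True),
    ("older_black_woman", "has_proxy_interest", True),
    ("other",             "no_proxy_interest",  False),
    ("other",             "no_proxy_interest",  False),
]

# "Train" a one-feature model: observed click rate per proxy value.
clicks, totals = Counter(), Counter()
for _, proxy, clicked in history:
    totals[proxy] += 1
    clicks[proxy] += clicked

def predicted_interest(proxy):
    # The model only ever sees the proxy, never the protected attribute...
    return clicks[proxy] / totals[proxy]

# ...yet its predictions split exactly along the protected attribute,
# because the proxy correlates with it in the training data.
print(predicted_interest("has_proxy_interest"))  # 1.0
print(predicted_interest("no_proxy_interest"))   # 0.0
```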

What Has Facebook Done About It?

Facebook’s 2019 settlement with civil rights groups required it to implement new rules, such as having advertisers click a button to self-report housing, job, and credit ads. Facebook then removes certain demographic targeting choices from those ads’ menu of options.
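Mechanically, that fix works like a menu filter. The sketch below is a plausible reconstruction under stated assumptions, not Facebook’s code; the category names and option lists are invented.

```python
# Hypothetical reconstruction of the settlement's targeting restriction;
# not Facebook's actual implementation.

# Ads self-reported as housing, employment, or credit lose access
# to certain demographic targeting options.
SPECIAL_CATEGORIES = {"housing", "employment", "credit"}
RESTRICTED_OPTIONS = {"age", "gender", "multicultural_affinity", "zip_code"}

def available_targeting(ad_category, all_options):
    """Return the targeting menu an advertiser would see for this ad."""
    if ad_category in SPECIAL_CATEGORIES:
        return [opt for opt in all_options if opt not in RESTRICTED_OPTIONS]
    return list(all_options)

menu = ["age", "gender", "interests", "location_city",
        "multicultural_affinity"]
print(available_targeting("employment", menu))  # ['interests', 'location_city']
print(available_targeting("retail", menu))      # full menu
```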

And Facebook began requiring advertisers to accept an agreement promising not to discriminate. The company also promised to research bias in its algorithms. But researchers and civil rights activists say that they’re frustrated the company hasn’t released any findings.

Just a few weeks ago, Facebook created teams to study racial bias on its platforms.

One internal study reportedly found bias against users who had been categorized as matching the African American “multicultural affinity” category. Facebook made adjustments but shut down further research because that category was not supposed to represent race—data that Facebook didn’t gather from its users.

“They have chosen not to collect that data because they don’t want to turn on the lights and see how many cockroaches are in the room,” Brody said.

Wait, Isn't Discrimination Illegal?

California law bans “intentional” discrimination in places of “public accommodation,” including on websites. A pending lawsuit in federal court in Northern California argues that Facebook’s ads break that law because the algorithms’ outcomes treat people differently based on characteristics like age and gender. 

“Look, they wrote the algorithm,” Brody told The Markup. “No one knows how the algorithm works except them. They are responsible for everything the algorithm does.” (Facebook declined to comment on the lawsuit, but in court filings, it has denied breaking the law.)

Federal laws also ban discrimination, but federal law shields tech companies like Facebook from liability for the content on their platforms—specifically Section 230(c) of the Communications Decency Act, which in essence says websites aren’t responsible for things other people say on their sites.

Facebook’s legal response to the public accommodation lawsuit in California invokes Section 230—essentially saying the company can’t be sued over ads on its platform—a legal question that has not yet been settled. 

Senator Mark Warner, a Democrat, in a statement sent to The Markup, called Facebook’s argument in fighting the public accommodation lawsuit a “misuse of Section 230.” It’s “one of the most pressing examples of why we need to reform this antiquated law,” he said. “Internet exceptionalism rationales should not stand in the way of upholding longstanding principles of fairness and non-discrimination.”
