
Locked Out

Can Algorithms Violate Fair Housing Laws?

Landlords increasingly use screening services to weed out renters. Advocates say both landlords and the algorithms should be accountable when things go wrong

Photo: public housing in Manhattan, New York City. Credit: Buśa Photography/Getty Images

Update: After publication, several housing advocacy groups filed court cases challenging HUD’s controversial disparate-impact rule change, and in November, a federal judge put the change on pause until the cases were resolved. Then, on Jan. 26, President Biden issued an executive order calling on HUD to “examine the effects” of the Trump administration’s proposed rule change on HUD’s duty to prevent practices that result in housing discrimination. Experts expect that this order will lead HUD to abandon the rule-change attempt and restore the disparate-impact rule to its previous strength.

When Carmen Arroyo asked her apartment’s management company in 2016 if her son, Mikhail, could move in with her after a bad accident left him unable to care for himself, her request was denied. A tenant-screening background check had dredged up a minor (and, given his current circumstances, irrelevant) shoplifting charge from Mikhail’s past.

This past month, a federal district court judge in Connecticut agreed to let Arroyo’s lawsuit against the screening company, CoreLogic, go to trial in what experts believe is the first case of its kind to target a screening company, rather than a landlord, for housing discrimination. The decision was a victory for fair housing advocates who have argued that tenant-screening services are error-prone, result in racial discrimination, and are largely unaccountable. But even as the case proceeds, the Trump administration is looking to make it more difficult to bring similar lawsuits in the future.

The Department of Housing and Urban Development (HUD) finalized a change this month to rules governing how people make housing discrimination complaints to the agency, and the rule is scheduled to be entered into the Federal Register Thursday. It raises the bar for people proving that they’ve been discriminated against, and gives housing providers—whether landlords, realtors, developers, insurers, or lenders—more ways to get those claims thrown out. For instance, critics say, the rule change effectively immunizes people and companies from discrimination charges if they use “profit” as a reason for their decision-making, or if they use third-party systems to choose tenants—as was the case in Arroyo’s rejected application for her son.


The change, in draft form, provoked a major controversy last year, flooding HUD with over 45,000 public comments. Advocates of both fair housing policies and algorithmic accountability were vocal in their dissent. Even mortgage lenders and realtors eventually distanced themselves from HUD’s proposal—some of them invoking this summer’s national reckoning over systemic racism in America.

HUD’s general counsel, Paul Compton, told reporters last year that the rule change “frees up parties to innovate, and to take risks to meet the needs of their customers, without the fear that their efforts will be second-guessed through statistics years down the line.”

HUD says it responded to subsequent public concerns by dropping some controversial language. Previously the proposed rule had said that if a housing provider used an “algorithm” that they had no control over to help them make a decision, then they couldn’t be held responsible for possible discrimination that resulted. Now, instead of “algorithm,” the rule refers to “predictive models,” which housing attorneys and advocates say is an even broader term. 

“There was a serious problem with what they proposed, and there is an even greater problem with what they replaced it with,” said Sara Pratt, a private attorney who previously served as the deputy assistant secretary of HUD’s Office of Fair Housing in the Obama administration.

Private landlords and even public housing authorities are increasingly relying on algorithms to help them screen and score applicants. A joint investigation by The Markup and The New York Times this year found that 90 percent of landlords now rely on tenant-screening reports to make renting decisions; many of these reports are generated automatically in seconds by matching algorithms prone to errors and mismatches.

But while those same landlords are subject to fair housing laws that bar them from discriminating on the basis of an applicant’s race, age, or gender, it remains unsettled whether screening services are subject to those same laws.

Arroyo’s case could provide clarity, said her attorney, Salmun Kazerounian, of the Connecticut Fair Housing Center.

“Tenant-screening companies need to clean up their products, and take a serious look at the outcomes that their products are generating, in order to avoid exposing themselves to potentially considerable liability,” he said.

Arroyo and her attorneys argued that the screening algorithm her landlord used, “CrimSAFE” by CoreLogic, disproportionately screens out Black and Latino applicants by relying on criminal records, and that it doesn’t give applicants the chance to explain their mitigating circumstances through more detailed, individualized assessments. They argued that CrimSAFE reported a “disqualifying” record without providing any details about it that would have allowed the property manager to make his own decision. (The screening report simply states there was a “criminal court action” found.)

CoreLogic argued in its defense, among other things, that it was not subject to the Fair Housing Act because its tool doesn’t make housing decisions—the landlords using the tool do. 

Last month, Federal District Judge Vanessa Bryant shot down that argument. She pointed out that CoreLogic did market CrimSAFE as a decision-making product, and that it also gave landlords the option of hiding the details behind those decisions in order to simplify the process. She also cited a 2016 HUD guidance letter, which told housing providers that they may open themselves up to housing discrimination complaints if they deny applicants merely because of previous arrests (rather than convictions), since minorities are disproportionately likely to be arrested in the U.S.

Attorneys for CoreLogic did not respond to requests for comment.

HUD’s rule, on the other hand, deals with whether landlords who use tools like CoreLogic’s to choose whom to rent to can thereby immunize themselves from fair housing complaints. The rule, which is scheduled to take effect in 30 days, could itself face legal challenges.

“This rule is so unsupported in the law, and it’s so different from judicial precedent,” said Pratt, the former HUD attorney.

In 2015, the U.S. Supreme Court ruled that if a business practice—like using a tenant-screening tool—produces disparate outcomes for people of different races, genders, or ages, then that business can be subject to a fair housing claim. That’s regardless of whether the landlord or tool intended to discriminate.

HUD’s new rule, however, appears to say the opposite. 

“Essentially it says, if a policy is predictive, and it is generally not biased in its predictive functions, then it doesn’t matter if it has a discriminatory outcome,” said Morgan Williams, general counsel at the National Fair Housing Alliance, based on his initial reading of the text.

Asked about criticism that the rule change would disadvantage minority renters and borrowers, HUD spokesperson Matt Schuck responded in a statement that the rule change does not conflict with the Supreme Court’s decision. 

“This action brings legal clarity for banks and underwriters, and that clarity will stimulate mortgage credit and affordable housing for low-income and minority populations,” he wrote. 

