
Locked Out

The Markup’s Work Cited in Effort to Outlaw Discriminatory Algorithms

Investigation on tenant screening used to illustrate needed reforms

Photo: Washington, D.C., attorney general Karl Racine. Credit: Bill Clark/CQ-Roll Call, Inc. via Getty Images

A bill introduced in Washington, D.C., last week takes aim at a wide swath of discriminatory algorithms, and its author cited The Markup’s Locked Out series to explain why reforms are needed.

The proposal seeks to hold companies and organizations that do business in the nation’s capital responsible for algorithms they use to make decisions on housing, education, employment, and “public accommodations and services” such as insurance, health care, and credit. If the Council of the District of Columbia passes the bill, each violation would carry a fine of up to $10,000.

“This so-called artificial intelligence is the engine of algorithms that are, in fact, far less smart than they are portrayed and more discriminatory and unfair than big data wants you to know,” Karl Racine, D.C.’s attorney general, said in a news release. “Our legislation would end the myth of the intrinsic egalitarian nature of [artificial intelligence].”

Last year, The Markup and The New York Times found that tenant background checks are widely used and often rife with inaccuracies, which can bar qualified people from obtaining housing. 

People with common names are particularly at risk of receiving incorrect reports; Latinos, for example, often draw from a smaller pool of unique last names. Common names, as well as hyphenated last names, can trigger a false match if the screening company relies solely on an algorithmic name match and doesn’t use other information to corroborate someone’s identity.
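To see why name-only matching fails, consider a minimal sketch in Python (the records, names, and fields below are hypothetical, not drawn from any screening company’s actual system): two different renters share a name, and a matcher that checks the name alone attaches one person’s eviction record to the other, while corroborating with a second identifier, such as date of birth, keeps them apart.

    # Hypothetical records: two distinct people who happen to share a name.
    RECORDS = [
        {"name": "Maria Garcia", "dob": "1985-03-02", "eviction": True},
        {"name": "Maria Garcia", "dob": "1992-11-17", "eviction": False},
    ]

    def name_only_match(name, records):
        """Return every record whose name matches -- no corroboration."""
        return [r for r in records if r["name"] == name]

    def corroborated_match(name, dob, records):
        """Require a second identifier (here, date of birth) to confirm identity."""
        return [r for r in records if r["name"] == name and r["dob"] == dob]

    # Name-only matching sweeps in a stranger's eviction record:
    print(name_only_match("Maria Garcia", RECORDS))  # returns both records
    # Corroborating with a date of birth isolates the right person:
    print(corroborated_match("Maria Garcia", "1992-11-17", RECORDS))  # one record

The more common the name, the more records the first function sweeps in, which is why matching on a name alone disproportionately harms people who share surnames with many others.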

At the time, tenant screening companies said renters rarely dispute reports, and the Consumer Data Industry Association, a trade group that represents background check and consumer reporting companies, said systemic issues don’t exist. The group declined to comment on the bill.

Last month, the Consumer Financial Protection Bureau issued an advisory against such “name only” matching to curb errors. And in October, the chair of the Senate Committee on Banking, Housing, and Urban Affairs sent a letter to the head of the CFPB calling for a review of the tenant screening industry, citing The Markup and The New York Times’ investigation.

The Washington, D.C., bill, which was introduced by D.C. Council chair Phil Mendelson on Racine’s behalf, is aimed at companies whose businesses focus on data and make decisions with algorithms.


If it’s approved, companies would be required to audit their algorithms annually for potential discrimination and submit a report to Racine’s office, which would also handle enforcement.

“You can’t just plead ignorance,” said Laura Moy, director of Georgetown Law’s Communications and Technology Law Clinic, which helped research and draft the bill. “You have to go out and proactively find out whether or not there is discrimination and then you have to tell us what you’re doing to address it.”

Companies would also be required to tell consumers how they use people’s personal information to make decisions with algorithms and where they got that information. D.C. residents would be able to challenge inaccuracies, and the corrected information would be used to regenerate the report or algorithmic decision.
