

The Markup’s Mortgage Industry Investigation Cited in Support of Algorithmic Accountability Bill

The measure would require companies to test the algorithms they use for bias and discrimination

Photo: Sen. Ron Wyden speaks before the Capitol building in Washington, D.C. Credit: Jemal Countess/Getty Images for SEIU

The authors of a bill introduced in the Senate and House on Thursday to increase oversight of automated decision-making systems cited The Markup’s investigation into racial disparities in mortgage denials in their announcement of the measure.

“Automated systems are increasingly making critical decisions about Americans’ health, finances, housing, educational opportunities and more—potentially exposing the public to major new risks from flawed or biased algorithms,” reads a summary of the Algorithmic Accountability Act of 2022 released by its main sponsor, Sen. Ron Wyden (D-OR). “The Markup investigated poorly designed mortgage-approval algorithms that inexplicably denied loans to applicants who had just previously been approved. These harms could have been mitigated if companies had appropriately assessed the impacts of applying automation to these critical decisions.”

The Markup’s analysis of public mortgage data, co-published last year with the Associated Press, found that people of color who applied for home loans in 2019 were 40 to 80 percent more likely to be denied than White applicants with similar financial characteristics. The disparity was consistent nationwide, and in some cities it exceeded 250 percent.

Experts interviewed for the investigation said some of the problems were due to opaque algorithms that guide the mortgage-approval process and use data that affects different groups differently.

For example, the standard method of credit scoring for mortgages, required by quasi-governmental agencies Fannie Mae and Freddie Mac, uses an outdated formula that can disadvantage people of color. The calculations of Fannie’s and Freddie’s automated underwriting software also depend on financial factors that are anything but color-blind. Experts say both processes can contribute to the further entrenchment of centuries-old racial biases and inequities.

The inner workings of automated underwriting software are also closely guarded, even from the loan officers who use it. According to The Markup’s sources, not even the federal agencies tasked with regulating the industry know exactly how the software works.

Mortgage industry groups criticized The Markup’s investigation at the time because the analysis of mortgage denials did not include applicants’ credit scores, which are not publicly available, and because it examined only conventional loans, not government-insured loans.

After the investigation was published last year, our findings were cited by the Consumer Financial Protection Bureau, the U.S. Department of Justice, and the Office of the Comptroller of the Currency in announcing a new initiative to fight discriminatory mortgage lending practices. In response to the story, Minnesota’s attorney general also warned that lenders whose algorithms discriminate “should not be surprised” if they are investigated for violating the law.

The new bill, which was co-sponsored by Sen. Cory Booker (D-NJ) and Rep. Yvette Clarke (D-NY), would require companies to assess the algorithms they use, and the data used to train them, “for impacts on accuracy, fairness, bias, discrimination, privacy and security” and then fix whatever problems they might find. It would apply to any algorithms used to make “critical decisions” about people’s lives—including their education, employment, family planning, health care, legal services, housing, and financial services. It would give the Federal Trade Commission more authority and more resources to oversee the process.

The bill is an update to the lawmakers’ previous attempt to pass similar legislation in the Senate in 2019. They said they consulted dozens of experts and advocacy groups to improve it. The new version was endorsed by Color of Change, Consumer Reports, the Electronic Privacy Information Center (EPIC), and the Institute of Electrical and Electronics Engineers (IEEE), among others.

As The Markup recently reported, many efforts to regulate governments’ use of algorithms at the local level have failed to pass. The government agencies and contractors that oppose such bills not only fight to keep the details of proprietary software private but also rebuff attempts by would-be regulators to learn what software is even in use.
