The Breakdown

Police Are Looking to Algorithms to Predict Domestic Violence

But will they be effective or reinforce bias-based practices?

Domestic abuse is a widespread problem in the United States and around the world. Violence at the hands of an intimate partner has affected more than 600 million women globally, according to World Health Organization estimates, and the problem has only grown during the pandemic.

Law enforcement officers have turned to various tools, from simple questionnaires to algorithms, as a way to prioritize the highest-risk crimes. While some research has recognized the potential benefits of the tools, it has also left experts in the domestic violence community with questions about the ethics and efficacy of relying on technology to predict future violent acts.

Matthew Bland, an associate professor of evidence-based policing at the University of Cambridge, said there is broad acknowledgment that something needs to be done to improve services for domestic violence victims, but that how, or whether, to use technology as a solution is up for debate.

“We’re still quite polarized, I think, as a domestic abuse community, on the right way forward,” he said. 

Range of Techniques

Some tools used by police are effectively just paper questionnaires. In the United Kingdom, police use a relatively simple tool called DASH, short for “Domestic Abuse, Stalking and Harassment, and Honor-Based Violence.” After an incident, police question victims and add up the number of “yes” responses to produce a risk classification that guides their response.
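
To make the mechanics concrete, here is a minimal sketch of a tally-and-threshold checklist of this kind. The questions, cutoffs, and band names below are hypothetical placeholders, not the actual DASH form.

```python
# A minimal sketch of a tally-and-threshold checklist in the style of DASH.
# The questions, thresholds, and band names are hypothetical placeholders,
# not the official DASH form.

QUESTIONS = [
    "Has the abuse become more frequent or more severe?",
    "Has the aggressor ever used or threatened to use a weapon?",
    "Is the victim very frightened of further harm?",
]

def classify(yes_count: int) -> str:
    """Map a count of 'yes' answers to a risk band (thresholds assumed)."""
    if yes_count >= 3:
        return "high"
    if yes_count >= 2:
        return "medium"
    return "standard"

answers = [True, False, True]   # one boolean per question
print(classify(sum(answers)))   # -> medium
```

The appeal of such tools is transparency: an officer or a reviewer can see exactly why a case landed in a given band.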

Although the idea has gained the most traction in Europe, some police forces in the United States also use a basic form of risk assessment similar to DASH. 

Other systems are relatively advanced. The government of Spain launched an ambitious project in 2007 to battle domestic violence through a system called VioGén. Its goal was to build a centralized system for domestic violence cases that could also predict future incidents. 

VioGén is powered by an algorithm that researchers developed by identifying which factors in an incident have been linked to high-risk cases in the past. Police log details of a case, such as whether the aggressor has made death threats or uses drugs, and VioGén calculates a risk score from those inputs.

VioGén has since performed millions of “risk evaluations.” The scale rates risk from lowest to highest and guides how police respond, including whether to pursue charges or provide a victim with police protection.
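
In spirit, that is a weighted checklist rather than a simple tally: each factor contributes differently to the score, and the score maps to a level that drives the police response. The sketch below illustrates the idea; the factors, weights, and cutoffs are invented and are not the real VioGén parameters.

```python
# A hypothetical sketch of a weighted risk score of the kind described for
# VioGén. The factors, weights, and cutoffs here are invented for
# illustration; they are not the real VioGén parameters.

FACTOR_WEIGHTS = {
    "death_threats":   3,  # assumed weight
    "drug_use":        2,  # assumed weight
    "prior_incidents": 2,  # assumed weight
}

# Score thresholds mapped to risk levels, lowest to highest (assumed).
RISK_LEVELS = [(0, "low"), (3, "medium"), (5, "high"), (7, "extreme")]

def risk_level(case: dict) -> str:
    """Sum the weights of the factors present and map the score to a level."""
    score = sum(w for factor, w in FACTOR_WEIGHTS.items() if case.get(factor))
    level = RISK_LEVELS[0][1]
    for threshold, name in RISK_LEVELS:
        if score >= threshold:
            level = name
    return level

print(risk_level({"death_threats": True, "drug_use": True}))  # -> high
```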

Today, VioGén is likely the most advanced predictive domestic violence tool. According to a report from Eticas Foundation, a nonprofit tech advocacy group that studied the tool, there were more than 670,000 cases in the system at the beginning of 2022. 

Effectiveness and Ethics

Are the tools effective at preventing domestic violence?

“That’s kind of the gigantic elephant in the room, not only in Spain but with all risk assessment tools,” said Juan Jose Medina Ariza, a researcher in crime sciences at the University of Seville. “We don’t really know” whether putting these tools in the hands of police improves their response to domestic violence, he said.

Researchers have found that some relatively simple tools like DASH are disappointing. One 2019 study by Medina Ariza and colleagues found that the system was “underperforming” and was “at best, weakly predictive of revictimization.”

The published research on VioGén has been relatively positive, Medina Ariza said, but it has been criticized because the evaluations were conducted by researchers who work directly on the tool with the Spanish government.

Eticas CEO Gemma Galdon said there needs to be more transparency from the Spanish Ministry of the Interior, which developed the system. Police have leeway to override the algorithm and manually raise a case’s risk level, but officers followed the algorithm 95 percent of the time, according to the Eticas report, which relied on the limited data available about the system.

Without independent third-party audits, Galdon said, the public can’t truly be assured that tools like VioGén are effective and resources are reaching the people they’re meant to help.

“When a woman with a low risk score is killed, the ministry cannot say, with confidence, ‘This is an anecdote, and the system works,’ ” Galdon said. “That is very, very, very concerning.” 

The Spanish Ministry of the Interior did not respond to a request for comment.

More Options, More Controversy 

Some officials and researchers have suggested using more data-intensive techniques. One controversial idea: machine learning.

VioGén’s decisions are based on factors that researchers determined in advance to be linked to violence, such as whether the aggressor has had suicidal thoughts.

But a machine learning tool can draw its own conclusions about risk. Such a system could read through police data on crimes and decide autonomously which cases are the highest risk, based on factors like prior arrests and convictions. The system could even decide that cases from certain zip codes are higher risk because it sees more reports of abuse from those neighborhoods.
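
A minimal sketch of that kind of pipeline, assuming an invented table of historical case records (the column names, values, and “escalated” label are all hypothetical):

```python
# Sketch of the machine-learning approach described above: a classifier
# trained on historical case records learns its own risk signal.
# The data and column names are invented for illustration.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

cases = pd.DataFrame({
    "prior_arrests":     [0, 3, 1, 5, 0, 2, 4, 1],
    "prior_convictions": [0, 1, 0, 2, 0, 1, 3, 0],
    "area_reports":      [2, 9, 4, 12, 1, 7, 10, 3],  # abuse reports nearby
    "escalated":         [0, 1, 0, 1, 0, 1, 1, 0],    # did the case escalate?
})

X, y = cases.drop(columns="escalated"), cases["escalated"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y
)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Rank held-out cases by predicted probability of escalation.
print(model.predict_proba(X_test)[:, 1])
```

Note that no researcher decides in advance which columns matter; the model infers that from whatever patterns, benign or biased, exist in the training data.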

Multiple researchers have found that they could improve on the predictions of simple risk assessments by using such techniques. But Medina Ariza, who also published a paper finding that a machine learning model could improve on the predictive power of the United Kingdom’s DASH tool, said that using machine learning in domestic violence cases remains ethically controversial.

The technique relies on past data to make predictions about the future, raising the concern that it will reinforce past prejudices, like a focus on one racial group. If a machine learning algorithm is trained on arrest data, for example, it may overpredict abuse in groups that police disproportionately arrest.
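
A toy simulation makes the concern concrete. In the sketch below, every number is invented: two groups have an identical underlying rate of abuse, but one group is arrested, and therefore labeled, twice as often, so a model trained on the arrest labels assigns it roughly double the predicted risk.

```python
# Toy simulation of label bias: identical true abuse rates, unequal arrest
# rates, and a model trained on arrests. All numbers are invented.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)        # two demographic groups, 0 and 1
true_abuse = rng.random(n) < 0.10    # identical 10% base rate in both

# Arrests (the training label) are recorded twice as often for group 1.
arrest_prob = np.where(group == 1, 0.8, 0.4)
arrested = true_abuse & (rng.random(n) < arrest_prob)

model = LogisticRegression().fit(group.reshape(-1, 1), arrested)
p0, p1 = model.predict_proba([[0], [1]])[:, 1]
print(f"predicted risk, group 0: {p0:.3f}")  # roughly 0.04
print(f"predicted risk, group 1: {p1:.3f}")  # roughly 0.08, about double
```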

“Our fear is that we are substituting really faulty and discriminatory human systems with even worse and more opaque technical systems,” Galdon said.

Still, the idea of using machine learning to sort cases is being toyed with. Last year, for example, police in Queensland, Australia, announced that they would pilot the use of a machine learning program trained on police data to predict the highest-risk domestic violence offenders. 

According to The Guardian, police officials said officers would use the tool to predict which cases would escalate and would be “proactively knocking on doors without any call for service.” Matt Adams, a spokesperson for the Queensland Police Service, told The Markup that the trial has been delayed by COVID but that the police are moving ahead with the plan.

Medina Ariza said that, at the very least, researchers have shown that big data techniques have been better able to predict domestic abuse than the simplest risk assessments. 

“The question then becomes one of, is it O.K. to use a machine learning model, even with all of the debates that are going on about algorithmic fairness?” he said. “I think that that’s still very much an open question.”
