
Axon’s Ethics Board Resigned Over Taser-Armed Drones. Then the Company Bought a Military Drone Maker

The CEO's vision for Taser-equipped drones includes a fictitious scenario in which the technology averts a shooting at a daycare center

Caption: Photo collage illustration of an Axon drone with several pages of its patent application overlaid on top of it. Credit: Gabriel Hongsdusit

This article was co-published with WIRED.

Less than 10 days after the Robb Elementary School shooting in Uvalde, Texas, in late May 2022, Axon Enterprise CEO Rick Smith announced the company had formally started developing Taser-equipped drones. The technology, Smith argued, could potentially save lives during future mass shootings by incapacitating active shooters within seconds.

For Axon, which changed its name from Taser in 2017, the concept seemed a sensible next step for stakeholders who share the company’s public safety mission, Smith said on the company’s site.

“In brief,” he wrote, “non-lethal drones can be installed in schools and other venues and play the same role that sprinklers and other fire suppression tools do for firefighters: Preventing a catastrophic event, or at least mitigating its worst effects.”

Elsewhere, however, the announcement roused significant concern. Only a few weeks before Smith’s announcement, a majority of the members of Axon’s AI Ethics Board—which consisted of a dozen academics, attorneys, activists, and former law enforcement officials—recommended the company not move forward with a pilot study of Taser-armed drones, then called Project ION. The board had spent more than a year considering the Taser-equipped drone project, but had never considered any use case in which it would be a solution to mass shootings. 

Advisory board members told The Markup that Smith’s announcement was unexpected and made without consultation or input from the ethics body the company had worked relatively well with for the previous four years. In the past, the board’s work prompted Axon to ban facial recognition on its body cameras, out of concern the technology could not be responsibly rolled out.

“I begged Rick not to go public with the weaponized drone plan without consulting with the board, as our operating principles required,” Barry Friedman, founder of NYU Law School’s Policing Project and former Axon ethics advisory board chair, told The Markup. 

Within a week of the announcement, nine members of Axon’s 12-person ethics board resigned, saying in a joint letter that they had “lost faith in Axon’s ability to be a responsible partner.”

“Although we all joined this Board understanding that we are advisory only—and have seen Axon reject our advice on some prior occasions—rushing ahead to embrace use of surveillance-enabled, Taser-equipped drones, especially when its Board was urging against unnecessarily precipitate action, is more than any of us can abide,” the exiting members wrote then.

Caption: An illustration of a Taser-equipped drone, taken from Axon’s patent application. Credit: Axon

In the wake of the board’s dissolution, Axon halted its Taser-drone program temporarily. Former board members, meanwhile, continued to speak out against the company’s efforts. The group released a report in January 2023 criticizing company leaders for “trading on the tragic shootings which had just occurred in Uvalde and Buffalo.” The report included a number of recommendations about Taser-drone technology, including the need for accuracy and safety thresholds, as well as local lawmakers’ approval and internal department policies governing the drones’ use. Drones deploying force should never be autonomous, either, the ex-board members recommended—a human should make that decision.  

The former members also noted that in addition to the physical risk of injury or death posed by an electroshock weapon, the proposed devices would rely on surveillance systems that would be triggered by the sound of gunshots, posing privacy and accuracy risks. Additional surveillance in schools might also lead to increased disciplinary action, even for minor offenses, they said. There’s potential for disparate, racist impact here, too, the former members pointed out, saying “Black students are four times more likely to attend a school with a high level of surveillance.” 

Weaponized drones are also vulnerable to misuse and might increase how frequently force is used, too, the experts said in their report. “A growing literature on military use of drones notes the unique characteristics of remote use of force—humans appear as figures on a computer screen, and decisions to use force often are made by teams rather than by a single individual,” the experts wrote. This “could lead to dehumanization of individuals targeted by the drone and could diminish operators’ sense of personal moral culpability for their decisions, leading to increased use of force.”

However, Axon still appears to be moving forward with its armed drone plan.

“On a longer time horizon, Axon sees opportunities to explore how robotics can expand to include less-lethal robotic payloads and operations,” company leadership said in a statement on its website from April of this year. “While this is still in early concept, we believe with ample research, ethical development and identifying the most amenable use cases, this capability can positively contribute to the future of public safety.”

Then, in July, the company acquired Sky-Hero, a Belgium-based maker of drones and unmanned ground vehicles. Sky-Hero has already developed so-called “distraction” technology for some of its drones and rovers that produces the same sound pressure levels as a semiautomatic rifle, acting as a “true non-lethal flashbang,” according to the description of a YouTube video demonstrating the tech.


In interviews with The Markup, former ethics advisory board members expressed concern about the company’s plans to continue developing weaponized drone technology. 

“Vendors like Axon are selling products to public agencies for the benefit of the public, and I think they have a responsibility to consider the harms their products might cause and to try to mitigate those harms,” said Max Isaacs, a senior staff attorney at the Policing Project who worked with the board. 

“Everyone deserves public safety,” Isaacs added. “Everyone wants more public safety. That’s not the question. The question is, when companies sell these products claiming all of these public safety benefits, have they proven those benefits? Has there been any independent testing? Do we know that these products are making us safer? Oftentimes the answer is no.”

Isaacs and others The Markup interviewed noted that ethics boards are an imperfect patch for the regulatory void around evolving technology like drones and artificial intelligence. 

“The fact that we’re relying on companies to set up these advisory boards in order to address the harms of their products is itself a very problematic concept to me,” Isaacs said. 

Though Axon leadership says it is committed to the “responsible” development of new technology, it’s not clear whether the company is still consulting with ethics experts on the plan. 


The Markup found that mentions of the former AI ethics board appear to have been removed from the company’s website, including the board’s once-public recommendations and reports. The webpage axon.com/ethics, where the former board’s governing principles and work were once hosted, now redirects to a letter from Smith announcing the company would pause its Taser-drone plan. In September 2022, the company unveiled its Ethics and Equity Advisory Council, a panel of academics and community leaders who advise Axon on “a limited number of products per year,” according to the company. Axon says the body is independent, but unlike the former ethics board, it is led by an Axon executive vice president.

The company declined to make members of this group available for interviews. Public reports from the body are not available on its website, nor is a copy of its operating principles.

Because ethics is both subjective and not legally binding, it can readily be trumped by capitalist imperatives, said Ryan Calo, a professor at the University of Washington School of Law and a former member of the ethics board at Axon. 

“Ethics is important,” said Calo. “Ethics is good. But law needs to be a backstop. Law is the place where society goes to decide what’s forbidden and what’s required.”  


The End of Killing?

The Arizona-based company has not shipped any Taser-equipped drones to any client, it said in a 2022-2023 annual report, the most recent available. It also pledged to never build “lethal” drones. 

Axon, which launched in 1993, is a premier player in the law enforcement and military technology space. Technology developed by the company, which makes Taser devices and body-worn cameras, is already used by more than 95 percent of state and local law enforcement agencies in the United States, it claims in investor reports. Axon also owns the evidence.com platform, a cloud-based evidence management system for police officers and what the company calls “the world’s leading repository of law enforcement data.” Regarding Taser drones, Axon maintains they should be developed. “We believe there is no organization in the world better suited to develop it the right way,” its most-recent investor report reads.

Company leadership expects to generate more than $1.5 billion in revenue in 2023, it said in an August investor statement, and Rick Smith has set a target of reaching $2 billion by 2025. According to its own reports, Axon, which went public in 2001, has generated “over $15 billion in wealth” for its shareholders.

Still, the plan to arm drones with Tasers was not universally well-received by Axon’s shareholders, some of whom criticized the company for Smith’s announcement about the weaponized drones. A shareholder proposal submitted by the Jubitz Family Foundation, a Portland, Oregon-based foundation that promotes nonviolent alternatives to conflict, encouraged shareholders to vote to discontinue developing these drones. 

“Axon proposed using AI surveillance, algorithmic predictors, and virtual reality simulations to stop mass shootings,” the proposal, which was included as part of the company’s 2022-2023 annual report, reads. “Axon did not seek meaningful input from its in-house Community Advisory Coalition, AI Ethics Board, or Vice President of Community Impact prior to the announcement.”

After the ethics board’s resignations last year, “Axon has now replaced both the Community Advisory Coalition and the AI Ethics Board with a new advisory council, which Smith still does not commit to heeding,” the foundation added in its proposal. 

“The rollout of this proposal demonstrates a tremendous failure of management’s self-governance procedures,” the foundation wrote, adding that the plan risked not only psychological and physical harm to children but also litigation and reputational damage.

The Jubitz Family Foundation did not respond to a request for comment on the proposal from The Markup.

In a lengthy response to the Jubitz proposal, Axon said robotic security could save lives, slashing gun-related deaths by giving police longer-range, remotely operable weapons.

“Axon is working to reduce violence and displace lethal uses of force with less-lethal alternatives that can save—rather than take—lives,” the company said.

“Based on our analysis of the Washington Post’s dataset of fatal officer involved shootings, we estimate that a more effective, longer range handheld TASER device has the potential to reduce fatal officer involved shootings by around 40%,” the company said. “When we run this same analysis looking at instances where police could have utilized a less-lethal capable drone, we estimate that a drone could likely have been used instead of lethal force in 57% of these fatal shootings. When we combine an advanced handheld TASER device together with remotely operated drone and robotic capabilities, we estimate that up to 72% of fatal shootings might be averted.” (The company did not share information about its analysis in response to a question about it.)
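Axon did not explain how it arrived at those figures, so as a purely hypothetical illustration, the sketch below shows only the kind of share calculation such claims imply: label each fatal shooting in a dataset as “avertable” under a given criterion and report the flagged fraction. The records, field names, and labels here are invented for illustration and are not drawn from Axon’s analysis or from the Washington Post data.

```python
# Illustrative only: Axon has not shared its methodology. These records and
# "avertable" flags are made up to show the shape of a share calculation,
# not to reproduce any real finding.
records = [
    {"taser_avertable": True,  "drone_avertable": True},
    {"taser_avertable": False, "drone_avertable": True},
    {"taser_avertable": False, "drone_avertable": False},
    {"taser_avertable": True,  "drone_avertable": False},
]

def share(flagged: int, total: int) -> float:
    """Percentage of incidents a given criterion marks as avertable."""
    return 100.0 * flagged / total

total = len(records)
taser_pct = share(sum(r["taser_avertable"] for r in records), total)
drone_pct = share(sum(r["drone_avertable"] for r in records), total)
# "Combined" is treated here as the union of the two criteria; whether Axon
# combined its categories this way is not known.
combined_pct = share(
    sum(r["taser_avertable"] or r["drone_avertable"] for r in records), total
)
print(f"Taser: {taser_pct:.0f}%  Drone: {drone_pct:.0f}%  Combined: {combined_pct:.0f}%")
```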

While Axon technology is used by major police departments and federal agencies including the New York Police Department, the Los Angeles Police Department, the U.S. Department of Homeland Security, and the Departments of Defense and Justice, according to the company, there isn’t proof that the products are solving the problem of police violence. According to the Washington Post database of fatal police shootings, the number of such shootings was higher in 2022 than it was in any of the previous seven years tracked. And recently, some police unions have argued they should be paid more just to use body cameras, a barrier to critical transparency even where these tools are available.  

Closeup photograph of a black Axon body camera worn by a police officer. The badge next to it says “LOS ANGELES POLICE.”
Caption: A Los Angeles police officer wears an AXON body camera. Credit:David McNew/Getty Images

Axon did not respond directly to questions about whether its new Ethics and Equity Advisory Council is advising it on the weaponized drone program, or whether the company has received any demand from school districts for these products.

However, in an emailed statement to The Markup, Axon spokesperson Alex Engel wrote: “Specifically regarding your timing question about bringing a TASER-equipped drone to market: We are learning that the interest in this concept is highest in areas of public safety where it could be most useful in saving the lives of officers and the public. But, we are still in the early stages and do not have any product for purchase. … We continue to explore this concept in our R&D pipeline with a focus on identifying circumstances in which a de-escalation tool like this would be most valuable to help reduce harm.”

“Above all, we recognize there needs to be substantial work to ensure the right ethical measures are in place around any such technology,” he wrote. 

Some further insight into CEO Smith’s vision for this kind of drone technology can be gleaned from his 2019 book, The End of Killing.

In a chapter of the book on school safety, Smith presents readers with a fictitious scenario about a daycare center shooting, a tragedy averted because of a Taser-equipped drone installed in the room and activated by an “AI algorithm … designed to constantly monitor for potential firearm discharge sounds, not all that different from the iPhones of millions of people around the world that are awakened by the ‘Hey Siri’ sound pattern,” Smith writes.

An algorithm, Smith writes, would subsequently calculate the direction of the sound and, combined with a panic alert signal system, trigger the drop of a small drone within a second. A “computer vision algorithm” on the drone would detect muzzle flashes. If the system sensed a probable weapon, he added, police could remotely deploy the Taser-armed drone, shooting electrical impulses “designed to paralyze the human nervous system” toward the shooter. All of this could take place within two seconds of gunfire.
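Smith’s scenario amounts to a sensing-and-decision pipeline: an always-listening audio classifier, a direction estimate, a drone dispatch, a muzzle-flash check, and a remote human deciding whether force is used, as the former ethics board insisted. The sketch below is purely illustrative and is not Axon’s design; every function, threshold, and data shape is an assumption made only to show the sequence of steps.

```python
# Purely illustrative sketch of the pipeline Smith describes in his book.
# This is NOT Axon's design; all names, thresholds, and stub logic are
# assumptions. The final decision to fire rests with a human operator,
# in line with the former ethics board's recommendation.
from dataclasses import dataclass

@dataclass
class AudioEvent:
    # Simplified stand-in for a buffered microphone frame.
    sound_pressure_db: float
    bearing_degrees: float  # estimated direction of arrival

def sounds_like_gunshot(event: AudioEvent, threshold_db: float = 140.0) -> bool:
    """Stub acoustic classifier. Real systems use trained audio models,
    analogous to wake-word detection, not a single loudness threshold."""
    return event.sound_pressure_db >= threshold_db

def camera_sees_muzzle_flash(frame: dict) -> bool:
    """Stub for the 'computer vision algorithm' Smith describes,
    which would run on the drone's camera feed."""
    return bool(frame.get("muzzle_flash", False))

def handle_audio_event(event: AudioEvent, frame: dict, operator_approves) -> str:
    """Walk through the scenario's steps, ending with a human decision."""
    if not sounds_like_gunshot(event):
        return "no action"
    # Step described in the book: dispatch a small drone toward the sound.
    heading = event.bearing_degrees
    if not camera_sees_muzzle_flash(frame):
        return f"drone dispatched toward {heading:.0f} degrees, no weapon confirmed"
    # Final step: a remote human operator decides whether to fire the Taser.
    if operator_approves():
        return "operator authorized Taser deployment"
    return "operator declined to deploy"

if __name__ == "__main__":
    event = AudioEvent(sound_pressure_db=155.0, bearing_degrees=42.0)
    frame = {"muzzle_flash": True}
    print(handle_audio_event(event, frame, operator_approves=lambda: False))
```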

Update, September 12, 2023

This article has been updated to include an emailed statement from Axon.
