Illustration: The next generation of weapons in military arsenals could be "killer robots," machines that would select and destroy specific targets without further human intervention. But if a robot broke the law, who would be held responsible? Would the programmer, manufacturer, military commander, or robot end up in court? © 2015 Russell Christian for Human Rights Watch

(Geneva) – Programmers, manufacturers, and military personnel could all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or “killer robots,” Human Rights Watch said in a report released today. The report was issued in advance of a multilateral meeting on the weapons at the United Nations in Geneva.

The 38-page report, “Mind the Gap: The Lack of Accountability for Killer Robots,” details significant hurdles to assigning personal accountability for the actions of fully autonomous weapons under both criminal and civil law. It also elaborates on the consequences of failing to assign legal responsibility. The report is jointly published by Human Rights Watch and Harvard Law School’s International Human Rights Clinic.

“No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,” said Bonnie Docherty, senior Arms Division researcher at Human Rights Watch and the report’s lead author. “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”

Fully autonomous weapons would go a step beyond existing remote-controlled drones as they would be able to select and engage targets without meaningful human control. Although they do not exist yet, the rapid movement of technology in that direction has attracted international attention and concern.

Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots and serves as its coordinator. The international coalition of more than 50 nongovernmental organizations calls for a preemptive ban on the development, production, and use of fully autonomous weapons.

The report will be distributed at a major international meeting on “lethal autonomous weapons systems” at the UN in Geneva from April 13 to 17, 2015. Many of the 120 countries that have joined the Convention on Conventional Weapons are expected to attend the meeting of experts on the subject, which will continue deliberations that started at an initial experts meeting in May 2014.

Human Rights Watch believes that the agreement to take up these weapons in the Convention on Conventional Weapons forum could eventually lead to new international law prohibiting fully autonomous weapons. The Convention on Conventional Weapons preemptively banned blinding lasers in 1995.

A key concern with fully autonomous weapons is that they would be prone to cause civilian casualties in violation of international humanitarian law and international human rights law. The lack of meaningful human control that characterizes the weapons would make it difficult to hold anyone criminally liable for such unlawful actions.

Military commanders or operators could be found guilty if they intentionally deployed a fully autonomous weapon to commit a crime, Human Rights Watch said. But they would be likely to elude justice in the more common situation in which they could not foresee an autonomous robot’s unlawful attack or were unable to stop it.

“A fully autonomous weapon could commit acts that would rise to the level of war crimes if a person carried them out, but victims would see no one punished for these crimes,” said Docherty, who is also a lecturer at the Harvard Law School clinic. “Calling such acts an ‘accident’ or ‘glitch’ would trivialize the deadly harm they could cause.”

The obstacles to accountability would be equally high under civil law, Human Rights Watch said. Establishing civil liability would be virtually impossible, at least in the United States, due to the immunity granted by law to the military and its contractors and the evidentiary hurdles in product liability suits. Many other countries have similar systems of sovereign immunity.

Even if successful, a civil suit would have limited effectiveness as a tool for accountability. The primary focus of civil penalties is compensating victims, not punishment. While monetary damages can assist victims, they are no substitute for criminal accountability in terms of deterrence, retributive justice, and moral stigma.

A system in which victims could receive compensation without proving legal fault also would not achieve meaningful accountability, and most governments would be hesitant to adopt such a system, Human Rights Watch said.  

The accountability gap is just one of a host of concerns raised by fully autonomous weapons. In other publications, Human Rights Watch has elaborated on the challenges the weapons would face in complying with international humanitarian law and international human rights law. It has also noted concerns about the potential for an arms race, the prospect of proliferation, and questions about whether it is ethical for machines to make life-and-death decisions on the battlefield.

“The lack of accountability adds to the legal, moral, and technological case against fully autonomous weapons and bolsters the call for a preemptive ban,” Docherty said.  
