Press Release

A Critical Opportunity to Ban Killer Robots – While We Still Can

November 1, 2021

Amnesty International and the Stop Killer Robots campaign today unveiled a social media filter which provides a terrifying glimpse of the future of war, policing and border control. Escape the Scan, a filter for Instagram and Facebook, is part of a major campaign calling for a new international law to ban autonomous weapons systems. It uses augmented reality (AR) technology to depict aspects of weapons systems that are already in development, such as facial recognition, movement sensors, and the ability to launch attacks on ‘targets’ without meaningful human control.

Several countries are investing heavily in the development of autonomous weapons, despite the devastating human rights implications of giving machines control over the use of force. In December, a group of UN experts will meet to decide whether to begin negotiating new international law on autonomy in weapons systems. Amnesty International and Stop Killer Robots have launched a petition calling on all governments to voice their support for negotiations.

“We are stumbling into a nightmare scenario, a world where drones and other advanced weapons can choose and attack targets without human control. This filter is designed to give people an idea of what killer robots could soon be capable of, and show why we must act urgently to maintain human control over the use of force,” said Verity Coyle, Amnesty International’s Senior Advisor on Military, Security and Policing.

“Allowing machines to make life-or-death decisions is an assault on human dignity, and will likely result in devastating violations of the laws of war and human rights. It will also intensify the digital dehumanization of society, reducing people to data points to be processed. We need a robust, legally binding international treaty to stop the proliferation of killer robots – before it’s too late.”

Escape the Scan will be available from November 2 on Stop Killer Robots’ Instagram and Facebook pages.

A larger version of the filter will be on display as an interactive experience at Westfield Stratford City in London – one of the largest shopping centers in Europe – for two weeks from today.

Time to draw a line

“We have had a decade of talks on autonomous weapons at the United Nations, but these are being blocked by the same states that are developing the weapons,” said Ousman Noor of the Stop Killer Robots campaign.

“The UN Secretary-General, the International Committee of the Red Cross, Nobel Prize winners, and thousands of scientists, roboticists and tech workers are all calling for a legal treaty to prevent these weapons – governments need to draw a line against machines that can choose to kill.”

On December 2, 2021, the Group of Governmental Experts to the Convention on Certain Conventional Weapons (CCW) will begin critical talks on whether to proceed with negotiations on a new treaty to address the threat posed by killer robots. So far, 66 states have called for a new, legally binding framework on autonomy in weapons systems. But progress has been stalled by a small number of powerful states, including Russia, Israel and the US, which regard the creation of a new international law as premature.

The replacement of troops with machines will make the decision to go to war easier. What’s more, machines cannot make complex ethical choices in unpredictable battlefield or real-world scenarios; there is no substitute for human decision-making. We have already seen how technologies such as facial, emotion, gait and vocal recognition fail to recognize women, people of color and persons with disabilities, and how they cause immense human rights harms even when they “work”. Employing these technologies on the battlefield, in law enforcement or in border control would be disastrous.

Despite these concerns, countries including the US, China, Israel, South Korea, Russia, Australia, India, Turkey and the UK are investing heavily in the development of autonomous systems. For example, the UK is developing an unmanned drone that can fly in autonomous mode and identify a target within a programmed area. China is creating small drone “swarms” that could be programmed to attack anything emitting body heat, while Russia has built a robot tank that can be fitted with a machine gun or grenade launcher.

Contact: Gabby Arias, [email protected]