Killer robots: new social media filter highlights dangers of autonomous weapons
‘Escape the Scan’ filter for Instagram and Facebook will be displayed at Westfield Stratford shopping centre in east London
Filter warns of dangers posed by autonomous weapons systems ahead of vital UN talks next month
‘We are stumbling into a nightmare scenario’ - Verity Coyle
The development of autonomous weapon systems threatens a “nightmare” future, Amnesty International and the Campaign to Stop Killer Robots said today, as they unveiled a new social media filter which provides a terrifying glimpse of the possible future of war, policing and border control.
Escape the Scan, a filter for Instagram and Facebook, is part of a campaign calling for a new international law to ban autonomous weapons systems.
The filter uses augmented reality (AR) technology to visually replicate aspects of weapons systems that are already in development - such as facial recognition, movement sensors, and the ability to launch attacks on “targets” without meaningful human control.
Some countries - including the USA, China, Israel, South Korea, Russia, Australia, India, Turkey and the UK - are currently investing heavily in the development of autonomous systems. China is creating small drone “swarms” which could be programmed to attack anything that emits body heat, while Russia has built a robot tank which can be fitted with a machine gun or grenade launcher.
The UK is developing an unmanned drone which can fly in autonomous mode and identify a target within a programmed area. However, UK officials insist that they “have no intention of developing systems that could operate without any human control”. The UK currently opposes the creation of new legal controls.
In December, a group of UN experts will meet to decide whether to begin negotiating a new international law on autonomy in weapons systems (see below). Amnesty and the Campaign to Stop Killer Robots have launched a petition calling on all governments to voice their support for negotiations.
Escape the Scan filters are available on the Stop Killer Robots Instagram and Facebook pages. A large version of the filter will also be on display as an interactive experience at Westfield Stratford City in east London - one of the largest shopping centres in Europe - for two weeks from today.
Verity Coyle, Amnesty International’s Senior Advisor on Military, Security and Policing, said:
“We are stumbling into a nightmare scenario, a world where drones and other advanced weapons can choose and attack targets without human control.
“This filter is designed to give people an idea of what killer robots could soon be capable of, and show why we must act urgently to maintain human control over the use of force.
“Allowing machines to make life-or-death decisions is an assault on human dignity and will likely result in devastating violations of the laws of war and human rights.
“It will also intensify the digital dehumanisation of society, reducing people to data points to be processed. We need a robust, legally binding international treaty to stop the proliferation of killer robots - before it’s too late.”
Ousman Noor, the Campaign to Stop Killer Robots’ Government Relations Manager, said:
“We have had a decade of talks on autonomous weapons at the United Nations, but these are being blocked by the same states that are developing the weapons.
“The UN Secretary General, the International Committee of the Red Cross, Nobel Prize winners, and thousands of scientists, roboticists and tech workers are all calling for a legal treaty to prevent these weapons - governments need to draw a line against machines that can choose to kill.”
December talks
Next month (2 December), the Group of Governmental Experts of the Convention on Certain Conventional Weapons will begin talks on whether to proceed with negotiations on a new treaty to address the threat posed by killer robots. To date, 66 countries have called for a new, legally binding framework on autonomy in weapons systems. However, progress has been stalled by a small number of powerful countries - including the USA, Russia and Israel - which regard the creation of a new international law as premature.
Amnesty and the Campaign to Stop Killer Robots are warning that replacing troops with machines will make the decision to go to war easier. Machines cannot make complex ethical choices in unpredictable battlefield or real-world scenarios. In recent years, technologies such as facial-, emotion-, gait- and voice-recognition systems have all failed to recognise women, people of colour and persons with disabilities, among other significant failings. Deploying such technologies on the battlefield or in law-enforcement and border-control settings could be disastrous, the two groups said.