The topic of killer robots was drawn back into the public sphere last week with the widely publicised call for a moratorium on the development and use of “lethal autonomous robotics” by a top UN human rights expert. Inevitably, this conjured up some familiar concerns.
The opening scenes of James Cameron’s 1984 film The Terminator portray people running for cover beneath ruined buildings while hunter-killer robots circle menacingly overhead. Of course, such images must already have a certain contemporary resonance in Pakistan and Afghanistan, where people live in fear of being killed by a Hellfire missile fired by a Predator or Reaper drone, controlled by operators in the United States.
Yet if people are dying in drone strikes today, at least a human being has confronted the question of whether the goals the attack is intended to serve are worth killing them for.
Now that military scientists around the world are working on developing autonomous weapons intended to be capable of identifying and attacking targets without direct human oversight – referred to interchangeably as lethal autonomous robots and killer robots – the scenario Cameron portrays in the first few minutes of his film is perhaps closer than we think.
It’s important to stress here that, currently, such weapons are not employed, although various technologies of this sort are in development. And, while not “autonomous”, the sophistication of certain robotics being trialled for the battlefield, as discussed already on The Conversation, gives some insight into where things may be going.
Last week’s discussion on the ethics of lethal autonomous robots at the UN Human Rights Council followed in the footsteps of a November 2012 Human Rights Watch report, Losing Humanity: the Case Against Killer Robots.
But the military logic driving the rapidly expanding use of drones and the development of autonomous weapons has been obvious for some time. It was because we viewed this prospect with alarm that colleagues and I founded the International Committee for Robot Arms Control at a meeting in the UK in September 2009.
Risks and rewards
The development of autonomous weapons would undermine international peace and security by lowering the domestic political costs of going to war and by greatly increasing the risk of conflicts being triggered by accident.
The fear of the public seeing their sons and daughters return in body bags is one of the main things that currently restrains governments from going to war. If governments think they can impose their will on affairs in foreign lands using autonomous weapons, there will be little to stop them bombing and assassinating those they perceive as their enemies more often than they already do.
The UN’s Christof Heyns has called for a global pause in the development and deployment of “killer robots”.
Of course, as the invasions of Iraq and Afghanistan demonstrate all too well, wars are easier to start than to finish. Similarly, despite the enthusiasm of the West for fighting wars entirely in other people’s countries, the violence of these conflicts has ways of finding its way home.
The stabbing of a British soldier in Woolwich by two men identifying as Muslims has been widely described as an act of terrorism: as Glenn Greenwald has argued, given that it involved an attack on a member of the British armed services in the context of the UK’s involvement in the war in Afghanistan, one wonders if it might not equally well be thought of as a poor man’s drone strike. Misplaced faith in the possibility of risk-free warfare may end up putting more lives at risk.
When autonomous submarines are circling each other in the Pacific 24 hours a day and autonomous planes are poised to strike strategic targets should some particular set of conditions on a checklist maintained by a computer be met, the risk of accidental war will be all too real.
The use of autonomous weapons would also sit uneasily with the long tradition of just war theory, which governs the ethics of armed conflict. This tradition places severe restrictions on the conduct of war, including regarding who is and is not a legitimate target of attack. Civilians are not legitimate targets, nor are soldiers who have indicated a desire to surrender or who are wounded such that they pose no military threat.
Despite the rapid progress of computer science, I am extremely sceptical that machines will be able to make the complex contextual judgements required to reliably meet the requirements of just war theory for the foreseeable future.
There is also a peculiar horror associated with the idea of people being killed by robots, which I have been working to elucidate in my research. Even though they are willing to kill each other, enemies at war are in a moral relationship.
At a bare minimum, they must acknowledge their enemy as their enemy and be willing to take responsibility for the decision to kill them. Robots are unable to offer this recognition themselves and arguably obscure the moral relationship between combatants to such an extent as to call into question the ethics of their use as weapons.
For all these reasons, I applaud the recent launch of the Campaign to Stop Killer Robots announced by a coalition of NGOs in London in April this year and support its goal of a global ban on the development and deployment of lethal autonomous weapons.