Sunday, February 24, 2013
Ban ‘killer robots’ before it’s too late
Stop the Killer Robots is a new global campaign to be launched in the UK by a group of academics, pressure groups and Nobel peace prize laureates. It aims to persuade nations to ban "killer robots" before they reach the production stage. The story is reported in today’s Observer.
Dr Noel Sharkey, a leading robotics and artificial intelligence expert and professor at Sheffield University, is prominent in the campaign. He says robot warfare and autonomous weapons are the next step on from unmanned drones: they are already being worked on by scientists and will be available within the decade. He believes the weapons are being developed in an effectively unregulated environment, with little attention paid to their moral implications or to international law.
The two images are from a Human Rights Watch press release issued last November.
The aircraft is a drone enabling an operator to strike distant targets, “even in another continent.” Whilst the Ministry of Defence has stated that humans will remain in the loop, Human Rights Watch says the Taranis “exemplifies the move toward increased autonomy”. The sentry robot can detect people in the Demilitarized Zone and, if a human grants the command, fire its weapons. The robot is shown here during a test with a surrendering enemy soldier.
Neither the aircraft nor the robotic sentry is fully autonomous, but Human Rights Watch sees full autonomy as only a step away.
Here's a campaign video against military robots.
In the video, Noel Sharkey says there is nothing in artificial intelligence or robotics that could discriminate between a combatant and a civilian. It would be impossible for a robot to tell the difference between a little girl pointing an ice cream at it and someone pointing a rifle at it.
Advocates of military robots make a moral argument in their favour: if you could build robot soldiers, you could program them not to commit war crimes, not to commit rape, not to burn down villages. They would not be susceptible to human emotions like anger or excitement in the heat of combat. So you might actually want to entrust those sorts of decisions to machines rather than to humans.
Jody Williams of Human Rights Watch debates this point in a Democracy Now episode from last November.
I've got mixed feelings about this. Humans with their finger on the trigger will always commit war crimes. Could robots put an end to that?
And, at the end of the day, it is war and killing that we are talking about. Those who want robots will not be deterred because someone says it is foul play.