Friday, March 1, 2013

Who compared apples and ideas? Shaw?


If you have an apple and I have an apple and we exchange apples then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.

This aphorism is just the sort of thing George Bernard Shaw ought to have said, and there's no shortage of websites claiming that he did. But the excellent Quote Investigator has diligently sought compelling evidence that Shaw really made this remark, and has failed to find any.

George Bernard Shaw: did he talk of swapping apples?
It seems the earliest quotation on this theme dates from 1917, though it mentioned dollars rather than apples. It came in an advertisement for a magazine called System, printed in the Chicago Tribune, under the title “The Difference Between Dollars and Ideas”. You'll find a full discussion by following the above link.

It's like this, you see: when I find a good quote, I can't rest easy till I've sourced it. And when my dictionary of quotations doesn't help and I distrust what I'm seeing on the world wide web, that’s where the Quote Investigator comes in handy. It's a blog I can't praise too highly.

On a couple of occasions when stumped for the source of a quote, the Quote Investigator has tracked it down for me and put it on his blog. One was Einstein: “Everyone who is seriously interested in the pursuit of science becomes convinced that a spirit is manifest in the laws of the universe – a spirit vastly superior to man, and one in the face of which our modest powers must seem humble.” The other was Winston Churchill:  “Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing had happened.” In the event this sort of thing excites you (and I realise it very possibly doesn't) the discussions on both these quotes are worth reading. Follow the links.


Sunday, February 24, 2013

Ban ‘killer robots’ before it’s too late


Stop the Killer Robots is a new global campaign to be launched in the UK by a group of academics, pressure groups and Nobel peace prize laureates. It aims to persuade nations to ban "killer robots" before they reach the production stage.  This is in today’s Observer.

Dr Noel Sharkey, a leading robotics and artificial intelligence expert and professor at Sheffield University, is prominent in the campaign. He says robot warfare and autonomous weapons are the next step from unmanned drones. They are already being worked on by scientists and will be available within the decade. He believes that development of the weapons is taking place in an effectively unregulated environment, with little attention being paid to moral implications and international law.



The two images are from a Human Rights Watch press release issued last November. 

The aircraft is a drone enabling an operator to strike distant targets, “even in another continent.” Whilst the Ministry of Defence has stated that humans will remain in the loop, Human Rights Watch says the Taranis “exemplifies the move toward increased autonomy”. The sentry robot can detect people in the Demilitarized Zone and, if a human gives the command, fire its weapons. The robot is shown here during a test with a surrendering enemy soldier.

Neither the aircraft nor the robotic sentry is fully autonomous, but Human Rights Watch sees full autonomy as being only a step away.

Here's a campaign video against military robots.



In the video, Noel Sharkey says there is nothing in artificial intelligence or robotics that could discriminate between a combatant and a civilian. It would be impossible for a robot to tell the difference between a little girl pointing an ice cream at it and someone pointing a rifle at it.

Advocates of military robots claim a moral argument in their favour. If you could build robot soldiers, you would be able to program them not to commit war crimes, not to commit rape, not to burn down villages. They wouldn’t be susceptible to human emotions like anger or excitement in the heat of combat. So, you might actually want to entrust those sorts of decisions to machines rather than to humans. 

Jody Williams of Human Rights Watch debated this point in a Democracy Now episode last November.