Yet another geek blog

3 October 2008

The War Machines are Here

Terminator (copyright MGM Entertainment)

As the war in Afghanistan gets messier, the boundaries of who is fighting whom become harder to define, and the tactics used, on both sides, become more and more desperate. In the middle of all this, stories like this one flare up and are then forgotten.

A Robot has killed a Man

Yet again, we have crossed a line without even noticing. I admit that it might not seem to be much of a line. Booby traps and landmines have been killing people for centuries. And attacks have been ordered on the basis of information provided by machines.

Nevertheless, we need to stop and think about the fact that the point of decision has moved. A human being is dead because a machine decided to kill him. The robot/drone was authorised to kill by human decision and deployed by human decision, but the actual decision to kill an actual human was made by software.

I am not even going to get into the fact that this was done in a country that is not even part of the battlefield, against an enemy that is not an army but an ill-defined mass of tribal, political and religious affiliations.

The Future

What of the future? More of the same. Sending in machines rather than people is an irresistible idea, so the machines will become more and more autonomous. For symmetrical warfare this will result in battles being fought between machines, possibly with helpless locals trapped in between.

But symmetrical warfare seems to have been left behind in the 20th Century. What will be the effect on asymmetrical warfare? In short, more terrorism. These machines will be attacked where they are most vulnerable and that means taking advantage of the difficulty in distinguishing friend from foe. More attacks will take place on civilian populations and in civilian areas.

The Three Laws of Robotics

So much for the Three Laws of Robotics (http://en.wikipedia.org/wiki/Three_laws_of_robotics). These were created in 1942 by Isaac Asimov because he was irritated by the fact that so many science fiction stories were lazy re-tellings of the Frankenstein myth. To him, it seemed nothing but common sense that a machine capable of making a deadly decision must be prevented from doing so. "Knives have handles," he said.

But for the foreseeable future, our relationship to robots will be this: targets.

It is also worth remembering that a lazy re-telling of Frankenstein is still the most popular science fiction plot. So nobody can say that we were not warned.


posted by Yet Another Geek @ Friday, October 03, 2008

3 Comments:

  • At 4 March 2009 09:56 , Anonymous Lionel Baartman said...

    The Global Predator drones are not robots - they are remotely controlled by a human operator. So whilst it is true that people were killed by a machine, it is not true to say that the kill/not kill decision process was devolved to a machine.

     
  • At 4 March 2009 10:17 , Blogger Yet Another Geek said...

    My bad. I should have probed the news a bit further.

    But we are getting close. According to roboticist Noel Sharkey in this New Scientist article:

    "Fully autonomous systems are in place right now... The US navy's Phalanx system, for instance, can identify, target, and shoot down missiles without human authorisation."

     
  • At 10 March 2009 11:00 , Blogger Yet Another Geek said...

    I also still believe that we are in imminent danger of crossing this line without ever noticing.

    See also A. C. Grayling's article in New Scientist.

     

