Military’s killer robots must learn warrior code
   
February 16, 2009
   
[Image caption: Automatons revolt to form a dictatorship over humans in Asimov's I, Robot]
 
   
By Leo Lewis
   
Autonomous military robots that will fight future wars must be programmed to live by a strict warrior code or the world risks untold atrocities at their steely hands.
   
The stark warning – which includes discussion of a Terminator-style scenario in which robots turn on their human masters – is issued in a hefty report funded by and prepared for the US Navy’s high-tech and secretive Office of Naval Research.
   
The report, the first serious work of its kind on military robot ethics, envisages a fast-approaching era where robots are smart enough to make battlefield decisions that are at present the preserve of humans. Eventually, it notes, robots could come to display significant cognitive advantages over Homo sapiens soldiers.
 
“There is a common misconception that robots will do only what we have programmed them to do,” Patrick Lin, the chief compiler of the report, said. “Unfortunately, such a belief is sorely outdated, harking back to a time when . . . programs could be written and understood by a single person.” The reality, Dr Lin said, was that modern programs included millions of lines of code and were written by teams of programmers, none of whom knew the entire program: accordingly, no individual could accurately predict how the various portions of large programs would interact without extensive testing in the field – an option that may either be unavailable or deliberately sidestepped by the designers of fighting robots.
 
The solution, he suggests, is to mix rules-based programming with a period of “learning” the rights and wrongs of warfare.
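
Purely as an illustration of that idea (the report itself specifies no implementation), a hybrid design might layer immutable, hand-written prohibitions over a trained model's judgement, so that the learned component can never override the rules. Everything in the sketch below, from the Target fields to the 0.8 threshold, is a hypothetical stand-in rather than anything drawn from the report:

```python
# Illustrative sketch only: a toy "hybrid" controller that checks hard-coded
# rules of engagement before deferring to a learned score. All names, rules
# and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Target:
    is_combatant: bool       # hypothetical sensor/intel classification
    near_civilians: bool     # hypothetical proximity flag
    threat_score: float      # hypothetical learned estimate in [0, 1]

def hard_rules_forbid(t: Target) -> bool:
    """Rules-based layer: absolute prohibitions, checked first."""
    if not t.is_combatant:
        return True          # never engage non-combatants
    if t.near_civilians:
        return True          # never engage when civilians are at risk
    return False

def learned_layer_approves(t: Target, threshold: float = 0.8) -> bool:
    """"Learning" layer: a trained model's judgement, consulted only after the rules pass."""
    return t.threat_score >= threshold

def may_engage(t: Target) -> bool:
    return (not hard_rules_forbid(t)) and learned_layer_approves(t)

if __name__ == "__main__":
    print(may_engage(Target(is_combatant=True,  near_civilians=False, threat_score=0.9)))  # True
    print(may_engage(Target(is_combatant=True,  near_civilians=True,  threat_score=0.9)))  # False
    print(may_engage(Target(is_combatant=False, near_civilians=False, threat_score=0.9)))  # False
```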
   
The report covers a rich variety of scenarios outlining the ethical, legal, social and political issues posed as robot technology improves. How do we protect our robot armies against terrorist hackers or software malfunction? Who is to blame if a robot goes berserk in a crowd of civilians – the robot, its programmer or the US president? Should the robots have a “suicide switch” and should they be programmed to preserve their own lives?
   
The report, compiled by the Ethics and Emerging Technology department of California State Polytechnic University and obtained by The Times, strongly warns the US military against complacency or shortcuts as military robot designers engage in the “rush to market” and the pace of advances in artificial intelligence quickens.
   
Any sense of haste among designers may have been heightened by a US congressional mandate that by 2010 a third of all operational “deep-strike” aircraft must be unmanned, and that by 2015 one third of all ground combat vehicles must be unmanned.
   
“A rush to market increases the risk for inadequate design or programming. Worse, without a sustained and significant effort to build in ethical controls in autonomous systems . . . there is little hope that the early generations of such systems and robots will be adequate, making mistakes that may cost human lives,” the report noted.
   
A simple ethical code along the lines of the “Three Laws of Robotics” postulated in 1942 by Isaac Asimov, the science fiction writer, will not be sufficient to ensure the ethical behaviour of autonomous military machines, the report concludes.
   
“We are going to need a code,” Dr Lin said. “These things are military, and they can’t be pacifists, so we have to think in terms of battlefield ethics. We are going to need a warrior code.”
   
Isaac Asimov’s Three Laws of Robotics
   
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm
   
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law
   
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law
   
Introduced in his 1942 short story Runaround
   
The full article is available at:
   
http://technology.timesonline.co.uk/tol/news/tech_and_web/article5741334.ece