Can military robots follow the Geneva Conventions?

Over at our neighbor blog Risk Factor, Bob Charette has an interesting post on robots and ethics and the views of Ronald Arkin [photo, right], of Georgia Tech, and Noel Sharkey, of the University of Sheffield:

Government Computer News had a nice little story on the ethics of robot warriors a short time ago. It talked about the work of Georgia Institute of Technology's Mobile Robot Laboratory professor Ronald Arkin and his attempts to define algorithms for ethical behavior in machines, so that they can follow norms like the Geneva Conventions.

[...]

In an AFP news story last month, Dr. Arkin was quoted as saying, "Robotics systems may have the potential to out-perform humans from a perspective of the laws of war and the rules of engagement," since with robots "there are no emotions that can cloud judgment, such as anger."

Arkin's work has direct relevance to another robot story in this week's London Telegraph and the aforementioned AFP story, both of which report University of Sheffield computer science professor Noel Sharkey's warning that the major powers are "sleepwalking" into an international robot arms race and his prediction "that it is only a matter of time before robots become a standard terrorist weapon, replacing suicide bombers."

As my blog colleague Mikell asked before: How can we make sure ethics and technology develop at the same pace?

Photo of Ronald Arkin by Gary Meek

Automaton

IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.