Features

December 1, 2009  

Gort! Klaatu barada nikto!

Battlefield robots raise ethical issues for military

The 7-foot-7-inch humanoid robot in the 1951 film “The Day the Earth Stood Still” foreshadowed the ethical concerns facing developers of robots for the Defense Department today.

Gort, as the robot was called, was a “policeman” of an advanced extraterrestrial, humanlike race that could make decisions of right and wrong, life and death, yet still would obey commands from its creators. The robots were designed to keep interplanetary peace. Gort was prepared to destroy an errant and irresponsible Earth until Helen Benson (Patricia Neal) saved the day with her command, “Gort! Klaatu barada nikto!” ordering Gort to revive its master, Klaatu (Michael Rennie), who had been killed by the Army.

Even the cartoonish Model B-9, Class M-3 General Utility Non-Theorizing Environmental Control Robot — a name seemingly straight out of a Pentagon procurement title — from the 1960s television series “Lost in Space” had the ability to tell right from wrong.

These Hollywood examples anticipated the ethical issues that contractors and the military confront today in designing a plethora of robots. Plenty of people are engaged in serious study of the legal and ethical issues of robotic warfare, and a cottage industry of critics is also emerging.

The situation is even worse for the Defense Department and contractors today than it was for fiction writers: Lawyers are involved. A host of international laws governs rules of engagement, battlefield conditions and potential civilian casualties, and contractors and DoD worry about running afoul of those laws if robots are unleashed without the ability to do some thinking on their own and stay within the law.

The U.S. is regularly criticized when unmanned aerial vehicles — UAVs are technically robots — under the guidance of Air Force or CIA personnel inadvertently kill or harm civilians.

Congress has mandated that one-third of all operational deep-strike aircraft be unmanned by next year, and a third of ground-combat vehicles must be unmanned by 2015. Some fear this is going too far too fast, and could result in robotic ethical lapses and wrong decisions that could threaten civilians or friendly forces.

Yet robots are becoming increasingly important to the U.S. military. Although there have been plenty of instances of collateral damage, robots have also saved military and civilian lives. No longer is it necessary to engage in carpet bombing that can kill indiscriminately in a given area. Robots, including smart bombs, enable the military to perform surgical strikes.

Robots can perform reconnaissance and engage in combat. The number of unmanned systems in Iraq went from zero to 12,000 in five years, according to one report, but even this doesn’t appear to be enough. Army Lt. Gen. Rick Lynch recently said that unmanned systems could have prevented the deaths of most of the 153 soldiers killed while under his command in Iraq. Lynch also suggested that truck convoys could be driven by robots.

The debate over robotic ethics and actions has another element, however. A 2008 study by California Polytechnic State University prepared for the Navy’s Office of Naval Research suggested that robots can reduce combat troop malfeasance. “[R]obots may act as objective, unblinking observers on the battlefield, reporting any unethical behavior back to command; their mere presence as such would discourage all-too-human atrocities,” the Cal Poly researchers said.

But the researchers call the technology a “double-edged sword” with benefits and risks. Who will be responsible in the event the robots cause “unintentional or unlawful harm?” the study asks, noting that the range of responsible parties could start with the manufacturer and extend to the field commander. And what happens with the “possibility of a serious malfunction and robots gone wild; capturing and hacking of military robots that are then unleashed against us; refusing an otherwise legitimate order; and other possible harms?”

These scenarios are a lawyer’s nightmare — or dream, depending on which side of the issue one is on.

But what is the reality? It appears to be far less of a problem than the academics, lawyers and critics might suggest.

War inflicts collateral damage, whether by direct human combat or via UAVs hitting unintended targets. Critics, including a recently formed group in the U.K. that campaigns against robotic warfare, miss important points, however, and in fact appear to look at Gort (and perhaps the Cal Poly study) and draw incorrect conclusions.

As for the debate over what robots can and can’t do, the science fiction of Gort and “Lost in Space’s” Model B-9 remains just that — fiction.

QinetiQ’s Foster-Miller subsidiary is one of the major robot contractors. “QinetiQ’s ground robots (more than 2,800 in use today) do not have the ability to engage targets on their own,” said Robert Quinn, vice president of Talon Robotic Operations for QinetiQ. “They are today, and will be for the foreseeable future, remotely operated systems where the war fighter makes all weapon engagement decisions complying with the Law of Armed Conflict and rules of engagement. The idea that robots will make these decisions independently is simply a way to create controversy and sell more books for the next 30 years or more.”

“The premise that ground robots will make the engagement decisions instead of humans making these decisions is at best decades away from technological reality, leaving plenty of time to discuss ethical considerations of ‘brilliant robots’ making engagement decisions on their own,” he added. “Meanwhile, countless lives of friendly forces and non-combatants are being saved by using ground robots that are remotely operated [and] fully compliant with the Law of Armed Conflict and rules of engagement.”

Joe Dyer, president of iRobot, makes the point that robots, if anything, are under greater human control than they used to be. One of the earliest, if crudest, robots is the torpedo. In the early days of submarine warfare, humans would identify the target, adjust the settings on the torpedo, fire it and hope for the best. Sometimes the torpedo missed its target, and there were instances in World War II in which a torpedo malfunctioned, circled back and destroyed the submarine that fired it. Today, torpedoes are wire-guided or otherwise controlled for greater accuracy and can be ordered to self-destruct if something goes wrong.

Dyer also points to the Tomahawk missile as another example of man moving deeper into the loop. Early versions were programmed and launched with the hope that the missile would hit a target hundreds of miles away. The weapon has since evolved with man in control, allowing operators to change targets while the missile is en route, and it can be programmed to destroy itself if the connection between man and machine is lost.
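To make that fail-safe logic concrete, here is a minimal sketch, in Python, of the general pattern Dyer describes: a machine that accepts retargeting commands while the human link is alive and defaults to aborting when the link goes silent. Every name here (LinkMonitor, HEARTBEAT_TIMEOUT, guidance_step) is invented for illustration and assumes nothing about any real weapon system.

    import time

    # Illustrative sketch only; all names are invented for this example.
    HEARTBEAT_TIMEOUT = 5.0  # seconds of operator silence tolerated

    class LinkMonitor:
        """Tracks the last time a human operator was heard from."""

        def __init__(self, initial_target):
            self.last_heartbeat = time.monotonic()
            self.target = initial_target

        def heartbeat(self):
            # Called whenever any message arrives from the operator.
            self.last_heartbeat = time.monotonic()

        def retarget(self, new_target):
            # Man in the loop: the target can change while in flight.
            self.heartbeat()
            self.target = new_target

        def link_alive(self):
            return time.monotonic() - self.last_heartbeat < HEARTBEAT_TIMEOUT

    def guidance_step(monitor):
        # Fail-safe default: no recent human contact means abort,
        # the software analogue of a self-destruct order.
        if not monitor.link_alive():
            return "ABORT"
        return "steer toward " + monitor.target

    monitor = LinkMonitor("primary target")
    monitor.retarget("secondary target")  # operator changes target en route
    print(guidance_step(monitor))         # "steer toward secondary target"

The essential design choice is the one Dyer describes: absent recent human contact, the machine does nothing destructive rather than pressing on blindly.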

Jim Overholt, director of the Army’s Joint Center for Robotics, and Dyer both make the point that today’s ground-based robots are not armed and instead perform nonlethal functions such as bomb disposal and reconnaissance. Overholt said robots can identify civilian buildings and non-combatants, giving decision-makers the ability to avoid them when targeting areas for combat operations.

“We take the legal and ethical issues very seriously, there is no doubt about it,” Overholt said. “This is a very unique area where you are dealing with unmanned vehicles. Safety is the No. 1 concern when dealing with robotics.”

Overholt said the Army is closely following what automakers General Motors and Ford are doing. By 2012, the companies will equip all vehicles under 10,000 pounds with electronic stability control systems to reduce accidents. The Army is watching this developing technology for application to robots.

The growth of robots can be seen in the statistics of just the last five years for the Army and Marine Corps. In 2004 there were five vendors supplying fewer than 200 robots in Southwest Asia, Overholt said. By 2009 there were nearly 8,000 robots in Iraq and Afghanistan performing bomb disposal, engineering and reconnaissance.

Does this make war “safer”? The fact is that robotics enables war to be “cleaner,” that is, with fewer casualties — civilian and military — and with less collateral damage to buildings and infrastructure than strategic bombing. Are there mistakes that kill and wound innocents and destroy unintended targets? Certainly. But this is the nature of war — robotic and human. AFJ

SCOTT HAMILTON is a consultant with Leeham Co. www.leeham.net.