RESEARCH AND DEVELOPMENT
Researchers Debate Utility of Autonomous Armed Robots
11/1/2008
By Stew Magnuson

In 2003, Canning raised the prospect of arming autonomous robots with a group of military lawyers. Their answer was no.
Why? he asked.
“Because they could kill people,” he was told bluntly.
Of course they could, he thought. War is hell, after all. Enemies die in combat.
No, really, he pressed. These were judge advocate general attorneys, but they were also combat veterans. One was a Marine.
He asked again, but their answer was the same.
Armed aerial robots have plied the skies over Iraq and Afghanistan. The Army is pressing ahead with development of ground robots that can roll down a street with a machine gun. But in these cases, there is a human in the decision-making loop.
Parallel work in research laboratories also continues on artificial intelligence, which would cut robots from their “tethers” and allow their operators to do other tasks. Sentry robots that can perform perimeter patrols without an operator constantly monitoring them are already in use.
Futurists see these efforts one day coming together. A robot soldier, they say, could conceivably move down a street and, using advanced targeting and tracking systems along with facial recognition software, spot a known terrorist and take him out.
Back in 2003, when Canning sat down with the lawyers, few of the legal, treaty and policy implications of moving down this path had been worked out.
“Legal issues must be addressed right up front. We’re going to be wasting our time if we don’t,” said Canning, who is chief engineer at the platform integration division, engagement systems department at the Naval Surface Warfare Center, Dahlgren Division.
There are other roadblocks, including skeptics who concede that the futurists may one day be right, but who argue that spending precious research dollars on such high-minded concepts would be a waste of money.
One of them is David Bruemmer, technical director of unmanned ground vehicles at Idaho National Laboratory.
“I don’t foresee any time in the near future an automated system being able to discriminate between a tall 14-year-old boy and a soldier,” he said at an Institute for Defense and Government Advancement conference.
Nevertheless, the military is at least thinking about whether and how it could be done, although Canning confessed that there has been little funding for autonomous armed robots so far.
The Defense Department released in 2007 the “Unmanned Systems Safety Guide for DoD Acquisition.”
Autonomous armed robots are “allowed for in this document. It’s buried there, but it is there,” said Canning, who worked on the team that created the guide.
After consulting with the lawyers, Canning said, one basic tenet was agreed upon.
“Let machines target other machines. Let men target men,” he said. In other words, autonomous armed robots would target the “bow and arrow, not the archer,” he said.
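Read literally, the tenet is a gating rule that sits between a robot's sensors and its trigger: a machine may engage another machine on its own authority, but any person, or anything it cannot classify with confidence, must be referred back to a human. The sketch below is purely illustrative; the types, names and confidence threshold (Target, TargetClass, authorize_engagement, 0.99) are assumptions invented for this article, not drawn from Canning's work or any fielded system.

```python
# Hypothetical sketch of the "let machines target machines" tenet as an
# engagement gate. All identifiers here are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto

class TargetClass(Enum):
    WEAPON_SYSTEM = auto()  # the "bow and arrow": a rifle, vehicle, radar
    PERSON = auto()         # the "archer"
    UNKNOWN = auto()        # anything the classifier cannot resolve

@dataclass
class Target:
    track_id: str
    classification: TargetClass
    confidence: float  # classifier confidence in [0, 1]

def authorize_engagement(target: Target, min_confidence: float = 0.99) -> bool:
    """Return True only if the robot may engage on its own authority.

    People and ambiguous tracks are always refused and must be referred
    to a human operator, keeping a person in the decision-making loop.
    """
    if target.classification is not TargetClass.WEAPON_SYSTEM:
        return False  # machines target machines; men target men
    return target.confidence >= min_confidence  # identification is paramount

# A person, or anything uncertain, is never engaged autonomously:
assert not authorize_engagement(Target("t1", TargetClass.PERSON, 1.0))
assert not authorize_engagement(Target("t2", TargetClass.UNKNOWN, 1.0))
assert authorize_engagement(Target("t3", TargetClass.WEAPON_SYSTEM, 0.995))
```

The design choice to fail closed, refusing anything that is not positively identified as a machine, mirrors the point Canning makes next: identification of the target is paramount.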
A hard-and-fast Defense Department requirement ensures that all new armed systems undergo a legal weapons review, Canning noted.
Even if an armed robot passes muster, one tenet in the framework holds that any legal weapon can be used illegally. That’s one reason why identification of the target is absolutely paramount, he noted.
In this “only target machines” scenario, an armed robot would aim at an enemy’s rifle, not between his eyes.
There is legal precedent. Many countries, though not the United States, have banned anti-personnel landmines.
Most nations recognize that a landmine buried in the ground is not targeting anyone in particular: it can kill a soldier as easily as a child. But these same nations have not banned anti-tank mines. Since anti-tank mines require immense pressure to detonate, they are not targeted at people.
Woe be to the soldiers who are in the tank, though.
“People may still die, but it will be a secondary consequence,” Canning said.
Bruemmer later suggested to a reporter that targeting machines and weapons instead of those carrying them is disingenuous. The weapon-holder, of course, may still be killed or injured.
Canning countered that nonlethal technology could be employed. He joked that his ideal robot would be one that walks up to an armed enemy, grabs the rifle out of his hand, cuts it in half with a diamond-tooth saw, hands it back and says “have a good day.”
In any case, Bruemmer said, the technological hurdles to creating algorithms that could help robots distinguish between armed enemy combatants and civilians are too great to overcome.
“Don’t even try. It’s that hard,” he insisted.
“Even if we had an ethical framework the military was okay with and voters were okay with, I think we’d still be hamstrung by the difficulty of making those determinations,” he said. There are many arenas where robots can be of immediate value, he said.
“It’s not a good use of the community’s resources to focus right now on robots that can kill people,” he added. “I say focus our resources on robots that can protect people first. Let’s make sure we can do that well.”
For now, Bruemmer may not have to worry about where the R&D dollars are flowing.
Canning said there is only one such autonomous armed robot project in the pipeline — an Office of Naval Research armed sentry. That’s still on paper, he said.
“The hard part is finding the money,” he said.