UNMANNED TECHNOLOGY  

For Now, Lethal Robots Not Likely to Run on Auto-Pilot 

By Stew Magnuson  

SAN DIEGO — Bart Everett, technical director for robots at the Navy’s space and naval warfare systems center, acknowledged that the military isn’t ready for the next generation of mechanized soldiers.

He is nevertheless overseeing the development of a robot soldier: one that will enter a building alongside a human companion, use sensors to seek out enemies, then fire lethal or nonlethal weapons to eliminate targets.

He calls the concept the “war fighter’s associate” and likens the human-robot relationship to that of a hunter and a bird-dog.

“What we have to do is work with the war fighter and figure out what [he] will accept.... If I lay this on him right away, it’s going to freak him out,” Everett said.

The problem boils down to the classic disconnect between those who work on cutting-edge technologies in the lab and the users in the field, he said. The engineers have no idea what the soldier really needs or the conditions he encounters. And the soldiers don’t know what technologies are available to them and what they can do.

Everett said the lab is about 10 years ahead of where he expected to be in terms of achieving autonomy for robots.

Autonomy means little or no need for an operator to use a joystick. A reconnaissance robot, for example, can be sent into a bunker without any radio link, and come out with a complete map populated with icons showing the location of people, weapons, or evidence of weapons of mass destruction.

And it may mean allowing that robot to have a weapon to defend itself in case it comes under attack.

That by itself could cause skeptics to shake their heads. Letting a robot enter an enclosed space with a weapon, and giving it the ability to defend itself, could be too far of a leap for the military community to accept, he said.

As Everett sees it, the way robots are controlled has not evolved since World War II.

The fact that there were robots used that long ago in wartime is a surprise to most — even those in the industry, he said. The Germans built 8,000 Goliath suicide robots. They were a few feet long, moved on tank-like tracks and were loaded with explosives. They were designed to drive up to bunkers or tanks and then blow up.

One reason that they are largely forgotten today is that they were not very successful, said Everett, who is writing a book on the topic.

They were tele-operated — meaning that they needed a soldier to control the machine through a radio link. These links failed, and therefore, so did the robots.

When the U.S. military invaded Iraq more than six decades later, it arrived with about 170 robots — about 7,800 fewer than the Germans had. And when these radio-controlled machines lost their links, they also failed.

After about a billion dollars in robotics research and development funding, Everett said he finds it lamentable that the United States is still using tele-operated robots. Controlling a robot in a battle zone is an engrossing task for the operator, and therefore dangerous.

That’s why his lab, and others, are intensely working on achieving autonomy.

“The problem we have is the war fighter is just getting used to World War II technology,” he added.

In the SPAWAR laboratories, work continues on some of the leap-ahead concepts that he said the military is not ready to accept.

One of these systems, Robart III, carries a gun and a small rocket launcher. Electronics engineer Brandon Sights gave it a command to “follow,” and it kept pace with him through the room and out the door into the California sunshine.

Sights picked up a rifle and pointed it towards the sea, and Robart’s weapons did the same.

SPAWAR is not researching how to make the robot walk up and down stairs. Other laboratories and organizations, such as the Defense Advanced Research Projects Agency with its Big Dog project, are working on walking robots. At some point, these two technologies — autonomous function and human or animal-like mobility — may be married, he predicted.

Another breakthrough has been in the realm of vision. Everett decided about two years ago that current algorithms — designed to let a robot see and understand what is around it through camera lenses — were too complicated.

When a man walks down the street, his brain is only taking in on a conscious level a small percentage of what he is seeing. The rest is filtered out and pushed down to the subconscious. If something catches his attention, then he will turn his head and focus on that object.

SPAWAR robots were already outfitted with ladars to help them navigate rooms and avoid obstacles. Ladars send out laser pulses to measure distances. The epiphany was to make the ladar the “subconscious.” If it picks up an anomaly — an object leaning against a wall — then it can move over to the target, switch on the vision, and decide what it is by comparing the shape of the object to those stored in its memory. Is it shaped like a rifle? If it is a common weapon like an AK-47, it should be able to identify it.

In earlier experiments, robots walked into a room and tried to identify everything within their field of view. They quickly became overloaded.

“In the event it can’t be identified, take a picture of it, and ID it later,” Everett said.
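The two-stage approach Everett describes — a cheap ladar pass acting as a “subconscious” filter, with the expensive vision system invoked only on anomalies, and unidentifiable objects photographed for later analysis — can be sketched roughly as follows. All names, data structures, and the shape-matching step are illustrative assumptions; the actual SPAWAR software is not public.

```python
# Illustrative sketch of the ladar-as-"subconscious" perception pipeline.
# Class/function names and the scene representation are hypothetical.

KNOWN_SHAPES = {"rifle", "ak47", "crate"}  # shapes stored in memory

def ladar_scan(environment):
    """Cheap pass: the ladar flags only range anomalies, such as an
    object leaning against a wall, instead of modeling the full scene."""
    return [obj for obj in environment if obj["anomalous"]]

def classify(obj):
    """Expensive pass: switch on vision and compare the object's shape
    against those stored in memory. Returns None if unrecognized."""
    shape = obj["shape"]
    return shape if shape in KNOWN_SHAPES else None

def patrol(environment):
    results = []
    for anomaly in ladar_scan(environment):   # "subconscious" filter
        label = classify(anomaly)             # focused "conscious" look
        if label is None:
            # Can't identify it now: photograph it and ID it later.
            results.append(("photo_for_later", anomaly["id"]))
        else:
            results.append((label, anomaly["id"]))
    return results

scene = [
    {"id": 1, "anomalous": False, "shape": "wall"},
    {"id": 2, "anomalous": True,  "shape": "ak47"},
    {"id": 3, "anomalous": True,  "shape": "unknown_lump"},
]
print(patrol(scene))  # [('ak47', 2), ('photo_for_later', 3)]
```

Only the two anomalies ever reach the vision pass; the wall is filtered out before any expensive processing, which is the overload problem the earlier whole-scene approach ran into.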

This realization was one of the breakthroughs that led to SPAWAR being ahead of where it expected to be in terms of autonomy, he said.

“Because of the change in our approach, we’ve made a lot faster progress.”

Just as a hunting dog picks up on his master’s nonverbal cues, the war fighter’s associate robots are doing the same, Everett said.

Like dogs, “robots can do some things humans can’t, and vice versa.”

If put together as a team, the robot can be sent ahead on missions that keep the soldier out of harm’s way. Like a dog’s ears and nose, a robot’s sensors are superior to a human’s.

“We’re setting the bar pretty high if we want the robot to be as perceptive as a dog,” Everett admitted. But there is one way to cheat.

Programs such as the Army’s land warrior integrated modular fighting system, which envisions a sensor embedded in the uniform that can monitor a soldier’s vital signs, can be linked to the robot just as the weapon’s status has already been tied to Robart. The robot could then pick up on the same nonverbal cues that dogs can read.

“No one is controlling it, no one is talking to it, and yet it’s right there with the soldiers doing what it is supposed to be doing,” Everett said. “We have so many elements of that working right now, it’s spooky.”

However, an armed robot entering close quarters with humans is another big leap in terms of acceptance. Fratricide is the first problem that comes to mind.

Everett has installed five different sensors into the robot so it can keep track of friendly forces.

These technologies can be married to modern-day fire control systems, which are highly accurate. Such systems can already track and destroy a target from a moving Apache helicopter at distances measured in miles. Scaling those capabilities down to meters does not pose a problem, he said.

An armed robot with body armor could walk into an ambush, “coolly find targets, and prioritize them, without getting scared, without making a mistake.” He said the robots may actually have fewer friendly fire incidents than their human counterparts.

“In a constricted environment you don’t want to go one on one with a computer controlled weapon system. You’re not going to win that one. The robot is going to get you before you get it. He’s got sensors that can see in the dark, see through smoke, whatever.”

Whether the military will embrace such a system is unknown. Meanwhile, laboratories and programs such as his continue to move in that direction.

Barriers in “acceptance” continue to be broken, he said. The first was robots themselves and the belief that they could not perform as well as humans. The highly successful explosive ordnance disposal robots, although tele-operated, broke that wall down, he asserted.

Next came armed robots. Some said it would never happen. But last summer, the special weapons observation remote reconnaissance (SWORDS) armed robot entered combat toting an M249 light machine gun. Again, Everett dismissively noted that humans control the guns and platform through radio frequencies. But nonetheless, it was another barrier broken.

Now, there are already inquiries as to whether these robots can be outfitted with nonlethal weapons so they can independently protect themselves from tampering, he said.

This summer, the military will see some of the first fully autonomous robots on bases throughout the United States.

The SPAWAR-built mobile detection and assessment response system (MDARS) will patrol domestic installations under a program run by the Army, which has tri-service responsibility for base security.

“There’s no human driving this thing. It is all automatic,” Everett said.

MDARS will use 360-degree sensors to detect motion at distances of up to 300 meters. Once it spots “purposeful movement” — in other words, an object displaying human-like motions — its speaker blares out a warning:

“Intruder Stop! Stop and be identified.”

If there is no response, it shoots a swath of pepper balls in front of the intruder. It can track up to six targets at a time.
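The escalation sequence described above — detect purposeful movement within range, issue a verbal warning, and fire pepper balls only if the intruder does not respond, while tracking at most six targets — amounts to a simple decision loop. The sketch below is an assumption-laden illustration, not the fielded MDARS software; the field names, the track representation, and the exact ordering are all invented for clarity.

```python
# Hypothetical sketch of the MDARS detect/warn/escalate logic.
DETECTION_RANGE_M = 300    # motion detection range cited in the article
MAX_TRACKED_TARGETS = 6    # MDARS tracks up to six targets at a time

def respond(tracks):
    """tracks: dicts with 'id', 'range_m', 'purposeful', 'responded'.
    Returns the ordered list of (action, target_id) the robot takes."""
    actions = []
    engaged = 0
    for t in tracks:
        if engaged >= MAX_TRACKED_TARGETS:
            break  # at tracking capacity; ignore further targets
        if t["range_m"] > DETECTION_RANGE_M or not t["purposeful"]:
            continue  # too far away, or not human-like motion
        engaged += 1
        actions.append(("warn", t["id"]))  # "Intruder stop! ..."
        if not t["responded"]:
            # No response to the warning: fire a swath of pepper balls
            # in front of the intruder rather than at the intruder.
            actions.append(("pepper_balls", t["id"]))
    return actions

intruders = [
    {"id": 1, "range_m": 120, "purposeful": True,  "responded": False},
    {"id": 2, "range_m": 450, "purposeful": True,  "responded": False},
    {"id": 3, "range_m": 80,  "purposeful": False, "responded": False},
    {"id": 4, "range_m": 50,  "purposeful": True,  "responded": True},
]
print(respond(intruders))
# [('warn', 1), ('pepper_balls', 1), ('warn', 4)]
```

Note how target 2 is ignored for being out of range and target 3 for lacking purposeful movement, while the compliant target 4 is warned but never fired upon — the nonlethal escalation stays human-like-motion-gated and response-gated at every step.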

With the ability to track so many targets at once, then lay down fire, it’s not hard to imagine MDARS being converted to some kind of battle-bot with an array of lethal weapons instead of pepper-ball guns.

Everett acknowledged that this could come to pass, although this version of MDARS would not be ready for that. It would not perform well in rough terrain, for example.

Still, outfitting a robot with a nonlethal weapon it is authorized to fire without a human in the loop is another step in the evolution.

The military may decide it never wants an autonomous robot carrying a lethal weapon. “No problem. We just back off to what [they] will accept,” Everett said.

“Just make sure you keep getting their feedback and you’re not diverging on some spooky laboratory path that nobody wants to go down, and it will work out. If you try to force something on them they’re not ready for, it’s going to backfire.”
