Technologists Make Progress On Autonomous Ground Robots
Two small unmanned aircraft and an SUV that can drive itself employ computer algorithms to conduct an auction over the airwaves. Once a human has designated an area to search, each vehicle’s onboard computer submits a bid based on how close it is and how much fuel it has to spare. The winner takes off to conduct the search, leaving the other two free to carry out other missions.
“We don’t want for a human operator to have to tell each vehicle what to do and where to go,” said Charles Pippin, a research scientist at the institute. “We prefer that the vehicles communicate with each other and task themselves.”
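A market-based auction of this kind can be sketched in a few lines. The bid function, weights and fleet data below are illustrative assumptions, not the researchers’ actual algorithm: each vehicle scores the task by distance and spare fuel, and the cheapest bid wins.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    name: str
    position: tuple   # (x, y) in km
    fuel_reserve: float  # fraction of fuel remaining, 0..1

def bid(vehicle, target, w_dist=1.0, w_fuel=10.0):
    """Lower bid = better. Penalize distance to the task, reward spare fuel.
    The weights are arbitrary for illustration."""
    dx = vehicle.position[0] - target[0]
    dy = vehicle.position[1] - target[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return w_dist * distance - w_fuel * vehicle.fuel_reserve

def run_auction(vehicles, target):
    """Every vehicle bids; the lowest (cheapest) bid takes the task."""
    return min(vehicles, key=lambda v: bid(v, target))

fleet = [
    Vehicle("uav-1", (0.0, 5.0), 0.4),
    Vehicle("uav-2", (2.0, 2.0), 0.9),
    Vehicle("suv-1", (8.0, 8.0), 0.7),
]
winner = run_auction(fleet, target=(3.0, 3.0))
print(winner.name)  # uav-2: nearby and with the most fuel to spare
```

In a fielded system the bids would travel over a radio link and losers would keep their current missions, but the core idea is just this distributed comparison of costs.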
The 2009-2034 Unmanned Systems Integrated Roadmap produced by the office of the secretary of defense called for “autonomous adaptive tactical behavior” for ground robots. Researchers want to take the burden of tele-operating the machines off soldiers, who have enough to worry about in war zones. They also want robots to handle routine tasks such as carrying heavy loads or driving vehicles down highways with little or no direction from their human masters.
As Jim Overholt, senior research scientist of robotics at the Army Tank Automotive Research, Development and Engineering Center, put it, automation can “let a soldier be more situationally aware as he drives down an alley and let the mundane job of driving a vehicle be handled by the computer.”
Autonomy was a key theme at the 2nd annual Robotics Rodeo held here.
The Southwest Research Institute equipped a Ford Explorer with a computer system that can be switched among six modes: manual driving, tele-operation, pedestrian following, vehicle following, supervised autonomy and full autonomy, said Ryan Lamm, manager of intelligent vehicle systems at the San Antonio, Texas-based nonprofit.
Within the vehicle following mode, the system can be set to follow at longer distances, or it can drive to the right or left of the lead vehicle. In some cases, it’s not always advantageous to follow another vehicle in a straight line, Lamm said.
“The more reliable comms you have, the more you are able to extend the distance out,” he said. The truck currently can follow a lead vehicle and self-drive up to 70 meters behind. If a pedestrian or other obstacle gets in the way, it either swerves or stops, then automatically goes faster to catch up with the convoy.
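The catch-up behavior Lamm describes amounts to a simple gap controller: stop for obstacles, otherwise speed up or ease off to hold the desired following distance. This is a minimal sketch with assumed gains and limits, not the institute’s control law:

```python
def follow_speed(gap_m, lead_speed_mps, desired_gap_m=70.0,
                 obstacle_ahead=False, gain=0.25, max_speed_mps=20.0):
    """Choose the follower's commanded speed.
    Stops for obstacles (the swerve option is omitted for brevity);
    otherwise nudges speed to close or open the gap to the lead vehicle."""
    if obstacle_ahead:
        return 0.0
    error = gap_m - desired_gap_m            # positive: fallen behind
    speed = lead_speed_mps + gain * error    # go faster to catch up, or ease off
    return max(0.0, min(speed, max_speed_mps))

# Fallen 20 m behind the 70 m gap: command extra speed to rejoin the convoy.
print(follow_speed(90.0, 10.0))                        # 15.0
# Pedestrian in the lane: stop.
print(follow_speed(90.0, 10.0, obstacle_ahead=True))   # 0.0
```

A real vehicle would layer this over throttle and brake actuation and smooth the commands, but the stop-then-accelerate convoy behavior falls out of the same proportional logic.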
In the pedestrian-following, or dismounted, mode, the vehicle’s sensors are programmed to visually latch onto a person on foot and stay a few paces behind as he or she walks forward. Even if another pedestrian walks into the truck’s line of sight, it remains locked onto the original subject.
The tele-operated mode allows a person to drive the vehicle remotely. A driver behind the wheel can choose to revert to manual mode and take complete control of the vehicle, or do “supervised autonomy” — making small adjustments to the steering, speed or brakes if he doesn’t like the choices the computer is making.
On a smaller scale, iRobot Corp. has developed AwareHead, a pedestrian-following, gesture-controlled software and sensor suite that can be integrated into its family of robots. The idea is to lessen the operational burden placed on soldiers, lighten their load and decrease time on task.
It’s not enough to just have a robot that can follow a soldier and carry equipment while avoiding obstacles. It must also sense situations, said Christopher Geyer, senior lead research scientist at iRobot of Bedford, Mass. For example, if it sees its leader firing a weapon, it can react by moving more conservatively.
“We are trying to make interaction with the robot more natural,” Geyer added.
For robots that are autonomously traversing rough terrain without a human guide, St. Paul, Minn.-based Primordial Inc. has adapted software that was designed to help soldiers find the best path from point A to point B.
Originally a part of the Army’s Land Warrior soldier ensemble, the ground guidance system takes data on topographical features and vegetation and lets robots plot courses without relying on GPS, said Kyle Estes, software engineering manager at the company.
An autonomous vehicle can head for a waypoint and use obstacle detection sensors to avoid rocks, trees or other objects in its path, but it doesn’t know if there are major impediments such as rivers, lakes, or ravines.
“The idea is to give the robot a course overview path, and then use its on-board sensors to avoid rocks and cars and stuff we can’t catch in our map data,” Estes said.
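Coarse route planning of the kind Estes describes can be sketched as a graph search over a terrain-cost grid, where map data marks rivers and ravines as impassable before the robot ever reaches them. The grid, costs and A* search below are illustrative assumptions, not Primordial’s ground guidance algorithm:

```python
from heapq import heappush, heappop

def plan_route(grid, start, goal):
    """A* over a coarse terrain grid. grid[r][c] is a traversal cost
    (e.g. higher for dense vegetation); None marks impassable cells such
    as rivers or ravines that obstacle sensors would discover too late."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    done = set()
    while frontier:
        _, cost, cell, path = heappop(frontier)
        if cell == goal:
            return path
        if cell in done:
            continue
        done.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] is not None:
                g = cost + grid[nr][nc]
                heappush(frontier,
                         (g + h((nr, nc)), g, (nr, nc), path + [(nr, nc)]))
    return None  # no traversable route exists

# 1 = open ground, None = river: the planner must route around row 1.
terrain = [
    [1, 1,    1,    1],
    [1, None, None, 1],
    [1, 1,    1,    1],
]
route = plan_route(terrain, (0, 0), (2, 3))
```

The resulting waypoint list is the “course overview path”; the robot’s on-board sensors would then handle the rocks and cars too small to appear in the map data.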
For urban navigation scenarios, Sarnoff Corp. in Princeton, N.J., has developed the ViewTrek suite of sensors and software that can help robots navigate in scenarios where GPS signals are weak or nonexistent.
The system, which can be integrated into any robot, uses visual landmarks to keep track of where it is going and to backtrack if needed. It uses stereoscopic processing for obstacle avoidance.
“The big piece of this is that it is GPS denied,” said Sarnoff’s Chetna Bindra. “It can go indoors, outdoors, and in caves.”
The OSD roadmap identified intelligent visual navigation and communications that worked in urban scenarios as key technology requirements for future ground robotics programs.