Operators of ground robots have typically relied on laptop computers or game controllers to navigate their unmanned vehicles and direct sensor movements. But several companies have developed technologies that untether troops from immobile controllers and give them the ability to hold their weapons and multitask while commanding their robots.
Think-A-Move Ltd., based in Beachwood, Ohio, has created a human-machine interface system that allows operators to control a robot through vocal commands.
When a person speaks or moves the tongue, sound waves travel back through the ear canal.
“Our technology picks up those signals that come through the ear canal through an ear piece, and then we process those signals to eliminate ambient noise and use them for voice control or communications,” says Jonathan Brown, vice president of sales and marketing.
An earpiece, similar to an iPod earbud, connects to a Sony Vaio computer the size of a paperback book. Speaking commands such as “forward,” “left,” and “right,” operators can guide a robot’s movements while keeping their hands on a weapon and their heads up.
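The article describes a small fixed vocabulary of spoken commands mapped to robot motion. A minimal sketch of such a dispatcher follows; the command words come from the article, but the velocity values, the `dispatch` function and the drive interface are illustrative assumptions, not Think-A-Move's actual software.

```python
# Hypothetical voice-command dispatcher for a UGV.
# The words ("forward", "left", "right") come from the article;
# the velocity values and callback interface are assumptions.

COMMANDS = {
    "forward": (0.5, 0.0),   # (linear m/s, angular rad/s)
    "left":    (0.0, 0.5),
    "right":   (0.0, -0.5),
    "stop":    (0.0, 0.0),
}

def dispatch(recognized_word, send_drive_cmd):
    """Map a recognized spoken word to a drive command; ignore unknown words."""
    if recognized_word in COMMANDS:
        linear, angular = COMMANDS[recognized_word]
        send_drive_cmd(linear, angular)
        return True
    return False

log = []
dispatch("forward", lambda lin, ang: log.append((lin, ang)))
dispatch("mumble", lambda lin, ang: log.append((lin, ang)))
print(log)  # [(0.5, 0.0)] -- only the recognized word produces a drive message
```

Ignoring out-of-vocabulary words, rather than guessing, is what lets such a system tolerate ambient chatter.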
“Not only do they have their hands free to do something else, but they’re not looking at a screen as they’re trying to control the robot. It’s better from a situational awareness standpoint, and also from a multi-tasking standpoint,” says Jim Harris, president of the company.
Commands can be spoken at varying volumes, a capability that can be critical depending on the mission, they say. If troops are operating in a situation where radio silence is required, the technology allows them to give subvocal commands using their tongues.
“Just as they might communicate with other members of their squad using hand signals silently, this enables them to communicate with the UGV silently,” says Brown.
The device also works accurately in noisy environments, up to 80 or 90 decibels. It can distinguish between the operator’s commands and those given by someone else in close proximity. Developers also have produced an audio feedback capability by adding a speaker. If a robot is equipped with a microphone, an operator can listen to what the robot hears through the same earpiece.
Additionally, the mobile PC allows users to view imagery from the robot’s cameras along with telemetry information, says Harris.
The technology has been integrated onto an iRobot Packbot. It has been demonstrated to scientists at the Army’s Tank Automotive Research, Development and Engineering Center, which has approved the company to move forward with a field-deployable version. Harris says a prototype will be ready for deployment next year.
As unmanned ground vehicle technologies improve, the military expects to incorporate more robots into its forces for use by smaller units, such as platoons and squads. Though these robots are seen as force multipliers, their bulky and complex controller interfaces generally demand the full attention of operators — a situation that can be deadly on patrol or during covert operations.
“Marines and soldiers are going to need simple, intuitive ways to control these assets,” says Jack Vice, president and chief technology officer for AnthroTronix Inc. in Silver Spring, Md.
The company has developed several human-machine interface technologies that are designed for use by dismounted troops in battlefield conditions.
The visually integrated sensors unit, which gives troops control of multiple unmanned vehicles, comes in a camcorder-style housing that is held up to the eye like binoculars. By pressing buttons on top of the unit, operators can navigate through menus to use various functions, including target designation, mapping and robot operation.
For example, if a team is on patrol outside a base and stumbles upon a suspicious parked car, an operator can pick up the VIS unit and use its laser rangefinder to designate that car as a potential threat. The device’s software can calculate the best robot to send out to investigate the target.
An operator can agree with the selection or override it with another. Once the target has been identified and tasked to a robot, he can continue moving with his squad, unlike in current operations where the team must wait for the explosive ordnance disposal unit to arrive on scene.
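The article says the VIS software calculates the best robot to send against a designated target, with the operator free to override. One plausible way to score candidates is by availability, capability and distance; the scheme below is an illustrative assumption, not AnthroTronix's actual algorithm.

```python
import math

# Hypothetical robot-selection sketch: pick the closest available robot
# that carries the capability the task requires. Field names and the
# scoring rule are assumptions for illustration.

def select_robot(robots, target_xy, required_capability):
    """Return the closest available robot with the required capability, or None."""
    candidates = [
        r for r in robots
        if r["available"] and required_capability in r["capabilities"]
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda r: math.dist(r["position"], target_xy))

fleet = [
    {"name": "UGV-1", "position": (0, 0), "capabilities": {"camera"},        "available": True},
    {"name": "UGV-2", "position": (5, 5), "capabilities": {"camera", "eod"}, "available": True},
    {"name": "UGV-3", "position": (1, 1), "capabilities": {"camera", "eod"}, "available": False},
]
print(select_robot(fleet, (6, 6), "eod")["name"])  # UGV-2
```

Returning `None` when no robot qualifies leaves room for the operator override the article describes.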
Once the unmanned asset has made its way to the target, the VIS unit vibrates to alert the operator, who picks up the device. Instead of seeing the landscape immediately in front of him, the operator sees the image coming from sensors aboard the robot. As the operator moves the unit from left to right and up and down, he pans the robot’s camera.
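The head-tracked viewing described above amounts to slaving the robot's camera gimbal to the orientation of the handheld unit. A minimal sketch, assuming a direct 1:1 mapping and invented gimbal limits:

```python
# Hypothetical head-tracked camera slaving: the VIS unit's yaw and pitch
# drive the robot camera's pan and tilt. The 1:1 mapping and the gimbal
# limits are illustrative assumptions.

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def head_to_camera(head_yaw_deg, head_pitch_deg,
                   pan_limits=(-170, 170), tilt_limits=(-30, 60)):
    """Map device orientation to camera pan/tilt, clamped to gimbal limits."""
    pan = clamp(head_yaw_deg, *pan_limits)
    tilt = clamp(head_pitch_deg, *tilt_limits)
    return pan, tilt

print(head_to_camera(45, -10))   # (45, -10): within limits, passed through
print(head_to_camera(200, 80))   # (170, 60): clamped at the gimbal stops
```

Clamping at the gimbal stops keeps the operator's view predictable even when head motion exceeds what the camera can follow.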
“You almost telepresence yourself into the vehicle,” says Vice. “That gives you the ability to get a feel for what’s around that vehicle.”
An operator can also remotely control the robot by using a thumb joystick on the back of the VIS unit to move the robot to a better location.
With the appropriate optics, the VIS unit can function as a look-through device, with night and thermal imaging technologies.
It uses low-power micro-displays, which light up only when the unit is pressed up against the face, says Vice. “At night, if you’re patrolling and approaching an objective, you can be shot by a sniper if you’re shining a light,” says the retired Marine. “That’s really important on a reconnaissance mission, because if you’re seen, you’re dead.”
The key to the VIS unit is its head tracking capability, says Vice.
When using its look-through capabilities, blinking arrows inside it prompt operators to turn their heads in the direction of assets or threats that might not be immediately visible. Range, angle, heading and other information are displayed on the screen.
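The arrow cueing described above can be computed from the relative bearing between the operator's head heading and an asset's position. The sketch below is an assumption about how such logic might work; the coordinate convention, field-of-view threshold and output fields are invented for illustration.

```python
import math

# Hypothetical directional-cueing sketch: given the operator's position,
# head heading and an asset's position, decide which arrow to blink and
# report range and bearing. The 40-degree field of view is an assumption.

def cue(operator_xy, head_heading_deg, asset_xy, fov_deg=40):
    dx = asset_xy[0] - operator_xy[0]
    dy = asset_xy[1] - operator_xy[1]
    rng = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360       # 0 = north, clockwise
    rel = (bearing - head_heading_deg + 180) % 360 - 180   # -180..180
    if abs(rel) <= fov_deg / 2:
        arrow = "in view"
    elif rel > 0:
        arrow = "turn right"
    else:
        arrow = "turn left"
    return {"range_m": round(rng, 1), "bearing_deg": round(bearing, 1), "cue": arrow}

# Operator facing north; asset to the northeast at bearing 045:
print(cue((0, 0), 0, (100, 100)))  # cues "turn right" with range and bearing
```

Normalizing the relative bearing into the -180..180 range is what makes the left/right decision a simple sign check.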
“It builds in your brain that situational awareness that you don’t get from pulling out a map, reading from grid coordinates and then getting your bearing with the map and compass,” says Vice.
The technology is being funded through the Defense Department’s joint enterprise ground robotics program and the Air Force Research Laboratory. The company also has partnered with Lockheed Martin Corp. through the Defense Department’s mentor protégé program, says Corinna Lathan, AnthroTronix’s chief executive officer.
With additional funding, the company could produce a prototype in one year and a ruggedized version in two years, says Vice.
AnthroTronix also has developed other natural user interface technologies to complement the VIS unit. Most of them have been integrated into existing gear carried by troops to minimize their loads, says Bryan Hays, project engineer.
The iGlove, for example, recognizes hand signals and translates gestures into movement control for robots.
“If I give a hand signal to my point man, why can’t I give a hand signal to my unmanned asset? It’s a natural, intuitive interface for dismounted operations,” says Vice.
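A glove like the iGlove would classify finger-flex readings into discrete gestures before issuing a robot command. The encoding below is a sketch under assumptions: the per-finger flex representation, the threshold and the gesture table are all invented for illustration, though the halt/advance/rally signals mirror standard infantry hand signals.

```python
# Hypothetical gesture classifier for a sensing glove. Flex readings run
# from 0 (finger open) to 1 (finger fully bent); the threshold and the
# gesture table are illustrative assumptions.

GESTURES = {
    ("closed", "closed", "closed", "closed"): "halt",     # fist
    ("open",   "open",   "open",   "open"):   "advance",  # open palm
    ("open",   "closed", "closed", "closed"): "rally",    # index finger raised
}

def classify(flex_values, threshold=0.5):
    """Convert four per-finger flex readings into a named gesture."""
    state = tuple("closed" if f > threshold else "open" for f in flex_values)
    return GESTURES.get(state, "unknown")

print(classify([0.9, 0.8, 0.95, 0.85]))  # fist -> "halt"
print(classify([0.1, 0.1, 0.1, 0.1]))    # open palm -> "advance"
```

Returning "unknown" for unrecognized poses is the conservative choice: a robot should hold rather than act on an ambiguous signal.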
The mounted force controller is a vertical grip designed for use on military assault rifles and vehicle dashboards. Troops can hold their weapons at the ready and control a robot at the same time. The grip responds to slight pressure from users; pushing or pulling moves the robot forward and backward while twisting turns it right or left. Software permits the controller to dynamically adjust for overtorquing when operators are pumped full of adrenaline, says Vice.
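The push/pull and twist mapping, and the software adjustment for overtorquing, suggest a standard input-shaping pipeline: a deadband to ignore incidental pressure and saturation to cap an adrenaline-driven shove at full rate. The numbers and function names below are illustrative assumptions, not AnthroTronix's actual tuning.

```python
# Hypothetical force-grip input shaping. Push/pull drives forward/back,
# twist drives turning. The deadband and saturation values model the
# "dynamic adjustment for overtorquing" the article mentions; all
# constants are assumptions.

def shape(raw, deadband=0.1, saturation=0.8):
    """Ignore small inputs, cap large ones, and rescale the rest to -1..1."""
    if abs(raw) < deadband:
        return 0.0
    capped = max(-saturation, min(saturation, raw))
    sign = 1.0 if capped > 0 else -1.0
    return sign * (abs(capped) - deadband) / (saturation - deadband)

def grip_to_drive(push_pull, twist):
    """Map shaped grip forces to a normalized drive command."""
    return {"linear": shape(push_pull), "angular": shape(twist)}

print(grip_to_drive(0.05, 1.5))  # tiny push ignored; hard twist capped at full rate
```

Rescaling between the deadband and the saturation point keeps the response smooth: command authority ramps up gradually instead of jumping the moment the threshold is crossed.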
Because it is not gimbaled like most other joysticks, the stabilized controller acts as a front vertical grip and allows operators to fire a weapon without impediment.
Please email your comments to GJean@ndia.org