U.S. troops have access to a mind-bending wealth of information during combat, from video taken by drones overhead to GPS positioning and satellite communications.
But soldiers on the ground often feel overwhelmed with too much information, and the need to look at a smartphone or wearable computer can be a distraction during a life-or-death firefight.
“When you are looking down at a tablet or smartphone … you are not aware of what is going on around you at that time and then [my] brain has to cognitively figure out what it is that I saw on my 2-D screen. Now I have to compare that to what’s going on in the real world. Once you look up, you still have to make that comparison,” Dave Roberts, senior scientist and leader of military operations and sensing systems at Applied Research Associates, a Raleigh, North Carolina-based engineering firm, tells National Defense.
The answer, according to engineers at ARA and the Defense Advanced Research Projects Agency, is augmented reality — a system that overlays relevant information on a soldier’s field of vision in real time.
“By having all the information heads-up and geo-registered … I can be aware of what is going on and make quick decisions when there is a lot of uncertainty in the environment,” Roberts says.
Much attention has been paid to Google Glass, which provides information through a tiny screen that rests at the upper right corner of the wearer’s peripheral vision. It does not overlay information on the wearer’s view of the world and requires a shift in gaze, although slight, to read.
Glass basically miniaturizes a smartphone and moves it closer to a wearer’s eye. It is hands free, but a soldier still has to divert his attention from the enemy to access information, says Jenn Carter, a senior scientist with ARA.
ARA participated in a DARPA program to create “true augmented reality” that displays relevant tactical information within a soldier’s field of vision, rather than on a screen.
“What we mean is information that is displayed that is correctly positioned, geo-referenced and in the user’s immediate field of view,” Carter says. “A lot of the other systems that we have seen claim to produce augmented reality, like Google Glass, are really more of an information display. The idea is that a soldier might be able to take cover and still be able to see exactly what is around him.”
DARPA’s Urban Leader Tactical Response, Awareness and Visualization program, or ULTRA-Vis, resulted in ARC4, software that puts the wearer of a heads-up display into a real-world version of Google Maps Street View.
The software pulls in information from high-level battlefield management systems at a tactical operations center, says Eric Wenger, another senior scientist with ARA. The information is then relayed to soldiers in the field. Both the center and the users have filtering options to prevent information overload.
It also generates a position report for each user and shows that user’s location on fellow squad members’ displays.
Without looking at a separate screen, a soldier can see icons marking preprogrammed waypoints, landmarks, other soldiers, vehicles and enemy positions. When the soldier turns his or her head, the system automatically displays all the relevant information for whatever environment is being viewed through a “light-weight, low-power holographic see-through display” with a “position and orientation tracking system,” controlled by the wearer’s vision, according to DARPA.
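At its core, that geo-registration step is a coordinate transform: convert each icon’s world position into a bearing from the wearer, subtract the wearer’s current heading, and draw the icon only if it falls inside the display’s field of view. A minimal sketch in Python — the function names, field-of-view figure and screen width here are illustrative, not taken from ARC4:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def icon_screen_x(wearer_heading_deg, target_bearing_deg, fov_deg=40.0, screen_px=1280):
    """Map a target's bearing to a horizontal pixel position on the display,
    or None if the target is outside the field of view."""
    # wrap the relative bearing into [-180, 180)
    rel = (target_bearing_deg - wearer_heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > fov_deg / 2:
        return None  # off-screen; a real system might draw an edge arrow instead
    return int(round((rel / fov_deg + 0.5) * screen_px))
```

For example, a teammate due east of the wearer projects to the center of the display when the wearer faces east, and drops off the display entirely when the wearer faces north — which is why the system must re-run this transform every time the head turns.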
“Having the ability to understand where your teammates are at all times, even if they are not in your line of sight, if they are obstructed by buildings … without having to look down at a computer screen to view a map,” would be a revolutionary leap in battlefield situational awareness, Roberts says.
ARC4 was initially installed on BAE Systems’ Q-Warrior wearable computing system, which also came out of the ULTRA-Vis program. The software itself is display agnostic, however: it is essentially a processing engine that tracks where the user is and where he or she is looking with a high degree of accuracy. ARA scientists are farming it out to other augmented reality display manufacturers, Carter says.
The company is working with night vision goggle manufacturers to miniaturize the computational processor onto parts of a helmet, Roberts says.
Vuzix is testing the software on its M2000AR mobile display that uses a PC-compatible transparent optical system that overlays data on the wearer’s field of vision in full color. The helmet-mounted, see-through screen retails for $6,000.
The low-cost augmented reality system, or LARS, made by SA Photonics, places transparent monocles over both eyes and projects information from outside sources, including icons and video imagery, onto them.
“As those displays keep getting better, you could see this incorporated into a pair of lightweight sunglasses,” Carter says. “This does not need to go in a traditional helmet. … We can integrate with any display that is out there.”
ARA is working with Six15 Technologies to incorporate ARC4 with its Wrap 1200DXAR augmented reality glasses. The system, which comprises a pair of high-definition cameras on the brow of a pair of wrap-around sunglasses with integrated displays, retails for $1,500.
Displaying relevant information accurately, in real time, as a soldier runs, jumps, dives and swivels his head has been difficult, Roberts says. It requires powerful data processing and extremely accurate measurements of a soldier’s position and orientation.
“The user is always moving around, their head is swinging rapidly, so we had to figure ways to keep the icons stable within the environment even during dynamic motion,” he says.
Most systems rely on GPS to identify a position and then use accelerometers, magnetometers and gyros to calculate new positions in relation to it. The ARC4 system also includes computer vision algorithms that are able to produce hyper-accurate orientation estimates, Roberts says.
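In an inertial stack of this kind, the gyro gives smooth short-term orientation updates but drifts over time, while the magnetometer is noisy but drift-free; a complementary filter is one standard way to fuse the two. A minimal single-axis sketch — this is generic sensor-fusion practice, not ARC4’s published algorithm:

```python
def fuse_heading(prev_heading, gyro_rate_dps, dt, mag_heading, alpha=0.98):
    """One complementary-filter step: integrate the gyro for short-term
    accuracy, then nudge the result toward the magnetometer to cancel drift.
    Headings in degrees, gyro rate in degrees per second."""
    predicted = (prev_heading + gyro_rate_dps * dt) % 360.0
    # shortest signed angle from the prediction to the magnetometer reading
    err = (mag_heading - predicted + 180.0) % 360.0 - 180.0
    return (predicted + (1.0 - alpha) * err) % 360.0
```

Run repeatedly, the estimate follows rapid head motion from the gyro while slowly converging on the magnetometer’s absolute reference, so slow gyro drift never accumulates.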
The system can compare digital terrain elevation data to camera imagery of landmarks like mountains in the user’s field of view to pinpoint position and direction, Roberts says. Combining this with other methods of geolocation has made the ARC4 system much more accurate than commercially available navigation software, he says.
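One common way to implement that kind of terrain comparison is skyline matching: predict the horizon’s elevation profile from the terrain database at the GPS-estimated position, extract the observed skyline from the camera, and search for the heading offset that best aligns the two. A toy version with one sample per degree of azimuth — illustrative only, since ARA has not published its algorithm:

```python
import math

def best_heading_offset(predicted, observed):
    """Return the circular shift (in degrees, one sample per degree of
    azimuth) that minimizes the squared error between an observed skyline
    profile and one predicted from digital terrain elevation data."""
    n = len(predicted)
    def sse(shift):
        return sum((observed[i] - predicted[(i + shift) % n]) ** 2 for i in range(n))
    return min(range(n), key=sse)
```

If the observed skyline is the predicted one rotated by some angle, the search recovers that angle, which is exactly the error in the wearer’s assumed heading.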
ARA is developing another method, called celestial sensing, to calculate a wearer’s position relative to the sun.
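The geometry behind such a sun sensor is straightforward: a standard ephemeris gives the sun’s true azimuth for the current time and approximate location, a camera measures the sun’s azimuth relative to the wearer’s boresight, and the difference is the wearer’s absolute heading. A sketch of that final step — the ephemeris lookup itself is assumed, not shown, and this is an illustration rather than ARA’s actual method:

```python
def heading_from_sun(sun_azimuth_true, sun_azimuth_relative):
    """Recover absolute heading from a sun sighting.

    sun_azimuth_true: the sun's azimuth in degrees from true north, taken
        from an ephemeris for the current time and position (assumed given).
    sun_azimuth_relative: the sun's measured azimuth in degrees relative to
        the wearer's boresight (positive to the right).
    """
    return (sun_azimuth_true - sun_azimuth_relative) % 360.0
```

For example, if the ephemeris puts the sun at azimuth 180 degrees and the camera sees it 30 degrees right of center, the wearer must be facing 150 degrees. Unlike a magnetometer, this reference is immune to local magnetic interference from vehicles and equipment.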
“It’s the combination of all these different methods, when they are brought together in this sensor-fusion framework, that delivers a really robust solution over a wide operational space,” Roberts says.