Eyes of Army Drones Multiply, Open Wide
“You’re really beginning to see a lot of hard work over the last three years come to fruition in a way that’s going to fundamentally change the way these systems are used,” Tim Owings, the Army’s deputy project manager for unmanned systems, told National Defense.
The service will put more sensors on its larger aircraft and install cameras that can see over a much wider area. It also plans to make it easier for one soldier to control multiple aircraft and payloads. In time, onboard computers may even begin interpreting what they see to alert operators to a situation.
Many of these new devices will be displayed at a demonstration this fall. The manned-unmanned system integration concept, or MUSIC, will take place in September at Dugway Proving Ground, Utah. The Army says the exercise will be the largest display of interoperability ever conducted and a good indicator of what is to come in its ISR operations.
“It’s a large focus of ours currently across the Army — but specifically in unmanned aircraft systems — to be able to perform more missions with fewer aircraft,” Owings said. “That efficiency is becoming big in terms of how we’re procuring systems and how we’re equipping systems.”
The Army is trying to break the paradigm of having two operators for each platform — one who controls the payload and one who flies the aircraft. The service wants soldiers to be able to perform missions by doing both functions proficiently from a single console. “There is a focus on reduced force structure but not reduced capability,” Owings said. “That is probably the biggest trend you’re going to see over the next three years.”
Spectators at the MUSIC event are likely to see scenarios that include a soldier on the ground handing off control of an unmanned aircraft payload to an Apache helicopter, all while allowing a Kiowa Warrior chopper to receive the video feeds. Soldiers will be tested on their abilities to do more with fewer aircraft, and sensors will show the ability to cover more ground than before.
The Army has built a universal ground control station and a mini-variant for its drones. The latter looks and works like an Xbox controller. Both allow a user to operate the entire catalog of Army UAS. It currently takes an operator about 30 minutes to switch platforms. The universal ground station allows for “hot swapping” aircraft, or handing off and taking control of different drones immediately. For instance, an operator flying a Hunter could turn over control of the UAS to an Apache co-pilot. The operator could then be free to take over a Shadow or other remotely piloted aircraft.
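The hot-swapping sequence described above — hand off one aircraft, immediately take another — can be sketched in a few lines. The class and station names below are illustrative, not the Army's actual software:

```python
# A simplified sketch of "hot swapping": a universal ground station hands
# control of one aircraft to another controller and immediately takes over
# a different one. All names here are hypothetical, for illustration only.

class Aircraft:
    def __init__(self, name):
        self.name = name
        self.controller = None     # whichever station currently flies it


class ControlStation:
    def __init__(self, name):
        self.name = name

    def take_control(self, uav):
        uav.controller = self      # immediate, no 30-minute reconfiguration

    def hand_off(self, uav, other_station):
        assert uav.controller is self, "can only hand off aircraft you control"
        other_station.take_control(uav)


hunter = Aircraft("Hunter")
shadow = Aircraft("Shadow")
ugcs = ControlStation("universal ground control station")
apache = ControlStation("Apache co-pilot console")

ugcs.take_control(hunter)
ugcs.hand_off(hunter, apache)   # Apache co-pilot takes the Hunter...
ugcs.take_control(shadow)       # ...freeing the operator for a Shadow
```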
Soldiers on the ground will have bidirectional remote video terminals that allow them to receive video feeds and control sensors. The Army has seen an increased demand for small drones at the squad, platoon and company levels in Afghanistan, said Col. Greg Gonzalez, project manager for unmanned systems. These systems “are providing the soldiers significant eyes beyond what they can normally see and what they have not had in the past.” Smaller drones like the Raven and Puma also will be part of the MUSIC exercise.
Perhaps the biggest advances in Army drones are the sensor packages. The MUSIC demo will feature a Gray Eagle with three sensor balls — one under each wing to go with the one under the fuselage. This “triclops” arrangement allows different parties access to the payload simultaneously.
“The primary payload operator on the ground control station can control one sensor, Apache could control one and a soldier on the ground can control a third,” Owings said. The latter would use a bidirectional remote video terminal. The ground station operator also could point the three sensors to track three different subjects. This would come in handy when following insurgents who have scattered after placing a roadside bomb.
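The division of labor Owings describes — three sensors on one airframe, each owned by a different controller and pointed at a different subject — can be sketched as follows. The class and identifier names are assumptions made for illustration, not the actual control software:

```python
# Hypothetical sketch of the "triclops" idea: three independently tasked
# sensor balls on one Gray Eagle, each assignable to a different controller.

class SensorBall:
    def __init__(self, station):
        self.station = station      # mounting point, e.g. "left wing"
        self.controller = None      # who currently owns this sensor
        self.target = None          # what it is pointed at

    def assign(self, controller):
        self.controller = controller

    def track(self, target):
        self.target = target


class TriclopsAircraft:
    """One airframe carrying three independently controllable sensors."""

    def __init__(self):
        self.sensors = {
            "left wing": SensorBall("left wing"),
            "right wing": SensorBall("right wing"),
            "fuselage": SensorBall("fuselage"),
        }

    def handoff(self, station, controller):
        self.sensors[station].assign(controller)


uav = TriclopsAircraft()
# One aircraft, three consumers -- the scenario Owings describes.
uav.handoff("fuselage", "ground control station")
uav.handoff("left wing", "Apache co-pilot")
uav.handoff("right wing", "dismounted soldier (remote video terminal)")

# Each sensor tracks a separate subject, e.g. scattering insurgents.
uav.sensors["fuselage"].track("insurgent A")
uav.sensors["left wing"].track("insurgent B")
uav.sensors["right wing"].track("insurgent C")
```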
“It’s tracking more targets with fewer aircraft,” Owings said. “If it’s successful, we have variants in the works for Hunter and Shadow.”
The Army has conducted only ground testing of the triple-sensor system. Officials hope to test it in flight by the end of spring.
“The biggest movement for us has to do with more efficient collection of imagery and more efficient dissemination of imagery,” Owings said. “There is a lot less focus on exactly what the platform is and more focus on getting more product with fewer platforms. So you’re collecting with fewer platforms but disseminating the same information to a larger audience. That, in a nutshell, is really shaping the future of what we’re looking at.”
To that end, the service also is developing wide-area surveillance packages, including one called Argus that is being outfitted on a Boeing A160 Hummingbird and will be deployed to Afghanistan later this year. The unmanned helicopter will fill a need for a slow-moving platform that can hover and stare at targets. BAE Systems recently won a $50 million contract to develop an infrared Argus (autonomous real-time ground ubiquitous surveillance imaging system) package that will allow for nighttime video surveillance over battlefields and urban areas.
Another Army effort called “Wolfpack” also allows a drone to see over a wider area. It is a collaboration between the Army, Navy and Marine Corps. The Air Force has run into several kinks with its similar Gorgon Stare technology. Field testers in a report said it was not operationally functional and that the nine-camera pod does a fine job of tracking vehicles but is not sufficient for following people. Air Force officials responded by saying they had already fixed some of the problems and still intend to field it in Afghanistan as soon as possible.
“What distinguishes [Wolfpack] from Gorgon Stare is the ability to nudge the image around the wide area,” Owings said. “You end up with a checkerboard mosaic.”
A regular camera can provide “soda-straw” images in swaths of only a few hundred meters. Wolfpack can expand that to about 2.5 square miles. This larger picture would be lower fidelity but an operator could zoom into a high-quality image if he sees something of interest. This feed could be sent to a ground control station, an Apache helicopter or a dismounted soldier with a bidirectional video terminal. “You can run seven or eight of these streams at one time, so you end up being able to serve multiple users with a single aircraft,” Owings said.
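The checkerboard-mosaic idea — a low-fidelity wide view divided into tiles, with on-demand high-resolution chips served to several viewers at once — can be sketched like this. The tile size, stream limit, and class names are assumptions for illustration only:

```python
# Illustrative sketch of the "checkerboard mosaic" concept: the wide-area
# image is divided into tiles; viewers browse a downsampled overview and
# request a full-resolution chip for any tile of interest. Figures and
# names here are assumptions, not Wolfpack's actual parameters.

TILE_METERS = 400          # rough "soda-straw" swath of a single camera


def tile_for(x_m, y_m):
    """Map a ground coordinate (meters from the mosaic origin) to its tile."""
    return (int(x_m // TILE_METERS), int(y_m // TILE_METERS))


class MosaicFeed:
    """Serves a low-fidelity overview plus on-demand high-res tiles."""

    MAX_STREAMS = 8        # "seven or eight of these streams at one time"

    def __init__(self):
        self.streams = {}  # viewer -> tile currently zoomed, or None

    def subscribe(self, viewer):
        if len(self.streams) >= self.MAX_STREAMS:
            raise RuntimeError("stream capacity reached")
        self.streams[viewer] = None        # overview only, no zoom yet

    def zoom(self, viewer, x_m, y_m):
        """Nudge this viewer's stream to the high-res tile at (x, y)."""
        self.streams[viewer] = tile_for(x_m, y_m)


feed = MosaicFeed()
for viewer in ("ground station", "Apache", "dismounted soldier"):
    feed.subscribe(viewer)
feed.zoom("Apache", 1250, 3600)   # Apache pulls one high-res chip
```

Each subscriber gets an independent stream from the same aircraft — the "multiple users with a single aircraft" point in the quote above.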
Meanwhile, critics have warned against putting an emphasis on technology at the expense of human interaction and analysis. Army drone operators at a recent Washington, D.C., conference even admitted that troops on the ground sometimes didn’t know what to do with the information being provided from above. Maj. Gen. Michael T. Flynn, the Army’s top intelligence officer in Afghanistan, did not sugarcoat his take on current intelligence-gathering techniques when he issued a bracing report last year in collaboration with the Center for a New American Security.
The document, “Fixing Intel: A Blueprint for Making Intelligence Relevant in Afghanistan,” specifically addresses the use of remotely piloted aircraft:
“Aerial drones and other collection assets are tasked with scanning the countryside around the clock in the hope of spotting insurgents burying bombs or setting up ambushes. Again, these are fundamentally worthy objectives, but relying on them exclusively baits intelligence shops into reacting to enemy tactics at the expense of finding ways to strike at the very heart of the insurgency. These labor-intensive efforts, employed in isolation, fail to advance the war strategy and, as a result, expose more troops to danger over the long run.”
Still, military initiatives are relying more on machines than humans. The services have released reports emphasizing automation and training systems to think like operators. Beyond the showcase promised by MUSIC, the Army’s next big thing may follow along the same lines. Owings suggested that algorithms could be used to automate some of the ISR functions handled by soldiers today.
“We burn a lot of manpower watching an intersection or a house or something waiting for something to change,” he said. “What you’d rather be able to do is watch seven or eight houses at one time with an operator and have the systems tell you when something changed. I think that’s the next generation of stuff we’ll be looking at, but that’s a ways away.”
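The alerting logic Owings sketches — one operator, many fixed stares, a flag only when a scene changes — reduces to a simple comparison loop. The toy frames below stand in for real imagery; all names and the threshold value are assumptions for illustration:

```python
# A toy sketch of the change-detection idea: the system watches several
# fixed scenes and flags only the ones that changed since the last frame.
# Frames are simplified to small grids of brightness values; a real system
# would work on imagery, but the alerting logic is the same shape.

def changed(prev, curr, threshold=10):
    """Flag a scene when any pixel differs by more than the threshold."""
    return any(abs(a - b) > threshold
               for row_a, row_b in zip(prev, curr)
               for a, b in zip(row_a, row_b))


def watch(scenes_prev, scenes_curr, threshold=10):
    """Return the names of scenes that changed between two frames."""
    return [name for name in scenes_prev
            if changed(scenes_prev[name], scenes_curr[name], threshold)]


# Eight houses under watch; only one frame actually changes.
prev = {f"house {i}": [[100, 100], [100, 100]] for i in range(1, 9)}
curr = {name: [row[:] for row in frame] for name, frame in prev.items()}
curr["house 3"][0][0] = 180        # e.g. a vehicle arrives

alerts = watch(prev, curr)          # only "house 3" is flagged
```

The operator's attention goes only to the flagged scenes — the "watch seven or eight houses at one time" goal in the quote above.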
He added: “Who knows what the next ‘superidea’ is going to be?”