 
FEATURE ARTICLE  

Investments In Unmanned Aircraft Focus On Ground Operators 


By Grace Jean 

Current and future purchases of unmanned aircraft increasingly are taking into account ground troops’ demand for timely intelligence in a user-friendly format. Consequently, the military services are turning more attention and funding to the devices used to program and operate the aircraft, officials assert.

The use of unmanned aircraft in combat is rising at an unprecedented pace. The Pentagon budgeted $1.7 billion for unmanned systems purchases in 2007, and expects to be spending nearly $3 billion a year by the middle of the next decade.

Typically, 90 percent of the dollars spent on unmanned systems are invested directly into the platform while only 10 percent goes to the command and control system, says Mark Bigham, business development director at the Raytheon Company.

“It’s basically an afterthought,” he says.

The lack of user-friendly standards for current ground control stations not only makes it harder for operators to fly UAVs, but also results in mishaps. One of the biggest contributing factors to crashes and failures in UAVs is human error, according to a Federal Aviation Administration study. “Human factors issues were present in 21 percent (Shadow) to 67 percent (Predator) of the accidents,” says the FAA.

The study implicates poorly designed user interfaces on ground control stations as a cause for human error. “The design of the user interfaces of these systems are, for the most part, not based on previously established aviation display concepts. Part of the cause for this is that the developers of these system interfaces are not primarily aircraft manufacturers.”

Ground control stations typically are not ergonomically designed, and when combined with the long hours required in combat duty, they can contribute to operators’ health problems, such as deep vein thrombosis, says Bigham.

In 2006 alone, operators at Nellis Air Force Base, Nev., flew nearly 51,000 combat hours in Predator unmanned vehicles.

“They are really stressed out at a high level of performance right now,” says Brig. Gen. Kevin Henabray, mobilization assistant for Air Force strategic planning.

To alleviate some of those problems, Raytheon has developed a “universal control station” that can operate up to eight dissimilar unmanned aerial platforms.

The system immerses operators in an airplane cockpit environment with videogame-like displays and controls. In current ground control systems, operators rely on video transmitted from on-board cameras, which is akin to looking through a straw. It makes it difficult for pilots and sensor operators to navigate and home in on targets, say former Predator controllers.

On the universal control station’s three wide-screen monitors, engineers have expanded the operators’ field of view from the typical 14 degrees to 120 degrees by creating a synthetic environment that places the video in the context of a larger geographic area.

“Think of it as Google Earth,” explains Bigham. It helps the pilot and sensor operator to understand what’s happening.
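To make the idea concrete, here is a minimal sketch, not Raytheon’s implementation, of how a narrow live-video window might be positioned inside a wider synthetic panorama. The 14-degree and 120-degree figures come from the article; the display width and function names are illustrative assumptions.

# Illustrative sketch only -- not Raytheon's code. Positions a 14-degree live
# video window inside a 120-degree synthetic panorama; columns outside the
# window would be filled with terrain imagery for geographic context.

CAMERA_FOV_DEG = 14.0    # on-board sensor field of view (per the article)
DISPLAY_FOV_DEG = 120.0  # widened field of view on the control station

def live_video_columns(display_width_px: int) -> range:
    """Return the horizontal pixel span occupied by the live video feed."""
    px_per_degree = display_width_px / DISPLAY_FOV_DEG
    half_span = (CAMERA_FOV_DEG / 2.0) * px_per_degree
    center = display_width_px / 2.0
    return range(int(center - half_span), int(center + half_span))

# Example: three 1920-pixel monitors side by side (an assumed configuration)
span = live_video_columns(3 * 1920)
print(f"Live video occupies columns {span.start}-{span.stop} of 5760")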

“I feel like I’m in the environment. I feel like I’m flying the aircraft,” says Michael Keaton, a former U.S. Air Force Predator squadron commander who now works for Raytheon.

For the sensor operator, the field of view is extended to 270 degrees.

“The navigation is so much easier,” says Kathleen Heilner, who logged 1,300 combat hours on the Predator as an imagery analyst.

Raytheon turned to the video game industry for interface technologies that give operators “situational awareness” at a glance. The visuals are a dramatic departure from the tabular data that Predator operators presently use. Current systems require operators to punch in a sequence of function keys on a keyboard to access information in a spreadsheet format.

“You’re spending an inordinate amount of time, compared to other aircraft that we currently have in our inventory, trying to keep track of what’s going on with the airplane,” says Keaton. “You spend a lot of time doing menu drills, trying to find information that’s buried in the system.”

The universal control station boils that information down into an intuitive gaming symbol, says Bigham. Operators can click on the aircraft they want to control and navigate quickly through Windows-based menus for diagnostics data.
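A rough sketch of what such a roll-up might look like follows; the telemetry fields and thresholds are assumptions for illustration, not the actual Predator data model or Raytheon’s logic.

# Hedged illustration: collapsing detailed telemetry into a single at-a-glance
# status symbol. Field names and thresholds are assumed, not Raytheon's.

from dataclasses import dataclass

@dataclass
class Telemetry:
    fuel_fraction: float   # 0.0-1.0 of fuel remaining
    link_quality: float    # 0.0-1.0 datalink health
    engine_warning: bool   # any engine caution flag set

def status_symbol(t: Telemetry) -> str:
    """Map raw telemetry onto a simple icon state the operator reads at a glance."""
    if t.engine_warning or t.fuel_fraction < 0.10 or t.link_quality < 0.3:
        return "RED"     # demands immediate attention
    if t.fuel_fraction < 0.25 or t.link_quality < 0.6:
        return "YELLOW"  # degraded; worth a drill-down into the menus
    return "GREEN"       # nominal

print(status_symbol(Telemetry(fuel_fraction=0.2, link_quality=0.9, engine_warning=False)))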

Men and women enter military service with 10,000 hours of “thumb-time,” playing PlayStation games that have intuitive, immersive interfaces, says Bigham. However, the ground control systems out in the field today “try to force these young men and women to try to learn how to use a ‘qwerty’ keyboard and joystick. Well, why do that? Why don’t we take advantage of some of the advanced interface technology that the gaming industry has spent billions on?”

The technology allows users to generate “target markers” on the synthetic background, much as icons populate maps on blue-force tracking systems. Sensor operators then can locate a target visually and navigate the appropriate targeting system to that location.

In the coming months, the universal control station prototype will be tested with Predator aircraft.

“We have a standing requirement as part of our programmed modernization to incrementally improve the cockpit and aircrew interfaces,” says Air Force Maj. Curt Hawes, Predator MQ-1 program element monitor. “Air Combat Command is currently updating a requirements document specifying desired capabilities.”

For troops on the battlefield looking at UAV video streams remotely, the problem is not the interface, but understanding the source of the information, and correlating the data to the correct position on the ground, says Army Lt. Col. Jennifer Jensen, product manager for common systems integration in the unmanned aircraft systems office at Redstone Arsenal, Ala.

Current remote video terminals require soldiers to “take out their GPS, unfold their maps to figure out where they’re at, look through binoculars to find that hilltop and hope that video is of that hilltop they’re looking at,” she tells an industry conference in Panama City, Fla.

The Army awarded a contract to AAI Corp. to build a kit called the “one-system remote video terminal” that will allow users in the field to “see the same thing that the ground control station is seeing,” says Jensen.

The system scans different frequency bands for UAVs in the area, identifies them and places them as icons on a digital map on a laptop. Users can then select which UAV feed they want to view or record, says Maj. Scott Hamann, a project officer.
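As a hedged sketch of that workflow, the fragment below scans a list of candidate frequencies, lists the feeds it detects and tunes to whichever one the user selects. The band values, the receiver object and its methods are assumptions, not the fielded system’s interface.

# Illustrative only: scan candidate bands, list detected UAV feeds, tune to one.
# Frequencies, the receiver object and its tune_and_listen() method are assumed.

CANDIDATE_BANDS_MHZ = [1710.0, 1755.0, 4400.0, 4940.0]  # placeholder values

def scan_for_feeds(receiver) -> dict:
    """Return a mapping of detected UAV identifiers to their frequencies."""
    feeds = {}
    for freq in CANDIDATE_BANDS_MHZ:
        beacon = receiver.tune_and_listen(freq)  # hypothetical receiver call
        if beacon is not None:
            feeds[beacon.uav_id] = freq          # e.g. {"Shadow-12": 4400.0}
    return feeds

def select_feed(receiver, feeds: dict, uav_id: str) -> None:
    """Tune the terminal to the feed the operator picked from the map."""
    receiver.tune_and_listen(feeds[uav_id])

class _StubBeacon:
    def __init__(self, uav_id): self.uav_id = uav_id

class _StubReceiver:
    """Stand-in for real hardware so the sketch can be exercised."""
    def tune_and_listen(self, freq):
        return _StubBeacon("Shadow-12") if freq == 4400.0 else None

print(scan_for_feeds(_StubReceiver()))  # -> {'Shadow-12': 4400.0}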

The technology is intended for use with the Air Force’s remotely operated video enhanced receiver (ROVER) III, which enables front line fighters to receive video from 11 UAVs and other sensor platforms.

The one-system remote video terminal decodes UAV data streams and places the telemetry information directly onto digital maps. This is particularly useful for troops on the move, adds Hamann. “If I have a convoy doing route reconnaissance, and a soldier has a Shadow flying above with forward looking eyes for him, he can have inside his humvee a one-system remote video terminal tuned to that frequency, and he can see exactly where that Shadow is looking.”
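The geometry behind “seeing exactly where the Shadow is looking” can be sketched roughly as below; the telemetry fields and the flat-earth approximation are simplifying assumptions for illustration, not the terminal’s actual algorithm.

# Rough sketch only: estimate the ground point a UAV sensor is looking at from
# telemetry, so both the aircraft and its footprint can be drawn on a map.
# Uses a flat-earth approximation that is reasonable only over short ranges.

import math

def sensor_ground_point(lat, lon, alt_m, heading_deg, depression_deg):
    """Approximate latitude/longitude of the point under the sensor's crosshairs."""
    ground_range_m = alt_m / math.tan(math.radians(depression_deg))
    dlat = ground_range_m * math.cos(math.radians(heading_deg)) / 111_111.0
    dlon = ground_range_m * math.sin(math.radians(heading_deg)) / (
        111_111.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Example: an aircraft at 2,500 m looking 30 degrees below the horizon, heading east
print(sensor_ground_point(34.68, -86.65, 2500.0, 90.0, 30.0))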

The portable system has a range of 25 kilometers on the move. With an extended range antenna, it can reach 50 kilometers, says Hamann.

The system can plug directly into vehicles, such as humvees and Strykers, for power, or it can function on standard military batteries. The Army began fielding the systems last month.

“We believe by fiscal year 2008, we will probably become a program of record and it’s going to be competed and then fielded out to all the users they determine require this system,” says Hamann.

The Army worked with L-3 Communications to add a new circuit board, called a modulator, to the ROVER III receiver to enable signals from the UAVs to pass through and be decoded on the laptop, says Hamann. That receiver is now the baseline configuration for the ROVER, so all new ROVERs can be converted into one-system remote video terminals, adds Jensen.

In June, the technology will be fielded for the Shadow UAV system. Future initiatives include the integration of this technology into Army vehicles’ blue-force tracking terminals, says Hamann. The Army also is considering developing a bi-directional remote video terminal so that commanders might one day have the ability to control UAV sensors from the battlefield.

Beyond the immediate need for the system in combat, Jensen says the program office is looking for potential uses in the domestic arena, such as homeland security and border patrol.

“This capability has enormous potential,” she says. If, for example, the country suffered another hurricane disaster, rescuers might use UAVs to help in search efforts.

Please email your comments to GJean@ndia.org
