
Lowering Costs Through Information Sharing

12/1/2015
By Robert Smith

Technological advances and an ever-changing geopolitical landscape have been driving the evolution of military weaponry and strategy for centuries. At relatively few moments in history, however, has the confluence of these two drivers prompted a fundamental — even revolutionary — rethinking of how military operations are planned and accomplished.

The nation stands at such a juncture. At a time when strategic priorities are changing and asymmetric threats are becoming the new normal, “fifth-generation” weapons systems are coming online with game-changing sensing, processing and battle management capabilities.

The result is a unique opportunity — indeed, a pressing demand — for the U.S. armed forces to rethink military doctrine and create new concepts of operations for leveraging the full value of this giant leap in capabilities.

Recognizing the significance of the moment, leaders from all U.S. forces have begun to reevaluate — individually and collaboratively — the nation’s approach to warfare. The Navy and Air Force, for example, have taken a major step forward by reexamining the Air-Sea Battle doctrine in the context of fifth-generation platforms — foremost among them the F-35, F-22, P-8, unmanned platforms, and littoral combat ship — as part of a new national focus on the Pacific.

However, the driving paradigm is shifting away from “exquisite” capabilities toward “expendable” capabilities. We are starting to think more like our asymmetric adversaries, still planning for losses, but in a more modern context. As a result, the development of systems and capabilities to counter threats and defeat adversaries is not necessarily driven by the emergence of new platforms and completed capabilities; it is driven by the environment: the technology already available, the cost of developing new capabilities, budgetary pressures and new acquisition models.

By leveraging inexpensive sensors, commercial communications and low-cost unmanned aerial vehicles in concert with “exquisite” capabilities, we can provide immense operational flexibility while reducing the vulnerability of our forces.

The primary set of challenges lies not in military commanders’ ability to envision the potential of the new platforms to drive a more effective doctrine, but in the ability of the forces and their industry partners to turn that vision into practical, affordable and sustainable battlefield dominance for U.S. warfighters for decades to come.

Among the many issues that must be addressed are the architecture design to support interoperability; connectivity; bandwidth constraints; data processing and management; cyber security; and planning for cost-effective procurement and upgrading of network elements that maximizes their value and ensures their cohesion.

Nobody knows better than the services themselves that the old paradigms of stove-piped networks and static information must be revamped. The ultimate objective of all information gathering — to make better-informed decisions — is severely undermined when the information is not made available to the people and systems that need it most, when they need it most.

Under the current architecture, data gathered by Navy and Air Force sensor platforms are fed back through the respective service networks to a point removed from tactical operations teams. The information is fused, and the collective picture is pushed back down to the users who will benefit from it. The downside, of course, is time lag. The shelf life of battlefield information is short and its usefulness decreases rapidly with age. The issue becomes even more pronounced when coalition forces are part of the mix.

Technology has the potential to minimize this time lag by providing “actionable” information to the tactical user. Networking platforms such as jet fighters more effectively would allow them to serve as individual nodes that sense and distribute information — not only to operations centers and commanders, but also among themselves and to legacy platforms. The effect would be that some battle management decisions, based on 360-degree situational awareness, could be made on the battlefield, with all of the fifth generation’s accelerated speed and improved decision quality.
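As a rough, purely illustrative sketch of that node concept, the toy publish-and-subscribe model below shows how platforms might both contribute to and consume a shared track picture over a common link. The names here (TacticalBus, PlatformNode, TrackReport) are hypothetical and do not represent any fielded datalink or program interface.

    # Illustrative only: platforms as network nodes that both sense and share.
    # TrackReport, TacticalBus and PlatformNode are hypothetical names.
    from dataclasses import dataclass, field
    import time

    @dataclass
    class TrackReport:
        source: str        # reporting platform
        track_id: str      # track identifier
        position: tuple    # (lat, lon, alt)
        timestamp: float = field(default_factory=time.time)

    class TacticalBus:
        """Stand-in for a shared datalink that fans reports out to subscribers."""
        def __init__(self):
            self.subscribers = []

        def subscribe(self, node):
            self.subscribers.append(node)

        def publish(self, report):
            for node in self.subscribers:
                if node.name != report.source:   # do not echo back to the sender
                    node.receive(report)

    class PlatformNode:
        """A platform that senses locally and fuses what its peers share."""
        def __init__(self, name, bus):
            self.name = name
            self.bus = bus
            self.picture = {}                    # track_id -> latest report
            bus.subscribe(self)

        def sense(self, track_id, position):
            report = TrackReport(self.name, track_id, position)
            self.picture[track_id] = report      # act on it locally first...
            self.bus.publish(report)             # ...then share it with peers

        def receive(self, report):
            current = self.picture.get(report.track_id)
            if current is None or report.timestamp > current.timestamp:
                self.picture[report.track_id] = report

    bus = TacticalBus()
    fighter = PlatformNode("F-35", bus)
    patrol = PlatformNode("P-8", bus)
    fighter.sense("T-001", (24.5, 122.0, 30000))
    print(patrol.picture["T-001"].source)        # "F-35"

The point of the sketch is the flow: a track sensed by one node is acted on locally and becomes visible to its peers without a round trip to an operations center ashore.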

This is not to say, however, that the solution is to simply connect every deployed asset. Several practical considerations make fully interconnected platforms and unrestrained free flow of information unrealistic. While the quality of today’s sensor data is tremendous, so is its volume.

The quantity of data has exploded, making unfiltered flow impractical, given the constrained bandwidth capabilities of the shipboard environment as well as legacy air platforms. In addition, some platforms lack the processing capabilities that help avoid information overload for users by turning raw data into useful information products that improve situational awareness.

The solution — and the challenge — lies in systems interoperability and the ability to process information closer to the sensor and to the end user. While traditional stove-piped networks push tactical battle management decisions far from the battlefield, a network of interoperable systems would enable end users to pull usable first-pass information off the system, network or cloud — and make time-critical battlefield decisions — much sooner.
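As a minimal sketch of what first-pass processing near the sensor might look like, the hypothetical routine below condenses a burst of raw detections into a handful of compact summaries before anything crosses a constrained link. The record layout and confidence threshold are illustrative assumptions, not a real sensor interface.

    # Illustrative only: first-pass processing at the edge. Raw detections are
    # filtered and collapsed to one compact summary per object before transmission.
    def summarize_detections(detections, min_confidence=0.6):
        """Keep confident detections and retain only the best report per object."""
        summaries = {}
        for det in detections:
            if det["confidence"] < min_confidence:
                continue                          # discard low-confidence clutter at the edge
            obj = det["object_id"]
            best = summaries.get(obj)
            if best is None or det["confidence"] > best["confidence"]:
                summaries[obj] = {
                    "object_id": obj,
                    "position": det["position"],
                    "confidence": det["confidence"],
                }
        return list(summaries.values())

    raw_stream = [
        {"object_id": "A", "position": (1.0, 2.0), "confidence": 0.9},
        {"object_id": "A", "position": (1.1, 2.1), "confidence": 0.7},
        {"object_id": "B", "position": (5.0, 6.0), "confidence": 0.3},  # filtered out
    ]
    print(summarize_detections(raw_stream))       # one record instead of three

However crude, the pattern is the one the architecture needs at scale: decide near the sensor what is worth sending, and let the end user pull a product rather than a raw stream.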

At the same time, additional onshore analysis and post-processing capabilities could improve the quality and depth of information products available to commanders. Ultimately, what is required is a balanced solution that redesigns the architecture to optimize task, collect, post, exploit and disseminate (TCPED) activities in a way that provides the right information, to the right users, at the right time.

Thanks to new and evolving technologies — such as cloud computing, big data analytics, cross-domain security solutions and data fusion software — designing and building an effective architecture to support tighter integration of battlefield assets and new concepts of operations is readily achievable. The goal of such an architecture will be to ensure that information is discoverable and readily available to as many relevant users as possible: to collect information once and use it often.

The architecture will contain several layers of information sharing and processing. At level one, fused sensor data from fifth-generation aircraft and ships will enable pilots and commanders to make tactical decisions for their battle team without having to wait for orders based on data that has been passed up the command chain for processing and analysis. Important research is being done on the optimal trade-off between decision making “at the node” and centralized command and control.

At the same time, formatted and unformatted data will continue to flow to the tactical command level, where it will undergo analysis to support the battle manager’s comprehensive view of the battle space. Rather than having their attention occupied with the task of pushing information to deployed assets, battle managers can focus on higher-level decision making, which is where their experience and expertise add the most value.

The data doesn’t stop at the battle management level. It also will be transmitted to a data warehouse or the cloud, where sophisticated big data technologies can be used to tag and classify the information to support discovery, retrieval and more in-depth analysis at strategic command levels, and to make it available to any authorized user across the enterprise.
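To make the tagging idea concrete, the sketch below shows a toy catalog that indexes stored products by metadata tags and filters discovery by a requester’s clearance. The tag names, markings and ProductCatalog interface are hypothetical assumptions for illustration, not a description of any fielded data model.

    # Illustrative only: tag products at ingest so authorized users can
    # discover and retrieve them later. All names and markings are placeholders.
    from collections import defaultdict

    class ProductCatalog:
        def __init__(self):
            self.products = {}                 # product_id -> record
            self.tag_index = defaultdict(set)  # tag -> set of product_ids

        def ingest(self, product_id, payload, tags, classification):
            self.products[product_id] = {
                "payload": payload,
                "tags": set(tags),
                "classification": classification,
            }
            for tag in tags:
                self.tag_index[tag].add(product_id)

        def discover(self, tag, clearance):
            """Return products matching a tag that the requester is cleared to see."""
            allowed = {"UNCLASS": 0, "SECRET": 1, "TOP SECRET": 2}
            return [
                pid for pid in self.tag_index.get(tag, set())
                if allowed[self.products[pid]["classification"]] <= allowed[clearance]
            ]

    catalog = ProductCatalog()
    catalog.ingest("img-001", "EO image", ["maritime", "pacific"], "SECRET")
    catalog.ingest("sig-002", "RF collect", ["maritime"], "TOP SECRET")
    print(catalog.discover("maritime", clearance="SECRET"))   # ['img-001']

The enterprise version would of course ride on the cross-domain security and big data tooling described above; the point is simply that tagging at ingest is what makes it possible to collect information once and use it often.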

Although progress in these and other programs indicates that technology is not a major hurdle to delivering a next-generation integrated TCPED architecture, two additional, closely interrelated considerations must be addressed. One is the organizational structure that will procure, implement and oversee the new architecture, and the other is the approach to implementation.

Will the opportunity be seized with a top-down architectural approach that includes revised concepts of operations, or will it be addressed incrementally with tactical changes by different forces operating in their own environments? Or is there an approach that incorporates the best of both — one that delivers the benefits of a unified architectural design without a cost-prohibitive major acquisition program and allows incremental deployment without the onerous downstream costs of point solutions with limited interoperability?

Mission and organizational goals, combined with technology enablers, drive the need for new acquisition paradigms. The acquisition chain must support two very different types of procurement in order to fully leverage the capabilities the whole of industry has to offer.

First, it needs a solid sustaining integrator with the breadth and depth to provide top-level system concepts, architectures and a modular, open infrastructure implemented in high-performance hardware and software. This type of integrator needs to be more than just a cloud provider; it needs proven experience across the full mission envelope and the ability to deliver performance to operational users.

Second, the acquisition chain needs highly technical, agile teams to put together small, purpose-built acquisitions of mission capability services (algorithms and content) aimed at solving the needs of an individual user or a small number of organizational users. These need to be quick-turn acquisitions that fit the larger architecture standards but provide specific capabilities, such as sensor processing, fusion, prediction, identification and battlespace awareness, that can keep up with adversaries’ proliferation of new technologies and changing methods of engagement.

Together, these two acquisition types — a single knowledgeable integrator and multiple agile mission capability services — deliver responsiveness to the warfighter. An incremental architecture deployment program would allow new capabilities with sufficient fidelity in interoperability and procurement standards to plug into a joint intelligence center or other assets without having to add a new “box” to the network.

This approach has demonstrated that open system architectures can make it easier to add new technology and new applications on top of a common database and services infrastructure. It can effectively reduce the services’ total ownership costs for developing, evolving, sustaining and integrating systems and applications.
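A minimal sketch of what such a plug-in model might look like in software appears below: a new mission capability service registers against a common services layer rather than arriving as a new box. The CommonServices interface and the classifier are hypothetical placeholders, assumed only for illustration.

    # Illustrative only: a toy mission capability service plugging into a
    # common services layer. The interface shown is hypothetical.
    class CommonServices:
        def __init__(self):
            self.services = {}

        def register(self, name, func):
            """Plug a new capability service into the shared infrastructure."""
            self.services[name] = func

        def run(self, name, data):
            return self.services[name](data)

    core = CommonServices()

    # A quick-turn, purpose-built service dropped in later by an agile team.
    def maritime_track_classifier(track):
        return "surface combatant" if track.get("speed_kts", 0) > 25 else "merchant"

    core.register("classify_maritime_track", maritime_track_classifier)
    print(core.run("classify_maritime_track", {"speed_kts": 32}))   # "surface combatant"

The design choice being illustrated is that the integrator owns the stable interface while agile teams supply the interchangeable services behind it.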

As the services move down the path to maximizing their investment in fifth-generation platforms, the needs of the converged ops and intel user community should be the leading priority. Responsiveness to the information and service needs of the operational commands will help the services control costs while delivering solutions that are truly designed to serve warfighters and help them accomplish their missions.

The debate over the best way forward will be robust — as it should be. Without question, however, the nation’s defense forces, and those of its allies, have reached a juncture when fundamental changes in battlefield CONOPS are both feasible and desirable. They are necessary to fully leverage the nation’s investment in advanced platforms and to improve the effectiveness of its personnel in defeating the nation’s enemies.

The foundation for delivering on the vision of a more effective, cohesive and responsive fighting force will be an information-sharing architecture of truly integrated intelligence, surveillance and reconnaissance systems and platforms. It is within reach if there is a collaborative commitment to grasp it.

Robert Smith is vice president of C4ISR for Lockheed Martin’s Information Systems & Global Solutions business area where he leads a comprehensive portfolio of C4ISR programs.

Topics: C4ISR, Infotech
