DEFENSE WATCH: EMERGING TECHNOLOGIES
Cold Dose of Reality on DoD Technology
By Sandra I. Erwin

It fell to one of the nation’s top tech gurus to speak an uncomfortable truth: The Pentagon has fallen so far behind the technology curve that drastic measures will be needed to ensure the military retains its edge in future wars.
“A lack of software and lack of computer science is a problem,” was the conclusion that Eric Schmidt — a software engineer and executive chairman of Alphabet, the parent company of Google — drew after months of touring defense laboratories and military sites.
Schmidt, who also chairs the advisory panel known as the Defense Innovation Board, admitted he was taken aback by a glaring disconnect: The world’s most powerful military, “broadly speaking, is run like a 1980s corporation. Not a lot of software is being used,” he told an audience of technologists and academics in Washington, D.C.
Schmidt’s verdict is blunt and arguably simplistic, but it should alarm military commanders and strategists as it shines a light on the Pentagon’s continuing troubles in modernizing its information technology infrastructure. And it comes at a time when national security writ large increasingly is becoming dependent on agencies’ abilities to collect, analyze and share torrents of data instantaneously.
The military services have come to grasp the severity of the challenge and are drawing up “big data” strategies to tackle growing information needs. Their plans are ambitious, although it is not yet apparent how they intend to bring them to fruition.
Among today’s most vocal champions of big data as an essential tool of war is Air Force Chief of Staff Gen. David Goldfein. “The network is the key,” he has said in speeches at military conferences. A global superpower like the United States will need its military forces to be connected and wired into information architectures in ways that are not possible now, he said. “How do we network military capability so we can create a common operating picture for decision makers?”
To stay ahead of the enemy, U.S. commanders will need to “create multiple dilemmas” on the ground, at sea, in the air and in space, in order to stress adversaries and weaken their ability to respond, said Goldfein. He described this as “multi-domain command and control.”
Goldfein has called for help from the private sector to realize this “data fusion” vision. And he recognizes that this will require the military to change traditional methods for designing and building weapons and tactical networks. “Key for me is how do we think about this, how do we move beyond a discussion about trucks and cargo and about the highway they ride on?”
The issues raised by the Air Force chief are not new, and they echo similar concerns expressed by other military leaders over the years as they see how the weaponization of data could become decisive in conflicts.
“We create volumes of data,” said Goldfein. “The question is how do we take the data and turn it into decision speed, and create effects? We do multi-domain operations but we don’t achieve the speed I think we need.”
When U.S. forces deploy today, there are “Marine cells” and “special operations cells” that work side by side. However, Goldfein added, “We procure each cell’s technology through individual, proprietary software.” Future procurements should focus on the network, said Goldfein. “If we can get the highway right, what rides on it is a series of apps and apertures. How we get a common network is the focus we have to have in the future.”
The military’s difficulties harnessing technology have been probed by the Defense Innovation Board, a 20-member panel of tech industry and government officials established last year by the Obama administration. It has called on the Pentagon to embrace artificial intelligence software and machine-learning technologies more aggressively, not just for big data analysis, but also for routine activities such as surveillance.
“Data as a strategic asset needs to be managed differently than it is now,” Schmidt said. Like Goldfein, he lamented that information is locked away in stovepipes and not accessible to the users that need it. He is proposing that the Defense Department create a central repository of data.
Air Force intelligence officer Lt. Gen. Jack Shanahan told the Defense Business Board that on any given day the Air Force collects 22 terabytes of data. “You cannot exploit 22 terabytes of data the way we are doing things today. That is equivalent to five-and-a-half seasons of NFL video or two times the holdings of the entire printed version of the Library of Congress.”
Schmidt suggested many of the cultural barriers to innovation will be tough to take down. An illustration of the current environment is what he called a “two-display problem.” Operators work with dual systems because data is not integrated. That alone illustrates the inefficiency and lack of interoperability seen across the military.
Schmidt said the Defense Innovation Board, over weeks-long listening tours, came to appreciate the complexity of the military’s data problems. Sensors that collect data are plentiful, but everyone is at a loss when it comes to fusion. “We told them, ‘That’s what you want but you’re not on your way to building that,’” said Schmidt. “Our recommendation is to create the conditions. You have a lot of information coming in, stored and forgotten. Nobody knows what to do. We keep encountering this over and over again.”
The solution? The Pentagon needs to beef up its talent base, Schmidt said. “They need software people. I would say this applies to the U.S. government in general. There is a lack of computer scientists.”
The big takeaway is a familiar one: The military has to start doing business differently. “The people we met are impressive,” said Schmidt. “The systems are horrific. At some point there will be some kind of crisis that will bring this into form.”