ADVANCED WEAPONS

Pentagon Seeks Smarter Machines for Future Combat (UPDATED)

3/1/2016
By Jon Harper


Defense Department officials view autonomous systems and artificial intelligence as the key ingredients of their new warfighting strategy. The Pentagon is now laying the groundwork for the rise of smarter military machines.

Deputy Secretary of Defense Robert Work said several recent studies conducted by the Pentagon and the Defense Science Board focused on answering the following questions: “Where are we missing capabilities and where would we like new capabilities?”

“There was remarkable consistency,” he said at a recent conference hosted by the Center for a New American Security. “The theme that came out over and over and over again is what we call human-machine collaboration and combat teaming. … What is it that really is going to make human-machine collaboration and combat teaming a reality? That is going to be advances in artificial intelligence and autonomy that we see around us every day.”

Air Force Gen. Paul Selva, vice chairman of the Joint Chiefs of Staff, noted that the commercial world has already developed sophisticated autonomous systems.

“You and I are on the cusp of being able to own cars that will drive themselves,” he told a largely civilian audience at the Brookings Institution, referring to smart vehicles such as ones developed by Google. “The technology exists today. It has been proven.” 

The Pentagon already possesses unmanned ground, undersea and aerial systems, which are remotely piloted by human beings, he noted. Now “we can actually build autonomous vehicles [that don’t require direct human guidance] in every one of those categories.”

The technology is advancing rapidly, said Paul Scharre, director of the 20YY Future of Warfare Initiative at CNAS.

“The rapid growth of computing power is resulting in increasingly intelligent machines. When embodied in physical machines, this trend is allowing the growth of increasingly capable and autonomous munitions and robotic systems,” he said in testimony to the Senate Armed Services Committee in November.

The scientific community has reached an “inflection point” in the power of artificial intelligence and autonomy, which could lead to “entirely new levels of what we refer to as man-machine symbiosis on the battlefield,” Work said.

Greg Zacharias, the Air Force’s chief scientist, said military researchers are focused on three areas: more advanced sensors and data-gathering technology so that systems better understand their operating environment; reasoning systems and software that assess situations and make recommendations or decisions; and better ways of carrying out those recommendations and decisions, whether through direct action such as guiding another unmanned platform or through advice to a human or machine teammate.

“The overall goal here is to enable systems to react appropriately to their environment and perform situationally appropriate tasks, synchronized and integrated with other autonomous human or machine systems,” he said at a House Armed Services Emerging Threats and Capabilities Subcommittee hearing in November.
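As a rough illustration of that sense-reason-act division of labor, the sketch below wires the three stages together in Python. Every class name, sensor field and threshold here is hypothetical, chosen only to make the flow concrete, and is not drawn from any Air Force or Defense Department program.

```python
# Hypothetical sketch of a sense -> reason -> act loop. All names, fields
# and thresholds are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class SensorReading:
    source: str        # e.g. "radar", "eo_camera"
    track_id: int
    confidence: float  # 0.0 - 1.0

@dataclass
class Recommendation:
    action: str
    rationale: str

def sense(raw_feeds: List[dict]) -> List[SensorReading]:
    """Fuse raw feeds into structured readings (placeholder logic)."""
    return [SensorReading(f["source"], f["track_id"], f["confidence"])
            for f in raw_feeds]

def reason(readings: List[SensorReading]) -> List[Recommendation]:
    """Assess the situation and produce recommendations, not commands."""
    recs = []
    for r in readings:
        if r.confidence < 0.6:
            recs.append(Recommendation(
                action="request human review",
                rationale=f"low-confidence track {r.track_id} from {r.source}"))
    return recs

def act(recs: List[Recommendation]) -> None:
    """Carry out the recommendation by notifying a human or machine teammate."""
    for rec in recs:
        print(f"[advise operator] {rec.action}: {rec.rationale}")

# One pass through the loop with made-up data.
act(reason(sense([{"source": "radar", "track_id": 7, "confidence": 0.4}])))
```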

The Pentagon’s plans include the acquisition of “deep learning” systems. Work outlined a potential scenario for how artificial intelligence could be employed in combat as part of a “learning” network:

“If we launch seven missiles at a surface action group and one missile goes high and is looking at all of the different things that the battle group is doing to defend itself and it sees something new that’s not in its library, it will immediately report back on the learning network, which will go into a learning machine, which will say [to military commanders], ‘There is something you should do’ …  so that the next seven missiles launched will be that much more effective.”
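The scenario amounts to a novelty check against a shared signature library, with anything unrecognized pushed back onto the network so later missions benefit. The sketch below captures only that logic; the signatures, similarity measure and threshold are invented for illustration and are not how any fielded weapon works.

```python
# Hedged sketch of the "learning network" idea: an observation is checked
# against a known-signature library; anything novel is recorded and fed back.
# All names and data are hypothetical.
from typing import Dict, List

known_library: Dict[str, List[float]] = {
    "decoy_pattern_a": [0.9, 0.1, 0.0],
    "jammer_pattern_b": [0.2, 0.7, 0.1],
}

def similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two observation vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def report_observation(obs: List[float], threshold: float = 0.95) -> str:
    """Return the best library match, or record the observation as novel."""
    best_name, best_score = max(
        ((name, similarity(obs, sig)) for name, sig in known_library.items()),
        key=lambda pair: pair[1])
    if best_score >= threshold:
        return f"matched {best_name} ({best_score:.2f})"
    # Novel behavior: push it back over the shared network so the library,
    # and the next salvo, is that much smarter.
    known_library[f"unknown_{len(known_library)}"] = obs
    return "novel behavior reported to learning network"

print(report_observation([0.1, 0.2, 0.9]))   # not in the library -> reported
print(report_observation([0.9, 0.1, 0.0]))   # matches decoy_pattern_a
```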

Artificial intelligence and autonomous systems will be critical in future high-tempo warfighting environments, defense experts said.

“A key contest in war will be between adversary cognitive systems, both artificial and human, to process information, understand the battle space, and decide and execute faster than the enemy,” Scharre said. “Advances in machine intelligence show great promise for increasing the ability of artificial cognitive systems to understand and react to information.”

Humans will not be able to match the capabilities of autonomous systems when it comes to certain types of operations such as missile defense or cybersecurity, Work said. “When you’re under attack, especially at machine speeds, we want to have a machine that can protect us. … You cannot have a human operator, operating at a human speed, fighting back against a determined cyber attack. You’re going to have to have a learning machine that does that.”

Learning machines could be especially helpful when it comes to analyzing big data, Selva said. “We have … a requirement to be able to sort some of the largest databases on the planet, if you just think about our intelligence databases and all the digitized information that exists.”

The Defense Department needs algorithms that would allow a machine to detect abnormalities or changes in the operating environment, highlight them for analysts and then make recommendations about how to respond, he said.

“The data sets that we deal with have gotten so large and so complex that if we don’t have something to help us sort them, we’re just going to be buried in the data,” he said. “The deep learning concept of teaching coherent machines ... to advise humans and making them our partners has huge consequences.”
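The workflow Selva describes, flagging abnormalities in very large data sets and surfacing them to an analyst with a suggested next step, can be approximated with off-the-shelf tools. The sketch below runs scikit-learn's IsolationForest over synthetic records; the features, contamination rate and "recommend analyst review" step are stand-ins, not actual intelligence tooling.

```python
# Minimal sketch of machine-assisted anomaly flagging on a large data set,
# using scikit-learn's IsolationForest on synthetic records.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Pretend each row is a summarized record from a very large database.
normal = rng.normal(loc=0.0, scale=1.0, size=(10_000, 4))
odd = rng.normal(loc=6.0, scale=1.0, size=(10, 4))     # injected anomalies
records = np.vstack([normal, odd])

detector = IsolationForest(contamination=0.001, random_state=0)
detector.fit(records)

flags = detector.predict(records)          # -1 = anomalous, 1 = normal
scores = detector.score_samples(records)   # lower = more anomalous
print(f"{(flags == -1).sum()} of {len(records)} records flagged as anomalous")

# Surface the most anomalous records to a human analyst with a suggested
# next step, rather than acting on them automatically.
for idx in np.argsort(scores)[:5]:
    print(f"record {idx}: score={scores[idx]:.3f} -> recommend analyst review")
```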

But testing deep learning systems to ensure their effectiveness and reliability could be challenging. Software capable of telling defense officials what a military machine has learned does not yet exist, Selva noted.

“That is one of the milestones we are going to have to cross before we can actually get into a high confidence area where we can say the technology is actually going to do what we want it to do, because not only can we physically test it, we can intellectually test it,” he said.

Autonomy and artificial intelligence could save the Pentagon money and significantly reduce manpower requirements for deploying unmanned systems, officials and analysts said.

“I think inherent to unmanned are tremendous savings” as individuals are able to manage and oversee multiple systems, said Rear Adm. Robert Girrier, director of unmanned warfare systems within the office of the chief of naval operations, during recent remarks at the Center for Strategic and International Studies. “The technology is driving us in that direction. And clearly you can see how that’s something in the back of our mind” in an era where budgets are strained.

Defense Department agencies are researching and developing relatively inexpensive “swarm” systems, which humans could supervise during operations. “These efforts hint at the next paradigm shift in warfare — from fighting as a network of a few, expensive platforms as we do today, to in the future fighting as a swarm of many low-cost assets that can coordinate their actions,” Scharre said.
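One way to make "fighting as a swarm of many low-cost assets" concrete is simple coordinated task allocation, where each cheap asset claims the nearest unclaimed objective. The toy sketch below assumes a greedy nearest-pair rule; the drone names, coordinates and allocation logic are invented for illustration and say nothing about actual Defense Department swarm programs.

```python
# Toy sketch of swarm task allocation: many low-cost assets coordinate by
# greedily claiming the closest unclaimed objective. All data is invented.
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def distance(a: Point, b: Point) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

def allocate(drones: Dict[str, Point],
             objectives: List[Point]) -> Dict[str, Point]:
    """Greedy one-objective-per-drone assignment, nearest pair first."""
    assignments: Dict[str, Point] = {}
    remaining = list(objectives)
    unassigned = dict(drones)
    while unassigned and remaining:
        # Pick the globally closest (drone, objective) pair still available.
        name, obj = min(
            ((n, o) for n in unassigned for o in remaining),
            key=lambda pair: distance(unassigned[pair[0]], pair[1]))
        assignments[name] = obj
        del unassigned[name]
        remaining.remove(obj)
    return assignments

swarm = {"uav_1": (0.0, 0.0), "uav_2": (5.0, 5.0), "uav_3": (9.0, 1.0)}
targets = [(1.0, 1.0), (6.0, 4.0), (8.0, 0.0)]
for drone, obj in allocate(swarm, targets).items():
    print(f"{drone} -> objective at {obj}")
```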

Work said the Pentagon is “actively looking at a large number of very, very advanced things” such as large-capacity unmanned underwater vehicles that “cascade” smaller UUVs and form underwater networks, as well as autonomous small surface vessels and UAVs that could operate together.

“You’re going to see a lot more mother ships whose [autonomous] offspring work to execute the mission,” he said.

The integration of autonomous systems and artificial intelligence into the force is a key element of the Pentagon’s new “third offset strategy,” which is intended to leverage emerging technologies to maintain military superiority over potential adversaries who are acquiring sophisticated weapons similar to those the United States currently possesses.

“We are going to place multiple bets” on promising technologies, Selva said.

“My tolerance for risk is pretty high,” he added. “When I go out to Silicon Valley and ask a software engineer to think about a learning software that’s going to answer one of my problems, it’s a really hard question. … In this cycle of rapid innovation in civilian and military technology, we have to be able to accept failure, not in the battle space [but] in the development phase.”

The Pentagon isn’t keeping up with the private sector, Selva said. “There is much more innovation going on at a much faster pace in the software and artificial intelligence sector of the civilian world than in the military world. We’re going to have to take advantage” of that.

“We have a small fund in the department … that we’re using to go out and interact with companies that are operating in those spaces,” he said. “They have been very open to looking at the kinds of innovation we want to involve ourselves in. That goes a little bit to artificial intelligence, it goes a little bit to the deep learning, it goes to the exploitation of big data to answer very specific Defense Department-related questions.”

The Pentagon has “entered into some of those arrangements,” he said without identifying the companies involved.

Developing and procuring these technologies aren’t the only challenges the Pentagon is facing, officials and analysts noted. Organizational and operational paradigms also need to evolve.

“It might be a new unit that does something a new way, that employs a technology in ways that we haven’t seen in the past, or it might be a new doctrine … which completely changes the focus of the entire Army,” Work said. “You need new technological capability to try to achieve overmatch, [but] you need to have new organizational and operational constructs to make it real and to gain operational advantage.”

Convincing troops that they can rely on autonomous systems to perform well in combat is also critical.

“These innovations in autonomy … need to be nurtured and introduced in a manner which will gain the trust of our sailors and Marines and the public we are here to protect,” Frank Kelley, deputy assistant secretary of the Navy for unmanned systems, told lawmakers in November. “Realizing the vision of a fully integrated unmanned and manned naval force will depend as much on significant military cultural evolution as on the technology innovations.”

The technologies also raise ethical questions. Today, troops are in the loop when it comes to making decisions about killing adversaries. But as technology improves, the Pentagon could eventually face a “Terminator conundrum,” Selva said.

“What happens when that thing can inflict mortal harm and is empowered by artificial intelligence?” he said. “How are we going to deal with that? How are we going to know what’s in the vehicle’s mind ... [and] how do we know with certainty what it’s going to do?

“Those are the problem sets that I think we are going to have to deal with in the technology sector that [are] making building the platform actually a relatively simple problem.”

Correction: A previous version of the story said Paul Scharre testified at a House Armed Services Committee hearing. The testimony referenced in the story was given to the Senate Armed Services Committee.

