EMERGING TECHNOLOGIES

Algorithmic Warfare: Google Versus The Pentagon, The Fallout

8/2/2018
By Yasmin Tadjdeh


Current and former Pentagon officials are worried that Google’s withdrawal from Project Maven — one of the Defense Department’s most high-profile artificial intelligence efforts — could have disastrous effects.

The goal of Maven is to develop AI systems that can analyze reams of full-motion video data collected by drones and tip off human analysts when objects of interest pop up.

Over the past several months, the program — which included partnerships with Silicon Valley companies — has been met with vocal opposition from some in the commercial sector, particularly from Google employees.

In April, more than 3,000 Google workers signed a letter stating that the company “should not be in the business of war” and should sever ties with Project Maven.

In May, Google Cloud CEO Diane Greene announced internally that the company would not seek another contract associated with the effort, news reports said.

That move, however, creates an “enormous moral hazard” for the company, said Robert O. Work, former deputy secretary of defense, during a panel discussion at a recent conference.

“They say, ‘Look, this data could potentially down the line at some point cause harm to human life,’” he said. “But it might save 500 Americans, or 500 allies or 500 innocent civilians from being attacked. So I really believe that Google employees are creating a moral hazard for themselves.”

When Project Maven was stood up, it was meant to be a pathfinder to demonstrate how the Defense Department could better use artificial intelligence and machine learning, he said.

“We picked what we considered to be the absolute least objectionable thing, and that is using computer vision and teaching AI to look for things on video,” he said.

Using a sensor called Gorgon Stare, a drone can fly over a city and collect massive amounts of video, he noted. However, even with three seven-person teams working constantly, the Pentagon was only able to analyze 15 percent of the data. “The other 85 percent of the tape was on the floor,” he said.

Artificial intelligence programs, however, could prompt analysts when objects of interest appear, he noted.
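For illustration only, the analyst-cueing workflow Work describes might look something like the sketch below. This is not Project Maven's actual pipeline: the detect_objects function is a hypothetical placeholder for a trained computer-vision model, and the confidence threshold and alerting logic are assumptions.

```python
# Minimal sketch (not the actual Maven system): scan drone video frame by frame,
# run an object detector, and surface only high-confidence detections so an
# analyst is cued instead of watching every minute of tape.
import cv2  # OpenCV, used here only for video decoding

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for flagging a detection


def detect_objects(frame):
    """Hypothetical detector: returns a list of (label, confidence) pairs."""
    raise NotImplementedError("Plug in a trained vision model here")


def cue_analyst(video_path):
    cap = cv2.VideoCapture(video_path)
    alerts = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break  # end of footage
        timestamp_s = cap.get(cv2.CAP_PROP_POS_MSEC) / 1000.0
        for label, confidence in detect_objects(frame):
            if confidence >= CONFIDENCE_THRESHOLD:
                # Only detections above the threshold are queued for human review.
                alerts.append((timestamp_s, label, confidence))
    cap.release()
    return alerts
```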

Work did not downplay the possibility that it could result in a military strike. “I fully agree that it might end up with us taking a shot,” he said. “But it can easily save lives.”

Work also noted that Google has an AI center based in China, which he said is cause for alarm.

“In China, they have a concept called military-civil fusion. Anything that’s going on in that AI center ... is going to the Chinese government and then will ultimately wind up in the hands of the Chinese military,” he said. “I didn’t see any Google employees saying, ‘Hmm, maybe we shouldn’t do that.’”

Google did not respond to requests for comment.

Josh Marcuse, the executive director of the Defense Innovation Board, said the Pentagon needs to have partnerships and dialogue with those who work in Silicon Valley as it develops powerful new technologies, such as artificial intelligence, that raise ethical concerns.

“Engineers who work at these companies who are taking issue with our approach should be active participants in making sure that ethics and safety is at the forefront of what we do, as it has been with every other weapon system,” he said. “We applied international humanitarian law on undersea warfare. We applied it to war in the air. And we will apply it to cyberspace and to AI.”

The Pentagon needs knowledgeable partners whom the department can consult on the technology, he added. "We want to be safe practitioners of these tools," he said. "That requires working with us."

Navy Capt. Sean Heritage, acting managing partner at the Defense Innovation Unit – Experimental (DIUx), said during the panel discussion that instances of Silicon Valley companies balking at working with the Defense Department have been few and far between.

“The things that we are reading about in the press are really small in number when you talk about the number of patriots out there who are willing to help solve our problem,” he said.

DIUx is the brainchild of former Secretary of Defense Ash Carter and was established to cut through the Pentagon's bureaucratic red tape and make it easier for firms in Silicon Valley and other tech hubs to do business with the Defense Department.

Too much attention is being paid to these refusals to cooperate, Heritage said. The Pentagon has come a long way in establishing relationships with companies in the region, he added.

Air Force Gen. James “Mike” Holmes, commander of Air Combat Command, said Project Maven has already provided operators with new tools that have streamlined processes and reduced workloads.

“We are seeing some payoff,” he noted during a breakfast meeting with defense reporters in Washington, D.C.

But Holmes said he was worried that Silicon Valley companies could pull out of the burgeoning program.

“Am I concerned? Yes,” he said. “But look, this is part of being an American. … Americans have expectations about what their government does and whether the government uses technology and tools to infringe upon their rights or not.”

For the military to compete with and deter peer competitors, it needs to tap into investments being made in the commercial sector, he said. The Pentagon and Silicon Valley will have to work through their respective comfort levels when it comes to developing new technologies and how they are applied on the battlefield, he added.

“We’ll work to make sure that we use them for good in accordance with our Constitution and what it requires us to do,” he said. “What I would like to do is be able to convince people that we’re all in the business of avoiding major war.”

Topics: Cyber, Cybersecurity, Emerging Technologies
