ROBOTICS AND AUTONOMOUS SYSTEMS

Defense Applications Envisioned for Cyber-Human Systems

5/2/2018
By Yasmin Tadjdeh

With computing power increasing by leaps and bounds, the U.S. government and the defense industry are looking at ways to leverage emerging technology and assist human operators.

Cyber-human systems include technologies such as robots, wearable devices, personally embedded sensors and computers, and virtual and augmented reality, said Jim Keffer, director of cyber at Lockheed Martin and a former Air Force major general who served as chief of staff for U.S. Cyber Command. Such systems could help military analysts comb through mounds of data or alert commanders when a soldier is fatigued.

Man-machine teaming is a key component of the Pentagon’s so-called “third offset strategy,” which seeks to give the U.S. military a technological edge on the battlefield against sophisticated adversaries.

“Our world … increasingly revolves around computers, computing, networks, information and data,” Keffer said. “That has come to play a very central role in how we will live our lives as humans — how we live, how we work, how we play, how we learn, how we discover, how we communicate.”

The boundaries between technology and the human dimension are blurring, he said during a panel discussion at the National Defense Industrial Association’s Cyber-Enabled Emerging Technologies Symposium in McLean, Virginia.

These “environments are just becoming a natural extension of our human existence,” he said.

“This rapid emerging technology will provide boundless opportunities for our businesses, for our people, for mankind,” he said. “At the same time there is absolutely no doubt that we’re going to be faced with some very significant and tough challenges to work through, and some of them are ethical- and values-based.”

William Casebeer, senior manager at Lockheed Martin’s human systems and autonomy team, said his group consists of 35 scientists and engineers who focus on conducting basic research to improve the performance of human-machine teams.

“We see our mission at the end of the day as building what you might call pre-production prototypes that are living examples of human-machine interactions for performance improvement,” he said.

“In order to do that, we use a fairly simple, but hopefully not simplistic, framework that is … created around sensing, assessing and augmenting performance.”

A cyber-human system is created when "you put together sensing, assessment and augmentation in a closed-loop fashion … that could be used to improve the performance of the human-machine team," he said.

On the sensing side, Lockheed systems look at a number of factors including neural states, heart rate and how much a person is sweating, Casebeer said. Those are the kinds of sensors that can be used with machine teaming, so the platform can have better insight into a warfighter’s physical condition.

However, that information is just raw data unless there is an assessment component, he noted.

“The most effective [human-machine] teams are ones where you have seamless interaction between the teammates to accomplish the task, mutual accommodation happening in real time like we see in our best human teams,” he said.

The Lockheed lab is making strides in terms of using machine learning techniques to extract meaning from that data, he added.

The next piece of the puzzle is to take information from that assessment and do something with it, Casebeer said. For example, that data could show that an operator is oversaturated with tasks.

“Is there something I can do before a detriment is taken in performance that will improve the performance of the team?” he asked. “Can I reallocate tasks, can I rebalance workload, can I make a go/no-go call for the human teammate?”
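
The closed loop Casebeer describes (sense physiology, assess workload, then augment by rebalancing tasks) can be pictured with a short sketch. The example below is purely illustrative; the sample fields, weights and the saturation threshold are assumptions made for the sketch, not Lockheed Martin's actual design.

    # Minimal sketch of a sense-assess-augment closed loop.
    # All names and numbers here are hypothetical illustrations.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class PhysioSample:            # "sense": raw physiological readings
        heart_rate: float          # beats per minute
        skin_conductance: float    # microsiemens, a proxy for sweating
        neural_index: float        # normalized 0-1 engagement estimate

    def assess_workload(sample: PhysioSample) -> float:
        """'Assess': map raw readings to a single 0-1 workload score."""
        hr_load = min(max((sample.heart_rate - 60) / 60, 0.0), 1.0)
        sc_load = min(sample.skin_conductance / 20.0, 1.0)
        return 0.4 * hr_load + 0.3 * sc_load + 0.3 * sample.neural_index

    def augment(workload: float, tasks: List[str]) -> List[str]:
        """'Augment': offload the lowest-priority task when saturated."""
        if workload > 0.8 and len(tasks) > 1:
            offloaded, tasks = tasks[-1], tasks[:-1]
            print(f"Reallocating '{offloaded}' to the machine teammate")
        return tasks

    # One pass through the loop for a heavily loaded operator.
    sample = PhysioSample(heart_rate=140, skin_conductance=18.0, neural_index=0.9)
    tasks = augment(assess_workload(sample), ["fly route", "monitor threats", "log imagery"])

In a fielded system the assessment step would be a trained model rather than a fixed weighted sum, which is where the machine learning work Casebeer mentions comes in.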

Getting that piece right will have major ramifications for the Defense Department, Casebeer said.

“Doing this well can result in radical performance improvement in multiple domains” relevant to the Pentagon such as imagery analysis and control of multiple unmanned assets, he said.

The Defense Department still faces issues implementing the technology, he noted. These include increasing trust and transparency between the human and the machines, ruggedizing sensors and crafting assessment methods to be as near real-time as possible.

James Kilbride, director of augmented reality at General Dynamics Mission Systems, said companies building such products must better understand how cyber systems influence humans.

“We don’t tend to think of the system as actually influencing what the human’s behaviors are,” he said. However, humans are prone to outside influence. For example, with advertising, companies are able to affect a consumer’s decisions, he noted.

“As we design these systems we have to take into account the fact that as the system learns or changes, it is going to be picking up behaviors and biases from the operators,” he said. “It is going to in turn create new bias and behaviors in the humans using [it] … and we need to take those into account in order to ensure that those solutions that we’re getting from the systems are the ones that we want.”

Robert Hoffman, senior research scientist at the Institute for Human and Machine Cognition, said militaries around the globe have long aspired to team humans and machines more closely, going as far back as World Wars I and II. The U.S. Navy has also been a big advocate for such technology and has tried to introduce more automation into the fleet.

There is an idea that “machines need to be designed so that humans can use them. But more than that, in cognitive systems engineering, we advocate for technologies that are learnable, understandable and useful,” he said.

There must be “convincing, empirical evidence” that the systems meet those three objectives, he said. “We would see a sea change if that were” the case, Hoffman added.

The work process that is created by the use of these technologies must also be observable, he said. The delivery of a product cannot be the end of an effort, he noted.

The delivery should coincide with the “beginning of another phase in which we continue to conduct empirical investigations of the work, because the work is always going to be a moving target,” he said. “The tools are always going to be changing. The adversaries are always going to be changing. The dynamics are going to be different. Unexpected things will happen.”

Col. William D. Bryant, deputy chief information security officer for mission assurance within the Secretary of the Air Force’s office of information dominance and chief information officer, said there is no such thing as a pure cyber system.

“They are all cyber-physical systems,” he said during the panel discussion. “They all have a virtual, or logical, component and a physical … [footprint] in the real world.”

He pointed to server rooms, which connect computers around the world. To create a cyber-human system, a human must be inside the decision loop of that physical system, he said.

“A modern aircraft is a cyber-human … [system] because you have a human sitting in the seat at the controls,” he said.

When flying an F-16, a pilot may decide he or she wants to move the aircraft a certain number of degrees, but that action is ultimately performed by the flight controls, he noted.

But there are still limitations, Bryant noted.

“Computers are really good at sorting out massive amounts of data. Humans are not. We do very poorly in that,” he said. “What happened with the F-16 in particular is that the answer to every mission problem was [to] slap a pod on the F-16.”

Mechanically, flying the aircraft isn’t challenging, but managing the sensors and information flow and then making sense of that mountain of data is difficult, he said.

With the F-35 joint strike fighter, there are now larger screens where information is fused. “While that’s a step forward, I think the next step is filtering the data in much smarter ways,” Bryant said. That will involve artificial intelligence and machine learning, he noted.
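
One simple way to picture that kind of smarter filtering is to rank fused sensor tracks by relevance and show the pilot only the top few. The sketch below is a hypothetical illustration; the track fields, weights and cutoff are assumptions for the example and have no relation to the F-35's actual fusion logic.

    # Hypothetical sketch of smarter filtering: rank fused tracks and
    # surface only the most relevant ones to the pilot.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Track:
        track_id: str
        range_km: float
        closing_speed_mps: float
        hostile_probability: float   # 0-1 score from an upstream classifier

    def relevance(track: Track) -> float:
        """Closer, faster-closing, likely-hostile tracks score higher."""
        proximity = max(0.0, 1.0 - track.range_km / 200.0)
        closure = max(0.0, min(track.closing_speed_mps / 600.0, 1.0))
        return 0.5 * track.hostile_probability + 0.3 * proximity + 0.2 * closure

    def top_tracks(tracks: List[Track], limit: int = 3) -> List[Track]:
        """Filter the display down to the highest-relevance tracks."""
        return sorted(tracks, key=relevance, reverse=True)[:limit]

    tracks = [
        Track("T1", range_km=150, closing_speed_mps=50, hostile_probability=0.2),
        Track("T2", range_km=40, closing_speed_mps=500, hostile_probability=0.9),
        Track("T3", range_km=90, closing_speed_mps=300, hostile_probability=0.6),
    ]
    for t in top_tracks(tracks, limit=2):
        print(t.track_id, round(relevance(t), 2))

In practice the scoring would come from models trained on mission data, but the principle is the one Bryant describes: present the pilot with less, and more relevant, information rather than everything the sensors produce.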

Kevin Yin, CEO of SitScape, Inc., a Tysons Corner, Virginia-based enterprise software company, said developing mature systems will be key to advancing cyber-human system technology.

Systems as a whole need to be optimized with an understanding of which tasks humans handle better than machines, he noted.

Humans already rely heavily on cyber-human technology to help them make decisions, Bryant added.

“You do realize that every single one of you, probably without exception is a cyborg already, right?” he said. “Does anyone not have … [a smartphone] within reach? Now, it’s not implanted under your skin yet … [but] this connects me to an unimaginably large amount of information and data that we couldn’t have even imagined five, 10 years ago. The fact that I haven’t implanted it under the skin yet … actually doesn’t matter that much.”

Topics: Robotics, Robotics and Autonomous Systems, Research and Development, Defense Department, Defense Innovation
