RESEARCH AND DEVELOPMENT
Army Uses Mad Scientist Gathering to Explore Emerging Technologies
AUSTIN, Texas — To better understand new technologies and scientific efforts that could aid warfighters, the Army is connecting with industry and academia through its “Mad Scientist” initiative.
The service is asking itself where military challenges and opportunities will lie moving forward, Lee Grubbs, Mad Scientist director, said in an interview with National Defense on the sidelines of the program’s annual conference. “We translate that into real-person speak — civilian speak. We look at where analogies of that exist in the commercial world.”
As the service has shifted focus from counterinsurgency to large-scale operations, the Army decided the time was right to bring back the Mad Scientist program a few years ago, Grubbs said. The 2018 national defense strategy puts a renewed focus on countering great power adversaries such as Russia and China.
The conference was held in April at the University of Texas-Austin, because its Cockrell School of Engineering is one of the top engineering schools in the country, he said. This year’s themes were robotic technologies, the future of space, and ethics.
When picking a school where the conference will take place, the Army considers which has the resident expertise, Grubbs said. “In this case, you have a great confluence of robotics capability, people who were experts in space and the topics we want to” discuss.
Previous events under the Mad Scientist initiative sometimes included classified sessions, he noted. However, the program is now almost entirely open source-based.
“It’s almost completely collaborative across academia and industry,” he said. “We brought it back and changed it based on where the Army was at the time.”
Grubbs said the Mad Scientist initiative is not focused on creating programs of record. Rather, the intent is to “collaborate, bring in creative thinkers [and] explore ideas in different ways.” After those ideas are coalesced, they are presented to individuals and organizations that can carry them forward, he said.
The initiative relies heavily on commercial industry so the Army can explore emerging technologies where it is not leading the way, he noted.
“The explosion in biology is not something that’s happening in a DoD lab,” he said. “It’s originating in commercial labs looking for where there might be profit — artificial intelligence, autonomy, robotics.”
The conference’s themes and topics are tailored to the Army’s warfighting concepts, Grubbs said. Two years ago, the event also followed a robotics theme to align with the release of its robotics and autonomous systems strategy, he noted.
The strategy detailed the Army’s push to integrate such systems into its force to help counter increasingly capable enemies. The document focused on addressing three challenges: the increased speed of adversary actions, adversaries’ growing use of robotics, and congestion in dense urban environments where communications will be stretched to the breaking point.
Robotic technology will be a central aspect of future human-machine collaborations and human-machine combat teams, Robert O. Work, former deputy defense secretary and senior fellow at the Center for a New American Security, said at the conference.
“Robotics is going to be a force multiplier. They are being increasingly used to augment — not just replace — humans,” he said. “They will work together as teams.”
Russia already has big ambitions for its investments in robotics, he said, noting it is testing systems in combat in Syria. Moscow has stated its intention to roboticize one-third of its entire force structure by 2030, he added. However, Work expressed doubt about Russia’s ability to reach this goal.
“I don’t think they have the economic wherewithal and I don’t think they have the technical wherewithal,” he said. “They don’t have the innovative ecosystem to do that, but this is the Army’s primary competitor.”
The technology will also be useful for the United States as it tries to reduce costs associated with maintaining its force structure, he said. Manning units has “just become so expensive,” he said. “We’re going to have to have unmanned systems to replace humans simply because we won’t be able to afford all the humans we want.”
During the conference, experts highlighted new technologies that could prove useful in the future. Maruthi Akella, professor of aerospace engineering and engineering mechanics at the University of Texas-Austin, said he is working with the Defense Department on initiatives that will help thwart enemy detection systems. The idea is to fly drones in a way that would make the adversary’s technology think there is a “phantom aircraft.” This would help distract enemies while the military conducts attacks, he noted.
“The enemy resources are temporarily focused on something that doesn’t exist and you go and execute your strike mission,” he explained.
The weakest part of robotic systems is usually the communication channel, which could be hijacked, spoofed or compromised, he said. To secure it, the algorithm of the channel must be “hardened” and the way information is processed must be changed, he noted.
Akella said some of his work with the Army and defense contractor General Dynamics has focused on examining how to make communication channels between drones and ground control systems more robust. For example, the Future Airborne Capability Environment initiative focused on maintaining secure channels while ensuring that the systems were still able to operate quickly and autonomously, he noted.
The ethics of using artificial intelligence for military purposes has long been controversial. Ken Fleischmann, associate professor at the School of Information at UT-Austin, said applying the technology to the battlefield can be complex because robots must decide whether someone is a threat before attacking them.
“We can build AI that does very well at chess ... or Jeopardy,” he said. However, that is “very different from having a robot that is going to behave ethically on the battlefield.”
The most fundamental consideration when applying AI to military systems is ensuring that users are able to understand why machines make their decisions, he said.
“It’s very important that you understand what is the context, how the machine is working and what it’s actually telling you,” Fleischmann said. “That’s going from transparency to trust to human agency.”
The conference also addressed GPS spoofing. The Pentagon has been examining ways to prevent adversaries from replacing GPS signals with false or misleading information that could wreak havoc on battle plans.
Todd Humphreys, associate professor of aerospace engineering and engineering mechanics at UT-Austin, said this technique is being used by the Russians to protect the location of President Vladimir Putin. Humphreys contributed to “Above Us Only Stars: Exposing GPS Spoofing in Russia and Syria,” a report by the think tank C4ADS that explored this phenomenon. The study found that spoofing occurred in areas where Putin was present. Moscow is using techniques that are “extremely potent,” Humphreys noted.
Ironically, these activities have provided enough information to track Putin’s locations, creating a “space-based Vladimir Putin detector,” Humphreys said.
“His secret service is generating these [spoofing] signals so that they can kind of put a protective bubble around him,” he noted. “But I’m not sure they understood that we could follow that location from space.”
Using information gathered by the International Space Station, Humphreys said he was also able to detect where spoofing activities were coming from. They were originating from an airbase in Syria that the Russians have been operating out of since 2017, he said.
“When we were listening from the altitude of the International Space Station, the only thing we could hear was this massive jammer,” he said. “We call it a ‘spoofer jammer’ because it does a little bit of spoofing and a little bit of jamming.”
Hypersonic vehicles are another technology of interest for the Pentagon. Alex Roesler, deputy director of the integrated military systems center at Sandia National Laboratories, said during the conference that the lab is examining ways to integrate autonomous capabilities into hypersonic systems. This would help the military increase its lethality, engage mobile targets and use systems that are able to attack on their own, he noted.
It would also cut down on the time needed to conduct mission and flight planning for hypersonic tests, he said. Today’s tests require a large number of personnel and weeks of planning. However, with autonomous capabilities, this process could be reduced to a few minutes, he noted.
“The analysis is time consuming,” he said. “What we think is needed are solutions that are artificial intelligence- and machine learning-based that automates that analysis.”
An autonomous hypersonic vehicle could also develop new capabilities by being put through virtual wargames, he noted.
“Set up a virtual environment, let a machine play out thousands upon thousands of hypersonic engagements and have it learn from those what tactics and strategies work and don’t work,” he said. “Then see if you can take that intelligence and embed it onto the platform.”
The Mad Scientist program also hosts crowdsourcing events, Grubbs said. For example, one recent event was geared toward ideas on how to leverage AI for non-lethal operations such as intelligence gathering and mission preparation, he noted.
“Mad Scientist [isn’t] just conferences,” he said. “It is a full-time exploration of the operational environment from today to the future.”