Military Beefs Up Research Into Swarming Drones
If an anti-aircraft weapon takes down one drone, the others change direction, push through and destroy the target kamikaze-style.
Senior Air Force officers and Defense Secretary Ashton Carter are among the military leaders who have touted “swarming” robots lately, although the Air Force Research Laboratory doesn’t like that term. It prefers “distributed collaborative systems.”
The problem with the biological term “swarming” is that it doesn’t fully describe where the Air Force and others are going with this technology, said Kristen Kearns, autonomy portfolio lead at AFRL.
Swarming fish and birds don’t collaborate much, she said in an interview. “That collaboration is where we anticipate you would be able to gain capability, as opposed to blindly following or staying out of the way of everything else in the team.”
For the Air Force, distributed collaborative systems are “about putting that next level of decision making and capability on the platform. Not only can it maintain itself, but it can work [with] other parts of the team, whether those be airmen, or whether those be other machines, to perform a mission task.”
Also working on the concept is the Pentagon’s Strategic Capabilities Office.
Carter, in a February speech at the Economic Club of Washington, D.C., highlighted the work the office is doing.
One project uses “swarming autonomous vehicles in all sorts of ways and in multiple domains,” he said. “In the air, they develop micro-drones that are really fast, really resilient. They can fly through heavy winds and be kicked out the back of a fighter jet moving at Mach 0.9, like they did during an operational exercise in Alaska last year, or they can be thrown into the air by a soldier in the middle of the Iraqi desert.”
Peter W. Singer, a strategist at the New America Foundation, said, “Swarming has several potential benefits. It is a way to gain the effect of greater intelligence without each individual unit needing to be intelligent. Think how ants can perform incredibly complex tasks, even though each ant is not all that smart.”
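Singer’s ant analogy corresponds to a well-known idea in robotics: simple local rules can produce coordinated group behavior with no central controller. The following is a minimal, purely illustrative “boids”-style sketch (a textbook toy model, not any military system); every unit looks only at nearby neighbors, yet the group as a whole ends up flying in alignment:

```python
import math
import random

def step(units, neighbor_radius=50.0):
    """Advance a list of (x, y, vx, vy) units one tick using only local rules."""
    new_units = []
    for x, y, vx, vy in units:
        # Each unit sees only neighbors within a fixed radius, never the whole swarm.
        nbrs = [u for u in units
                if (u[0], u[1]) != (x, y)
                and math.hypot(u[0] - x, u[1] - y) < neighbor_radius]
        if nbrs:
            cx = sum(n[0] for n in nbrs) / len(nbrs)   # cohesion: drift toward local centroid
            cy = sum(n[1] for n in nbrs) / len(nbrs)
            ax = sum(n[2] for n in nbrs) / len(nbrs)   # alignment: match average neighbor heading
            ay = sum(n[3] for n in nbrs) / len(nbrs)
            vx += 0.01 * (cx - x) + 0.05 * (ax - vx)
            vy += 0.01 * (cy - y) + 0.05 * (ay - vy)
            for nx, ny, _, _ in nbrs:                  # separation: avoid crowding close neighbors
                if math.hypot(nx - x, ny - y) < 10.0:
                    vx -= 0.05 * (nx - x)
                    vy -= 0.05 * (ny - y)
        new_units.append((x + vx, y + vy, vx, vy))
    return new_units

def heading_spread(units):
    """Variance of velocities around the group mean; lower means better alignment."""
    mx = sum(u[2] for u in units) / len(units)
    my = sum(u[3] for u in units) / len(units)
    return sum((u[2] - mx) ** 2 + (u[3] - my) ** 2 for u in units) / len(units)

random.seed(1)
units = [(random.uniform(0, 40), random.uniform(0, 40),
          random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(20)]

before = heading_spread(units)
for _ in range(200):
    units = step(units)
after = heading_spread(units)
print(before, after)
```

No unit is individually “smart,” and nothing coordinates them from above, but the heading spread of the whole group shrinks over time. That emergent coordination, rather than individual intelligence, is the point of the ant analogy.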
Gen. Ellen Pawlikowski, commander of the Air Force Materiel Command, in a speech last year said swarming drones “can be very much a game-changing reality for our Air Force in the future.”
Today, once a missile is launched at a target, the human is out of the decision-making loop. He has no control over it. “When we separate the weapon from the aircraft, we separate the weapon from the human,” she said. A swarm of thinking, flying munitions could be commanded in mid-flight to change direction.
Before that happens, AFRL has many technological hurdles to overcome, Kearns said. It will require a lot of collaboration and data sharing among the weapon systems, she added.
Machine perception, what the robot sees, is a relatively mature technology. Machines can identify a large object such as a tank. But the swarming concept poses some difficulties. “The challenge becomes what happens when it can only see half of that object. How can that machine identify what it is when it needs to do some inferencing?” Kearns asked.
That will require more research, she added.
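The half-visible-object problem Kearns describes can be sketched with a toy template-matching example. The shapes, grid representation and scoring rule here are illustrative assumptions, not AFRL’s method: known silhouettes are stored as sets of occupied grid cells, and a partial observation is scored by how fully each template explains it.

```python
# Known object silhouettes as sets of occupied (x, y) grid cells.
# These shapes are invented for illustration only.
TEMPLATES = {
    "tank":  {(x, y) for x in range(8) for y in range(3)},  # long and low
    "truck": {(x, y) for x in range(3) for y in range(5)},  # short and tall
}

def identify(observed_cells):
    """Score each template by the fraction of the observation it explains."""
    scores = {name: len(observed_cells & template) / len(observed_cells)
              for name, template in TEMPLATES.items()}
    return max(scores, key=scores.get), scores

# The sensor sees only the left half of the object.
half_view = {(x, y) for x in range(4) for y in range(3)}
best, scores = identify(half_view)
print(best, scores)
```

Even this toy version shows why inference is needed: with only half the silhouette visible, more than one template may partially fit, and the machine must commit to the hypothesis that best explains what it can actually see.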
And given that missiles move rapidly, the data stream becomes vital — not only maintaining connections between the robots, but managing what is passed between them, she said.
“What is the amount of information that needs to be shared between the systems to be able to do that collaboration?” she asked.
“Sending a lot of data would be easy but the problem is because of the time and the speed, you have to know exactly the data that needs to be sent in order to have that coordinated interaction,” she said.
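Kearns’ point about knowing exactly what data to send can be illustrated with a toy message format. This is entirely hypothetical; the article does not describe any real datalink. The idea is that each unit broadcasts a small fixed-size state record its teammates need for coordination, rather than streaming raw sensor data:

```python
import struct

# Hypothetical minimal state record: unit id, 3-D position,
# 2-D velocity and a timestamp, packed with no padding.
STATE_FORMAT = "<Ifffffd"

def encode_state(unit_id, pos, vel, timestamp):
    """Pack one unit's coordination state into a fixed-size binary record."""
    x, y, z = pos
    vx, vy = vel
    return struct.pack(STATE_FORMAT, unit_id, x, y, z, vx, vy, timestamp)

def decode_state(payload):
    """Unpack a state record back into named fields."""
    unit_id, x, y, z, vx, vy, t = struct.unpack(STATE_FORMAT, payload)
    return {"id": unit_id, "pos": (x, y, z), "vel": (vx, vy), "time": t}

msg = encode_state(7, (120.0, -45.5, 300.0), (0.9, -0.1), 1700000000.0)
print(len(msg))  # a few dozen bytes per update, versus raw sensor streams
state = decode_state(msg)
```

A record like this fits in 32 bytes, which is the kind of discipline Kearns describes: at missile speeds there is no time to ship everything, so the design question is which few fields are sufficient for the coordinated interaction.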
A consultant familiar with Defense Department autonomy programs, who declined to be named, said the human operator may not be able to compete with a fully autonomous system that identifies, analyzes and geolocates a target, especially in a scenario where the swarm is moving rapidly.
Nevertheless, “the power and the sheer speed of execution would give them a huge advantage over their adversaries,” he said.
There is a fundamental debate in the armed services about the correct mix of human and autonomous operators, he said.
“Getting the right blend of [manned], semi-autonomous and autonomous systems, whether related to drones or robotics or other physical or virtual capabilities, is an evolving debate, but that’s where the future is going,” he said.
When targeting, it’s important to remember that humans are moral creatures. Machines are not, he said.
“The ultimate safeguards on the use of lethality are driven by humans and will be driven by humans until such time as we have sufficient capabilities in moral, smart machines,” he said.
Singer said, “Just like with past human ‘swarms’ in war, like German U-boat wolf packs, it is a lot easier for them to operate if they can easily communicate. But that may not always be possible. So you then have to pack in more and more intelligence, and give them greater and greater autonomy.”
One of the major challenges with any autonomous system is verifying and validating that the decisions it is making are correct, Kearns said. “And that is the case across the Department of Defense. When we work with the Army and the Navy, that is a challenge that we all have.”
Trust, or “verification and validation,” becomes paramount with what is essentially artificial intelligence, Kearns added. “How do we assure safe and effective operations when we put decision making in the platforms?”
Some of the technologies needed to advance the field of distributed collaborative systems will be developed before the problem of verification and validation is solved, she said.
Arati Prabhakar, director of the Defense Advanced Research Projects Agency, said in a briefing with reporters that there is a powerful new wave of research happening with artificial intelligence, but “one of the biggest issues is trust and confidence in what [machines] tell us: what they think is happening and what course of action to take.” The field needs more of a rigorous theoretical foundation, she added.
Kearns said the first step in the program will be to develop what the AFRL is calling a “loyal wingman.” In that case, a remotely piloted aircraft would follow a jet fighter. It could serve as a “bomb truck” and carry extra ordnance, or be another set of eyes with its sensors. It could be directed to go to a hazardous area where the pilot doesn’t want to go, she said.
While this is a step toward distributed collaborative systems, it has its own questions and challenges.
What is the airman’s job today, and what can researchers do to help him do it more effectively? How can they efficiently and effectively support pilots’ decision making so that the intelligent system and airmen operate together as a fluid team? she asked.
“What we don’t want to do is hand them a tool ... and actually make their jobs harder,” Kearns said.
The question becomes how to manage the workload of that pilot as you add a robotic wingman, she said. “What work does that pilot need to do and how much workload can you give them before you start impacting performance?” she asked.
“It is critical that your pilot isn’t up there flying his plane and another plane,” she said.
Steve Walker, deputy director of DARPA, said his agency has been working on developing battle management systems with a blend of manned and unmanned vehicles.
“You have humans and unmanned systems and you need data fused together quickly and things are happening fast and you don’t want to overload the human with all that information. … You want to give him or her exactly what he needs to make a decision and have all these distributed effects work together,” he said.
Singer said swarming is “a more resilient approach. Rather than having one single system or single point of failure, it can take losses and still function. This also means it complicates an enemy’s job by having so many more targets to have to take out. And it might also be cheaper than trying to put all your capability in one exquisite system that tries to do it all, that is ‘too big to fail.’”
Hordes of autonomous robots attacking a target or performing other tasks are still a number of years in the future, Kearns said. AFRL is looking to conduct experiments on the loyal wingman concept by fiscal year 2022. Expanding the distributed collaborative system concept would come later.
There is a lot of ongoing work in autonomy across the Defense Department, and AFRL will leverage some of it. For example, there is no need to do a lot of research into autonomous landing when the Navy has already demonstrated this on an aircraft carrier with its X-47B unmanned aerial vehicle. Landing on a moving ship is a lot harder than landing on the ground, she noted.
Singer said, “Some things are possible today if we were willing to pull the trigger, budget-wise, and some are several years off.” Meanwhile, there are many videos on YouTube showing some amazing advances with robots flying, sailing or moving in formation, he said.
“But one thing to note about the YouTube aspect: it’s also a good illustration of how so much of the advancement in this space is happening outside the defense world,” he added.