VIEWPOINT: Robotics and Autonomous Systems
Defense Department Struggles to Define Autonomy
By Dave Chesebrough and Matt Dooley

Defense Secretary James Mattis’ National Defense Strategy maintains that the security environment is directly affected by rapid technological advancements driven by commercial technology, and this has fundamentally changed the character of war.
The strategy specifically identifies autonomy as one of these technologies. We hear a lot today about self-driving cars. Recent fatal accidents in Arizona and California have ignited a debate about the operational efficacy and safety of autonomous vehicles, even with human backup. John Paul MacDuffie, director of the Program on Vehicle and Mobility Innovation at the Wharton School’s Mack Institute for Innovation Management, observes that we are in the early days of the evolution of driverless technology and that accidents are to be expected.
In the automotive world, driverless vehicle testing must be conducted on public roads in real-world conditions to prove out the technology. But that places the general population in the risk pool for what is essentially the test and evaluation of these robotic vehicles. When accidents happen, innocent people not connected with the testing can be the victims. Arizona suspended Uber’s driverless car testing after a fatal accident involving a pedestrian.
What is the analogy for military systems, which must be evaluated under operational conditions, and which may carry lethal weapons and need to participate in combined arms units? There are complicated questions surrounding integration of autonomous systems into operational forces and how they are tested and evaluated to assure both safety and effectiveness.
At the National Defense Industrial Association’s Ground Robotics Capabilities Conference and Expo this year there was a heavy emphasis on the Army’s 2017 Robotics and Autonomy Strategy and the integration of autonomous robots into operational forces.
Robots have been a subject of science fiction since the genre was invented. We tend to project human-like abilities onto these fictional machines; we like to think they will behave as we do. But in reality there are many levels of autonomous capability, and integrating them into a military force is a complex problem that the defense community is intensely examining.
When a fully autonomous vehicle is deployed in a combined arms operation, it has to perform and respond as a vehicle with a human operator would. That means it must conform to established tactics, procedures and tasks while also being capable of coordinated and independent action and decisions. To be clear, we are not talking about drones or explosive ordnance disposal robots that are controlled by human operators from some distance away, although those will likely be a part of the force. The ultimate goal is to have a robot perform all aspects of a task autonomously without human intervention. This includes sensing, planning and implementing action in a self-governing manner.
The Army strategy states that “effective integration of [the strategy] improves U.S. forces’ ability to maintain overmatch and renders an enemy unable to respond effectively.”
It identified three compelling challenges that drive the Army’s need for robotic systems: increased speed of adversary actions, including greater standoff distances; increased use of robots by adversaries; and increased congestion in dense urban environments.
The strategy listed five capability objectives: increase situational awareness; lighten the soldiers’ physical and cognitive workloads; sustain the force with increased distribution, throughput and efficiency; facilitate movement and maneuver; and protect the force.
The 2016 Defense Science Board study on autonomy concluded that the Defense Department must accelerate its exploitation of autonomy to realize the potential military value and to remain ahead of adversaries who also will exploit its operational benefits. This underscores the fact that robotic technologies are fundamentally commercial and universally available, enabling adversaries to leapfrog the United States in deployment.
The autonomy study specifically identified the issue of trust as core to the department’s success in broader adoption of autonomy. Trust is established through the design and testing of an autonomous system and is essential to its effective operation. If troops in the field can’t trust that a system will operate as intended, they will not employ it. Operators must know that if a variation in operations occurs or the system fails in any way, it will respond appropriately or can be placed under human control.
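The fallback behavior the study calls for can be sketched in code. The sketch below is a hypothetical supervisory monitor, not anything defined by the Defense Science Board: it assumes two illustrative health signals and latches the system into human control once either signal fails.

```python
from enum import Enum, auto

class ControlMode(Enum):
    AUTONOMOUS = auto()
    HUMAN = auto()

class Supervisor:
    """Hypothetical monitor that hands control back to a human
    operator when the autonomous system reports a fault or an
    out-of-envelope condition. Once tripped, the mode latches in
    HUMAN until an operator deliberately resets it."""

    def __init__(self):
        self.mode = ControlMode.AUTONOMOUS

    def check(self, health_ok: bool, within_envelope: bool) -> ControlMode:
        # Any fault or departure from expected operating conditions
        # triggers the fallback; the latch prevents the system from
        # silently resuming autonomy after a transient fault.
        if not (health_ok and within_envelope):
            self.mode = ControlMode.HUMAN
        return self.mode

    def operator_reset(self):
        # Only an explicit human action restores autonomous mode.
        self.mode = ControlMode.AUTONOMOUS
```

The latch reflects the trust argument in the study: an operator who knows the system will stay in human control after a fault, rather than resume on its own, has a stronger basis for employing it.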
To achieve the strategy’s capability objectives we need to understand the attributes of autonomy and their wider implications in order to see how robotic systems should function in autonomous mode when inserted into combined arms forces.
Autonomy describes operations of systems or machines with various degrees of human involvement, the ultimate goal being self-governance, recognition and decision-making. The National Institute of Standards and Technology conducted a series of workshops with the Defense Department and others that defined autonomy as an unmanned system’s own ability of integrated sensing, perceiving, analyzing, communicating, planning, decision-making and acting/executing, to achieve its goals as assigned. (See NIST Special Publication 1011, “Autonomy Levels for Unmanned Systems Framework.”)
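The NIST definition reads as a functional chain: sense, perceive, analyze, communicate, plan, decide, act. As a rough illustration only (the framework defines functions, not an API, and every name below is invented), one autonomy cycle can be modeled as stages passed over a shared state:

```python
from typing import Callable, Dict, List

# Hypothetical types; each stage reads and extends the evolving state.
State = Dict[str, object]
Stage = Callable[[State], State]

def run_cycle(world: State, stages: List[Stage]) -> State:
    """Run one autonomy cycle: each stage consumes the state
    produced by the previous one, mirroring the sense-to-act chain."""
    state = dict(world)
    for stage in stages:
        state = stage(state)
    return state

# Toy stages illustrating the chain for an obstacle-avoidance task.
def sense(s):    return {**s, "raw": s["obstacle_range_m"]}      # sensing
def perceive(s): return {**s, "obstacle": s["raw"] < 25.0}       # perception
def plan(s):     return {**s, "action": "halt" if s["obstacle"]  # planning /
                          else "proceed"}                        # decision
def act(s):      return {**s, "executed": s["action"]}           # execution

state = run_cycle({"obstacle_range_m": 12.0}, [sense, perceive, plan, act])
```

An obstacle at 12 meters falls inside the toy 25-meter threshold, so the cycle decides to halt.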
The workshops recognized that the joint user community has struggled for years to find a common method of articulating requirements for unmanned vehicles that would guide development and testing. They identified two major parts of the user’s needs: a common vernacular (a shared set of definitions) for articulating capabilities; and a means to articulate the amount of autonomy required or expected from an unmanned system.
In 2013, the Department of Transportation’s National Highway Traffic Safety Administration defined five different levels of autonomous driving. In October 2016, it updated its policy to reflect that it has officially adopted the levels of autonomy outlined in SAE International’s J3016, “Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems.”
These levels set graduated steps from Level 0, full human control with no automation, to Level 5, fully autonomous operation with no human involvement.
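The SAE scale is small enough to express directly. This sketch encodes the six J3016 levels as an enum, with a helper (a hypothetical name, not part of any standard API) capturing the key break at Level 3, where responsibility for monitoring the environment shifts from the human to the system:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 levels of driving automation."""
    NO_AUTOMATION = 0           # human performs the entire driving task
    DRIVER_ASSISTANCE = 1       # single assist feature, e.g. adaptive cruise
    PARTIAL_AUTOMATION = 2      # combined steering/speed; human still monitors
    CONDITIONAL_AUTOMATION = 3  # system drives; human intervenes on request
    HIGH_AUTOMATION = 4         # no human needed within a defined domain
    FULL_AUTOMATION = 5         # no human involvement under any conditions

def human_monitoring_required(level: SAELevel) -> bool:
    # Below Level 3, the human driver monitors the driving environment.
    return level < SAELevel.CONDITIONAL_AUTOMATION
```

That Level 2/3 boundary is precisely where the commercial taxonomy starts to strain for military use: it says nothing about who holds responsibility for a weapon, a formation or a mission.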
These two frameworks and other definitional constructs share these “bookend” definitions but differ in the gradations between them. Progress has been made, but there is still no universally understood, agreed-upon set of definitions and terms for autonomy. And while the commercially oriented SAE, National Highway Traffic Safety Administration and international automotive definitions are converging, they are insufficient for the military’s needs.
While the definitional constructs developed to date have advanced our understanding of the levels of autonomy and enabled substantial progress, they are inadequate to articulate the complex operational activities and platform-versus-payload behaviors that the U.S. military and its supporting defense industry and academia must capture to field operational systems.
The framework does not address the higher levels of autonomy that involve collaborative activity such as swarming, and it should be expanded to encompass the required end-state capabilities for future autonomous systems that are expected to function both as individual platforms and as part of a team. John H. Northrop and Associates, in a concept study presented at the conference, stated that there is no formal tool to show the graduated scale of autonomous capabilities and control that would enable commanders in the field to choose the right systems for the right tasks.
The concept study suggests committing to develop a refined matrix of operational autonomy levels for unmanned systems, with standardized terms of reference, a sliding scale of functional autonomous tactical behaviors and descriptions, and practical application to Army tactical tasks, to alleviate confusion regarding the service’s desired autonomy objectives.
The Army’s chief of staff has specifically directed the fielding of leader/follower autonomous logistics convoy systems as well as unmanned and optionally manned prototypes of the next-generation combat vehicles by fiscal year 2021. The Army strategy stated: “Delivering RAS [robotics and autonomous systems] capabilities will not be easy. And because RAS is a relatively new range of capabilities, execution will require Army leaders to be open to new ideas and encourage bottom-up learning from soldiers and units in experimentation and the Army’s warfighting assessments.”
To accomplish these objectives, industry and government must come together to expand and redefine NIST’s Autonomy Levels for Unmanned Systems Framework construct in operational language.
This will enable the Army to operationalize its requirements terminology and to publish a militarily useful set of definitions that engineers can apply. It will also clarify operational expectations for advanced teaming, so that collaborative robotics requirements can be properly defined.
Achieving the goals of the robotics and autonomy strategy is not possible without the combined collaborative efforts of both industry and government.