Autonomy refers to a system’s ability to accomplish goals independently, or with minimal supervision from human operators, in environments that are complex and unpredictable. Autonomous systems are increasingly critical to several current and future Department of Defense (DoD) mission needs. For example, the U.S. Army Robotics and Autonomous Systems (RAS) strategy report for 2015-2040 identifies a range of capability objectives, including enhanced situational awareness, cognitive workload reduction, force protection, cyber defense, and logistics, that rely on autonomous systems and higher levels of autonomy.
Tremendous advances have been made in the last decade in constructing autonomous Cyber Physical Systems (CPS), as evidenced by the proliferation of a variety of unmanned systems: air, ground, sea, and undersea vehicles. These advances have been driven by innovations in several areas, such as sensor and actuator technologies, computing technologies, control theory, design methods and tools, and modeling and simulation technologies. Despite these advances, deployment and broader adoption of such systems in safety-critical DoD applications remains challenging and controversial.
Several factors impede the deployment and adoption of autonomous systems.
Historically, assurance has been approached through design processes that follow rigorous safety standards during development, with compliance demonstrated through system testing. However, these standards were developed primarily for human-in-the-loop systems and are bounded in scope: they do not extend to advanced levels of autonomy, where a system's behavior depends on its memory of received stimuli. Current assurance approaches are predicated on the assumption that, once deployed, the system does not learn and evolve.
The goal of the Assured Autonomy program is to create technology for continual assurance of Learning-Enabled Cyber Physical Systems (LE-CPSs). Continual assurance is defined as an assurance of the safety and functional correctness of the system, provided provisionally at design time and continually monitored, updated, and evaluated at operation time as the system and its environment evolve. An LE-CPS is defined as a system composed of one or more Learning-Enabled Components (LECs). An LEC is a component whose behavior is driven by “background knowledge” acquired and updated through a “learning process,” while operating in a dynamic and unstructured environment. This definition generalizes and admits a variety of popular machine learning approaches and algorithms (e.g., supervised learning for training classifiers, reinforcement learning for developing control policies, algorithms for learning system dynamics). The generalization is intentional, to promote abstractions and tools that can be applied to different types and applications of data-driven machine learning algorithms in Cyber Physical Systems (CPSs) to enhance their autonomy.
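To make the notion of operation-time monitoring concrete, the sketch below shows one common run-time assurance pattern (a Simplex-style architecture) in which the output of an LEC is checked against a safety envelope at each step, with a verified fallback controller taking over when the check fails. This is purely illustrative and not drawn from the program itself; all names (`lec_controller`, `safe_fallback`, the envelope bound) are hypothetical.

```python
# Illustrative Simplex-style run-time monitor for a learning-enabled
# component (LEC). The LEC's command is re-checked at operation time
# against a conservatively derived safety envelope; a simple verified
# baseline controller overrides it on violation. All names and bounds
# here are hypothetical, chosen only to demonstrate the pattern.

def lec_controller(speed: float) -> float:
    """Stand-in for a learned control policy (e.g., an RL policy)."""
    return speed * 1.5  # aggressive acceleration command

def safe_fallback(speed: float) -> float:
    """Simple verified baseline controller used when the monitor trips."""
    return min(speed, 1.0)

def within_safety_envelope(command: float, max_cmd: float = 2.0) -> bool:
    """Design-time assurance is provisional; this predicate re-evaluates
    the safety condition at operation time."""
    return abs(command) <= max_cmd

def monitored_step(speed: float) -> float:
    """One control step: accept the LEC's output only if it is safe."""
    cmd = lec_controller(speed)
    if within_safety_envelope(cmd):
        return cmd
    return safe_fallback(speed)  # override the LEC output
```

In this pattern the monitor and fallback are simple enough to assure with conventional methods, so the overall system inherits their safety guarantee even though the LEC itself is hard to verify.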
To ground the Assured Autonomy research objectives, the program will prioritize challenge problems in the militarily relevant autonomous vehicle space. However, it is anticipated that the tools, toolchains, and algorithms created will be relevant to other LE-CPSs. The resulting technology will take the form of a set of publicly available tools, integrated into LE-CPS design toolchains, for use in both the commercial and defense sectors.