Defense Advanced Research Projects Agency
Tagged Content List

Algorithms

A process or rule set used for calculations or other problem-solving operations
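As a minimal illustration of this definition (an example added here, not part of the DARPA tag description), the Euclidean algorithm is a small, fixed rule set for computing the greatest common divisor of two integers:

def gcd(a: int, b: int) -> int:
    # Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)
    # until the remainder is zero; the last nonzero value is the GCD.
    while b:
        a, b = b, a % b
    return abs(a)

print(gcd(48, 18))  # 6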

Showing 11 results for Algorithms + Trust
May 17, 2019
DARPA Conference Center
The Strategic Technology Office is holding a Proposers Day meeting to provide information to potential proposers on the objectives of the new Air Combat Evolution (ACE) program and to facilitate teaming. The goal of ACE is to automate air-to-air combat, enabling reaction times at machine speeds and freeing pilots to concentrate on the larger air battle. Turning aerial dogfighting over to AI is less about dogfighting, which should be rare in the future, and more about giving pilots the confidence that AI and automation can handle a high-end fight.
June 8, 2018
Executive Conference Center
DARPA’s Defense Sciences Office (DSO) is hosting a Proposers Day to provide information to potential proposers on the objectives of the Systematizing Confidence in Open Research and Evidence (SCORE) program. SCORE aims to develop and deploy automated tools to assign "confidence scores" to different social and behavioral science (SBS) research results and claims. Confidence scores are quantitative measures that should enable a DoD consumer of SBS research to understand the degree to which a particular claim or result is likely to be reproducible or replicable. The event will be available via a live webcast for those who would like to participate remotely.
In order to transform machine learning systems from tools into partners, users need to trust their machine counterparts. One component of building a trusted relationship is knowledge of a partner's competence: an accurate insight into the partner's skills, experience, and reliability in dynamic environments. While state-of-the-art machine learning systems can perform well when their behaviors are applied in contexts similar to their learning experiences, they are unable to communicate their task strategies, the completeness of their training relative to a given task, the factors that may influence their actions, or their likelihood to succeed under specific conditions.
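As a rough illustration of that gap, the sketch below shows a model that reports not only a prediction but also a crude self-assessment of whether an input resembles anything it was trained on. The nearest-neighbor distance heuristic and the threshold are assumptions made for this example only; they are not a method described by DARPA.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Toy training data: two clusters the model has actually "experienced".
X_train = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
y_train = np.array([0] * 100 + [1] * 100)

clf = LogisticRegression().fit(X_train, y_train)

# Competence heuristic (an assumption for this sketch, not a DARPA method):
# mean distance to the nearest training examples as a proxy for "was this
# context part of my learning experience?"
nn = NearestNeighbors(n_neighbors=5).fit(X_train)

def predict_with_competence(x, distance_threshold=2.0):
    dist, _ = nn.kneighbors([x])
    familiar = dist.mean() < distance_threshold
    note = "familiar context" if familiar else "outside training experience"
    return clf.predict([x])[0], note

print(predict_with_competence(np.array([0.2, -0.1])))   # near the training data
print(predict_with_competence(np.array([20.0, 20.0])))  # far from anything seen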
As new defensive technologies make old classes of vulnerability difficult to exploit successfully, adversaries move to new classes of vulnerability. Vulnerabilities based on flawed implementations of algorithms have been popular targets for many years. However, once new defensive technologies make vulnerabilities based on flawed implementations less common and more difficult to exploit, adversaries will turn their attention to vulnerabilities inherent in the algorithms themselves.
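A generic example of a flaw that lives in the algorithm rather than in its implementation (an illustration added here, not one drawn from any DARPA program) is worst-case algorithmic complexity: a routine can be correct and memory-safe, yet an attacker who controls the input can force it into its slowest behavior.

import time

def has_duplicates_naive(items):
    # Correctly implemented, but O(n^2): the weakness is inherent in the
    # algorithm, not a coding mistake such as a buffer overflow.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_fast(items):
    # Same contract in O(n) using a set.
    return len(set(items)) != len(items)

# Attacker-chosen input: all-distinct values force the naive version to
# perform every comparison, turning a small request into a lot of work.
payload = list(range(5_000))

for fn in (has_duplicates_fast, has_duplicates_naive):
    start = time.perf_counter()
    fn(payload)
    print(fn.__name__, f"{time.perf_counter() - start:.3f}s")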
The Department of Defense (DoD) often leverages social and behavioral science (SBS) research to design plans, guide investments, assess outcomes, and build models of human social systems and behaviors as they relate to national security challenges in the human domain. However, a number of recent empirical studies and meta-analyses have revealed that many SBS results vary dramatically in terms of their ability to be independently reproduced or replicated, which could have real-world implications for DoD’s plans, decisions, and models. To help address this situation, DARPA’s Systematizing Confidence in Open Research and Evidence (SCORE) program aims to develop and deploy automated tools to assign "confidence scores" to different SBS research results and claims.
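To make the idea of a confidence score concrete, the toy function below maps a few reproducibility-relevant features of a published claim onto a single number between 0 and 1. The features, weights, and formula are hypothetical placeholders invented for this illustration; they do not describe the tools SCORE is developing.

def toy_confidence_score(sample_size, p_value, preregistered, replications):
    # Hypothetical illustration only: combine a few features of a published
    # claim into a 0-1 "confidence score". Weights and cutoffs are arbitrary.
    score = 0.0
    score += 0.3 * min(sample_size / 1000, 1.0)       # larger samples score higher
    score += 0.3 * (1.0 - min(p_value / 0.05, 1.0))   # stronger evidence scores higher
    score += 0.2 * (1.0 if preregistered else 0.0)    # pre-registration adds credit
    score += 0.2 * min(replications / 3, 1.0)         # independent replications add credit
    return round(score, 2)

# A small, marginally significant, unreplicated study versus a large,
# pre-registered, replicated one.
print(toy_confidence_score(sample_size=40, p_value=0.049, preregistered=False, replications=0))
print(toy_confidence_score(sample_size=1200, p_value=0.001, preregistered=True, replications=2))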