Summary
In the Moment (ITM) aims to support the development of algorithms that are trusted to independently make decisions in difficult domains. Military operations – such as mass casualty triage and disaster relief – require complex and rapid decision-making in dynamic situations where there is often no human consensus and no ground truth.
By identifying key attributes underlying trusted human decision-making in dynamic settings and computationally representing those attributes, the ITM program seeks to generate a quantitative alignment framework between a trusted human decision-maker and an algorithm. This alignment framework will form the basis for understanding algorithm alignment with trusted humans, enable algorithms to incorporate key trusted attributes, and support the development of algorithms that can be tuned to align with specific, trusted humans.
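For illustration only, one way such a quantitative alignment score might be computed is as a similarity measure between attribute weight vectors of a trusted human and an algorithm. The attribute names, weightings, and the cosine-similarity metric below are all assumptions for this sketch; the ITM program does not specify a particular metric.

```python
import math

def alignment_score(human_attrs: dict, algo_attrs: dict) -> float:
    """Illustrative alignment: cosine similarity between the attribute
    weight vectors of a trusted human decision-maker and an algorithm.
    (Hypothetical metric -- not part of the ITM program definition.)"""
    keys = sorted(set(human_attrs) | set(algo_attrs))
    h = [human_attrs.get(k, 0.0) for k in keys]
    a = [algo_attrs.get(k, 0.0) for k in keys]
    dot = sum(x * y for x, y in zip(h, a))
    norm = math.sqrt(sum(x * x for x in h)) * math.sqrt(sum(y * y for y in a))
    return dot / norm if norm else 0.0

# Hypothetical decision-making attributes (names are assumptions)
human = {"risk_tolerance": 0.7, "urgency": 0.9, "resource_conservation": 0.3}
algo = {"risk_tolerance": 0.6, "urgency": 0.8, "resource_conservation": 0.4}
print(round(alignment_score(human, algo), 3))
```

A score near 1.0 would indicate that the algorithm weighs the key attributes similarly to the trusted human; tuning an algorithm toward a specific decision-maker could then be framed as maximizing such a score.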
A dedicated team will provide ethical, legal, and social issues guidance, as well as consider future policy options for autonomous decision-making systems.
ITM will investigate these decision-making problems in the context of two specific domains: small unit triage in austere environments in Phase 1 and mass casualty triage in Phase 2.
The ITM program will encourage cross-disciplinary research in the fields of decision-making, cognitive science, experimental psychology, simulation environments, artificial intelligence, machine learning, algorithm evaluation, and decision-making policy for autonomous systems.
Resources
When Should Machines Decide? | Ep 83 (Voices from DARPA podcast)
ITM Proposers Day presentation
Opportunity: HR001122S0031: In the Moment (ITM)