Program Summary
Embedded computing systems are ubiquitous in critical infrastructure, vehicles, smart devices, and military systems. Conventional wisdom once held that cyberattacks against embedded systems were not a concern, since such systems seldom had the traditional network connections over which an attack could occur. However, attackers have learned to bridge the air gaps that surround the most sensitive embedded systems, and network connectivity is now being extended to even the most remote embedded systems. In short, embedded systems are now subject to cyberattack, either as the end goal of the assailant or as a means to a greater end, and there is a critical need to protect and defend embedded systems in a cyber context. The mechanisms currently employed to secure embedded systems include developing software using cyber best practices, adapting mechanisms from information technology (IT) systems, and penetration testing followed by patching. Unfortunately, these methods have proven generally ineffective.
Critical systems are built through requirements-based engineering, and it is an accepted axiom of systems engineering that requirements are positive, testable statements about the system: statements on the system's functional behaviors and non-functional properties, often captured as "shall" statements. This style of engineering has proven ineffective for engineering cyber-resilient systems because cyber requirements are often statements about what the system should not do, i.e., "shall not" statements, and testing can demonstrate the presence of a required behavior but not the absence of a prohibited one.
The goal of CASE is to develop the necessary design, analysis, and verification tools to allow systems engineers to design in cyber resiliency and manage tradeoffs as they do other non-functional properties when designing complex embedded computing systems. Cyber resiliency means the system is tolerant to cyberattacks in the same way that safety-critical systems are tolerant to random faults: they recover and continue to execute their mission function. Achieving this goal requires research breakthroughs in:
- the elicitation of cyber resiliency requirements before the system is built;
- the design and verification of systems when requirements are not testable (i.e., when they are expressed as "shall not" statements);
- tools to automatically adapt software to new non-functional requirements; and
- techniques to scale and provide meaningful feedback from analysis tools that reside low in the development tool chain.
The CASE BAA is available at https://www.fbo.gov/index?s=opportunity&mode=form&id=3353e66b84ebc0e87b2e5e263c3da5a4&tab=core&_cview=1