Talk about finding a needle in a haystack.
The Defense Advanced Research Projects Agency says it wants to develop sophisticated software-analysis tools that can find faults in the key algorithms underpinning major software packages – for example, the algorithms that implement hash tables or perform password checks.
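A password check is a good illustration of the kind of flaw DARPA has in mind: the algorithm can be functionally correct yet still leak information through its running time. The sketch below (illustrative code, not from DARPA) contrasts a comparison that exits at the first mismatch – whose timing reveals how many leading characters of a guess are right – with a constant-time variant that always inspects every character:

```java
public class PasswordCheck {
    // Early-exit comparison: functionally correct, but its running time
    // depends on how many leading characters match, leaking information
    // to an attacker who can measure response times.
    static boolean leakyEquals(char[] secret, char[] guess) {
        if (secret.length != guess.length) return false;
        for (int i = 0; i < secret.length; i++) {
            if (secret[i] != guess[i]) return false; // exits early on mismatch
        }
        return true;
    }

    // Constant-time comparison: XORs every character pair and ORs the
    // results, so running time does not depend on where a mismatch occurs.
    static boolean constantTimeEquals(char[] secret, char[] guess) {
        if (secret.length != guess.length) return false;
        int diff = 0;
        for (int i = 0; i < secret.length; i++) {
            diff |= secret[i] ^ guess[i];
        }
        return diff == 0;
    }

    public static void main(String[] args) {
        char[] secret = "hunter2".toCharArray();
        System.out.println(leakyEquals(secret, "hunter2".toCharArray()));        // true
        System.out.println(constantTimeEquals(secret, "hunter2".toCharArray())); // true
        System.out.println(constantTimeEquals(secret, "hunter3".toCharArray())); // false
    }
}
```

Note that both methods return identical answers for every input; only their time behavior differs, which is exactly why traditional correctness-oriented analysis misses this class of vulnerability.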
“As new defensive technologies make old classes of vulnerability difficult to exploit successfully, adversaries move to new classes of vulnerability. Vulnerabilities based on flawed implementations of algorithms have been popular targets for many years. However, once new defensive technologies make vulnerabilities based on flawed implementations less common and more difficult to exploit, adversaries will turn their attention to vulnerabilities inherent in the algorithms themselves,” DARPA stated.
Enter the Space/Time Analysis for Cybersecurity (STAC) program, which DARPA announced this week. STAC is focused on vulnerabilities inherent in algorithms – specifically, vulnerabilities that stem from an algorithm's space and time resource usage. Some of these vulnerabilities enable adversaries to mount algorithmic complexity attacks; others enable adversaries to mount side-channel attacks, the agency stated.
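An algorithmic complexity attack exploits the gap between an algorithm's average-case and worst-case resource usage. A classic example involves Java's `String.hashCode()`: the two-character strings "Aa" and "BB" hash to the same value, so concatenating n such blocks yields 2^n distinct keys that all collide – degrading a naively chained hash table from constant-time to linear-time lookups. The generator below is a minimal sketch of how such colliding inputs are constructed:

```java
import java.util.ArrayList;
import java.util.List;

public class CollisionKeys {
    // "Aa" and "BB" have the same String.hashCode() (both 2112). Because
    // String hashing is a polynomial over equal-length blocks, every
    // concatenation of n such blocks also collides, giving 2^n distinct
    // keys that all land in one bucket of a chained hash table.
    static List<String> collidingKeys(int blocks) {
        List<String> keys = new ArrayList<>();
        keys.add("");
        for (int i = 0; i < blocks; i++) {
            List<String> next = new ArrayList<>();
            for (String k : keys) {
                next.add(k + "Aa");
                next.add(k + "BB");
            }
            keys = next;
        }
        return keys;
    }

    public static void main(String[] args) {
        // 2^3 = 8 distinct strings, one shared hash code.
        for (String k : collidingKeys(3)) {
            System.out.println(k + " -> " + k.hashCode());
        }
    }
}
```

An attacker who submits such keys forces worst-case behavior in the server's data structure – a flaw in the algorithm's resource usage, not in any particular implementation.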
DARPA went on to state: “The STAC program is concerned with resource usage vulnerabilities in programs expressed in Java bytecode. Vulnerabilities in programs expressed in languages other than these are out of scope. Although resource usage vulnerabilities also occur in software written in other languages, the fundamental nature of these vulnerabilities is not language-specific, and focusing on a narrow language selection will make the effectiveness of the new techniques developed in the STAC program more easily comparable. Focusing on bytecode rather than source code will make the new analysis techniques applicable in scenarios where analysts must examine third-party software for which they do not possess source code.”
DARPA says that STAC is looking for two main advances: scale and speed. “Scale refers to the need for analyses that are capable of considering larger pieces of software, from those that implement network services typically in the range of hundreds of thousands of lines of source code to even larger systems comprising millions or tens of millions of lines of code. Speed refers to the need to increase the rate at which human analysts can analyze software with the help of automated tools, from thousands of lines of code per hour to tens of thousands, hundreds of thousands, or millions of lines of code per hour.”
DARPA said that because algorithmic resource usage vulnerabilities are the consequence of problems inherent in algorithms themselves rather than of traditional implementation flaws, traditional defensive technologies such as Address Space Layout Randomization, Data Execution Prevention, Reference Count Hardening, Safe Unlinking, and even type-safe programming languages do nothing to mitigate them. Finding these vulnerabilities in software will require new and different kinds of program analysis.
“Reasoning about the behavior of software is notoriously difficult. In fact, it is provably impossible to construct a perfect automated tool that can always provide a correct answer to any non-trivial question about program behavior. Despite this, there remains the potential for imperfect but practically useful automated tools that provide answers to some pertinent questions with an acceptable level of accuracy. While entirely manual solutions may be too slow for our purposes, and fully-automated analyses too inaccurate, combined semi-automated analyses may offer a solution,” DARPA stated.