
DARPA’s AMP to develop rapid, assuredly safe, and scalable software patches for reducing vulnerabilities of mission-critical systems

Our society’s infrastructure increasingly depends on software deployed on a wide variety of computing devices beyond commodity personal computers, such as industrial equipment, automobiles, and airplanes. Unlike commodity computers, which have short upgrade cycles and are easily replaced when they fail, these devices are intended for long service and are hard to replace. Thus, the amount of deployed software that must be maintained keeps growing, while the increasing use of telemetry on such devices potentially exposes their software to cyber-attacks.

In computer security, a vulnerability is a flaw or weakness in a system or network that a threat actor, such as an attacker, can exploit to manipulate or damage the system. Vulnerabilities can arise from unanticipated interactions among different software programs or system components, or from basic flaws in an individual program. They can allow attackers to run code, access system memory, install malware, and steal, destroy, or modify sensitive data. The aggressive WannaCry and NotPetya outbreaks, for example, were enabled by the leaked EternalBlue exploit.

Patch management consists of scanning computers, mobile devices, or other machines on a network for missing software updates, known as “patches,” and fixing the problem by deploying those patches as soon as they become available. A patch is code inserted (or “patched”) into the code of an existing software program, typically as a stop-gap measure until the next full release of the software. Software companies create patches when they learn of a vulnerability, to prevent attackers from using it to break into a network.
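
As a minimal illustration of how small a source-level patch can be, consider the hypothetical Python function below (the function and the flaw are invented for illustration, not drawn from any real product):

    # Hypothetical vulnerable function: untrusted text reaches eval(),
    # allowing an attacker to execute arbitrary code.
    def parse_setting_unsafe(text):
        return eval(text)

    # The patched version: a one-line change swaps eval() for a parser
    # that accepts only Python literals, closing the code-execution hole.
    import ast

    def parse_setting(text):
        return ast.literal_eval(text)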


In patch management, a dedicated team or automated software determines which systems need patches and when fixes must be applied. Often, patches can be installed from a central administrative computer and propagated to all other devices; in some cases they must be installed separately on individual devices, especially when they apply to software present on only a few machines. Patch management also involves determining which patches are essential and when they should be installed on a system.
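
A minimal sketch of that scan-compare-deploy loop, using a hypothetical device inventory (real patch-management tools do the same at scale):

    # Hypothetical inventory: installed package versions per device.
    INSTALLED = {
        "device-a": {"libfoo": "1.2.0"},
        "device-b": {"libfoo": "1.2.3"},
    }
    LATEST = {"libfoo": "1.2.3"}  # newest versions with fixes applied

    def scan_for_missing_patches(inventory, latest):
        """Return (device, package, installed, latest) tuples needing updates."""
        missing = []
        for device, packages in inventory.items():
            for pkg, version in packages.items():
                if latest.get(pkg, version) != version:
                    missing.append((device, pkg, version, latest[pkg]))
        return missing

    for device, pkg, old, new in scan_for_missing_patches(INSTALLED, LATEST):
        print(f"{device}: deploy {pkg} {old} -> {new}")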


To fix cybersecurity flaws in software, vendors distribute patched versions of the software. Unfortunately, even after a particular flaw has been fully understood, and a remediation approach has been developed and expressed as a source code change in the current version of the software, the ability of vendors to produce patches for all of their deployed devices in a timely, assuredly safe, and scalable manner is limited. Additional challenges arise when the exact source code version has been lost, the process for building the software from source code was not documented, and/or the original software development environment is not available. These limitations and challenges result in mission-critical software going unpatched for months to years, increasing the opportunity for attackers.


The goal of the Assured Micropatching (AMP) program is to create the capability for rapid patching of legacy binaries in mission-critical systems, including cases where the original source code version and/or build process is not available.

Patches are necessary to keep systems fixed, up to date, and protected against the security vulnerabilities and bugs present in their software. Failure to patch makes a network doubly vulnerable: not only is the vulnerability still there, but it has now also been publicized, making it more likely to be exploited by malicious users, hackers, and virus writers.

Assured Micropatching (AMP) program

Although software flaws are commonly understood and fixed at the source code level, the actual operation of a device is controlled by the binary executable image of the software, obtained from the source via the build process. The build process produces executable binary units from the source code units via compilation and then unifies these units via linking into a single executable binary image to be loaded onto the device. The culmination of the software development process for a platform is the integration stage, wherein all the binary software modules obtained via their separate build processes are functionally tested as a whole.
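
A sketch of that conventional build flow, driving a C compiler from Python (file names are illustrative, and gcc on the PATH is assumed):

    import subprocess

    SOURCES = ["main.c", "sensor.c", "comms.c"]  # separate compilation units

    # Compilation: each source unit becomes an executable binary unit (.o).
    objects = []
    for src in SOURCES:
        obj = src.replace(".c", ".o")
        subprocess.run(["gcc", "-c", src, "-o", obj], check=True)
        objects.append(obj)

    # Linking: the units are unified into a single executable binary image.
    subprocess.run(["gcc", *objects, "-o", "firmware.elf"], check=True)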


An empirically viable alternative method of fixing known flaws is the so-called manual binary micropatching process. In this process, experts manually decompile and reverse engineer the binary, then analyze the results to find the locus of the desired source code change. They then translate the source code change into a change in the binary executable code that respects the structure of the existing binary, including the original compiler’s conventions and artifacts. Further manual analysis ensures that the changes will not disrupt the code paths inherent in the binary’s baseline functionality.
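
The final step of that process, applying the byte-level change, can be pictured with a short sketch (the offset and byte values below are made up; in practice they come from careful reverse engineering):

    def apply_micropatch(path, offset, expected, replacement):
        """Overwrite bytes at `offset`, refusing to patch if the file
        differs from what the analysis expected to find there."""
        assert len(expected) == len(replacement), "micropatch must not resize code"
        with open(path, "r+b") as f:
            f.seek(offset)
            found = f.read(len(expected))
            if found != expected:
                raise ValueError(f"unexpected bytes at {offset:#x}: {found.hex()}")
            f.seek(offset)
            f.write(replacement)

    # Hypothetical fix: swap one comparison instruction for another.
    apply_micropatch("firmware.bin", 0x1A40,
                     expected=bytes.fromhex("7f09 0040"),
                     replacement=bytes.fromhex("7f89 0040"))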


However, there is no automated methodology for reasoning about the effects of a binary micropatch on the rest of the system. The changed binary code must undergo the same extensive testing accomplished during the original software’s integration stage. In the end, the data flow and control flow properties of a manual binary micropatch are subject to manual analysis by the expert, unaided by the kinds of sophisticated analyses a compiler typically performs when optimizing code for efficiency. While it side-steps the risks of from-scratch recompilation, manual binary micropatching remains an unscalable approach.
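
In the absence of automated reasoning, noninterference is checked empirically, for example by differential testing of the baseline and patched binaries on the same inputs (binary names and the input corpus below are placeholders):

    import subprocess

    def behavior(binary, test_input):
        """Capture exit code and output for one test input."""
        proc = subprocess.run([binary], input=test_input,
                              capture_output=True, timeout=10)
        return proc.returncode, proc.stdout

    corpus = [b"nominal frame", b"boundary frame", b"malformed frame"]
    for case in corpus:
        if behavior("./app_baseline", case) != behavior("./app_patched", case):
            print(f"behavioral divergence on input: {case!r}")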


For embedded software, the situation is further exacerbated by restrictive licensing and customization of build toolchains, which may be constrained to run on a single computer with an outdated operating system and programming environment. In a typical scenario, a flawed version of a component library is included in an embedded software development kit (SDK) and becomes part of many different firmware images. The version of the library included in the SDK may be modified to integrate with other components of the SDK or to address the embedded platform’s features and constraints. When flaws are fixed in the stock version of the library, the SDK version may be left behind, especially when it has been modified. As a result, the availability of a well-understood patch for the stock version does not automatically entail the availability of patches for a multitude of SDK-derived binary firmwares. In such cases, there is no viable automated path for rebuilding these firmwares, and no scalable way of micropatching them.
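
One practical consequence is that even finding every affected firmware image is a search problem. A simple, hypothetical approach scans images for a byte signature taken from the flawed library routine (signature bytes and paths are invented for illustration):

    from pathlib import Path

    # Hypothetical signature: a few instruction bytes unique to the
    # vulnerable routine in the flawed library build.
    VULN_SIGNATURE = bytes.fromhex("9421 ffe0 7c08 02a6")

    for image in Path("firmware").glob("*.bin"):
        if VULN_SIGNATURE in image.read_bytes():
            print(f"{image.name}: appears to embed the flawed library version")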


Today’s software methodologies and tools do not support systematic, assured modification of binaries, such as synthesizing a change for an existing binary from a source-code-level description and safely applying and analyzing the change. Instead, the binary is regarded as an opaque end-point of the build process, to be discarded and re-created from scratch whenever changes need to be applied. This approach disregards the growing footprint of deployed legacy binaries and the difficulty of preserving build processes for binaries deployed in long-serving mission-critical infrastructure assets. The AMP program seeks to address these gaps in the current software development paradigm, elevating the assured manipulation of existing binaries to the first-class status currently enjoyed by compiler analysis for performance optimization and software verification.

The AMP program will address rapid patching of software in mission-critical systems by combining techniques from compiler research, binary decompilation and analysis, and program verification. Compiler research to date treats creating executable code as a clean-slate task for every compilation unit, without regard for restrictions imposed by the binary environment into which the resulting compiled code must be re-integrated. Decompilation has not made effective use of the composable, semantically equivalent transformations that drive state-of-the-art compilation research. And program verification focuses on proving behaviors of programs with respect to their specifications, rather than proving intended behavioral equivalence between patched and unpatched binary versions. AMP will create challenges to spur collaboration among experts in these areas, to enable assured modification of binaries via automated micropatching.

DARPA is soliciting innovative research proposals in the area of creating targeted binary security patches (micropatches) to repair legacy binaries of mission-critical systems, with strong guarantees that a patch will not impact the functions of the system. AMP aims to create new capabilities to analyze, modify, and fix legacy software in binary form, producing assured, targeted micropatches for known security flaws in existing binaries. Micropatches change the fewest possible bytes needed to achieve their objective, which minimizes potential side effects and should enable proofs that a patch preserves the original baseline functionality of the system. With such proofs, the time to test and deploy a patched system should shrink from months to days.
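
The “fewest possible bytes” property is directly measurable. A small sketch diffs a candidate patched image against its baseline (file names are illustrative):

    def changed_offsets(baseline: bytes, patched: bytes):
        """List file offsets where the patched image differs from the baseline."""
        assert len(baseline) == len(patched), "a micropatch preserves image size"
        return [i for i, (a, b) in enumerate(zip(baseline, patched)) if a != b]

    with open("firmware_v1.bin", "rb") as f:
        base = f.read()
    with open("firmware_v1_patched.bin", "rb") as f:
        new = f.read()

    diff = changed_offsets(base, new)
    print(f"{len(diff)} bytes changed, first offsets: {[hex(i) for i in diff[:8]]}")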


Proposed research should investigate innovative approaches that enable revolutionary advances in science, devices, or systems. Specifically excluded is research that primarily results in evolutionary improvements to the existing state of practice. The program will produce theories, technologies, and formal proof techniques leading to experimental prototype(s) that will demonstrate the use of targeted micropatches for repairing legacy mission-critical binaries with strong guarantees that the patch will not impact the baseline functions of the system. It is expected that these prototypes will provide a starting point for technology transition to mission-critical software for cyber physical system domains.


To achieve this goal, the AMP program seeks to address gaps in the current software development paradigm through breakthroughs in and novel approaches to technical challenges, including but not limited to:

  • Identifying modular units in executable binary images, and identifying modules’ interfaces, interactions, and linking artifacts to enable subsequent assured relinking and re-integration of patched binary modules (a sketch of this step follows the list);
  • Decompiling the executable binary code into forms suitable for automatically situating a patch for a known security flaw existing in the binary;
  • Generating minimal-change binary micropatches for existing binary images, rigorously reasoning about their effects, and testing these effects to verify noninterference of the changes with the binary’s baseline functionality; and
  • Using available sources of information, such as source code and binary samples, to recover missing relevant parts of the source code and the build process.
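
As a taste of the first challenge, the sketch below reads just the header of an ELF image with Python’s standard struct module; identifying actual modules would go on to walk sections, symbols, and relocations (the file name is illustrative):

    import struct

    with open("firmware.elf", "rb") as f:
        ident = f.read(16)                 # e_ident: magic, class, endianness
        assert ident[:4] == b"\x7fELF", "not an ELF image"
        is_64 = ident[4] == 2
        endian = "<" if ident[5] == 1 else ">"
        # Remaining header fields: type, machine, version, entry, phoff,
        # shoff, flags, ehsize, phentsize, phnum, shentsize, shnum, shstrndx.
        fmt = endian + ("HHIQQQIHHHHHH" if is_64 else "HHIIIIIHHHHHH")
        fields = struct.unpack(fmt, f.read(struct.calcsize(fmt)))
        print(f"type={fields[0]} machine={fields[1]} entry={fields[3]:#x}")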


AMP is a 48-month program divided into three Technical Areas (TAs): TA1 Goal-driven Decompilation, TA2 Assured Recompilation, and TA3 Evaluation. It is organized into three phases: an 18-month Phase 1, followed by an 18-month Phase 2, and concluding with a 12-month Phase 3.

AMP awards

GRIMM, a cybersecurity research firm, has been awarded a DARPA subcontract to research Assured MicroPatching (AMP). The research is intended to advance the generation of custom security patches, with the added benefit of improving the binary analysis tooling required for such cybersecurity research.


GRIMM is part of the AMP Technical Area 3 (TA3) team, which is responsible for developing vulnerability-patching challenges that test the work of the TA1 and TA2 performers, who in turn are charged with taking vulnerability research, patch generation, and patch testing to a new level. GRIMM’s role on the team will help validate the cybersecurity elements of the project. To that end, GRIMM will provide a heavy-trucking Electronic Control Unit (ECU) simulator that emulates a PowerPC system and leverages real-world firmware.

GRIMM’s Principal Security Researcher, Matthew Carpenter, says, “The virtual ECU will allow the performers access to ECUs wherever they are, without needing to manage custom and closed hardware, and will support power systems and networking modules.” This work is based on research and PowerPC emulation GRIMM developed in 2019. Carpenter also states, “In addition to making software patching more of a reality, this project is advancing the very tools used to identify cybersecurity vulnerabilities, making high-tech bug hunting easier and more powerful.”


References and Resources also include:

https://www.businesswire.com/news/home/20201112005028/en/GRIMM-Wins-DARPA-Award-to-Research-Assured-MicroPatching-AMP
