Federal agencies spend billions of dollars each year to develop, acquire, and build major systems, facilities, and equipment, including fighter aircraft, nuclear waste treatment facilities, electronic baggage screening equipment, and telescopes for exploring the universe. Managing these complex acquisitions has been a long-standing challenge for federal agencies. Many of the government's most costly and complex acquisition efforts require the development of cutting-edge technologies and their integration into large and complex systems.
The inability of DoD programs to sufficiently reduce technology risk before allowing a program to enter formal systems development contributed, as measured from 2007 to 2012, to 13% cost growth in weapon systems acquisition and a 17% increase in cycle time to Initial Operational Capability (IOC) (GAO, 2013). Acquisition cycle time is the span of time from program start to deployment of IOC to the warfighter. When compared to first full estimates, total acquisition cost for the DoD major defense acquisition program (MDAP) portfolio had grown an average of 38%; correspondingly, product cycle time increased an average of 37% (GAO, 2013).
Today’s economic climate continues to threaten available DoD funds and underscores the need for streamlined but effective systems engineering. “In the face of decreasing budgets, rapidly evolving threats, and a shift in defense strategy, … it’s imperative that every dollar spent increases warfighting capability,” said VADM David Dunaway, Commander of the Naval Air Systems Command.
GAO has found that in many programs, cost growth and schedule delays resulted from overly optimistic assumptions about technology maturity. Experts have also found that many program managers and technology developers assume they can deliver state-of-the-art technology upgrades within a constrained budget before evidence is available that the technology will perform as expected in its intended environment.
A Technology Readiness Assessment (TRA) is a systematic, evidence-based process that evaluates the maturity of, and the risk associated with, critical technologies (CTs): the hardware, software, processes, or combinations thereof that are vital to the performance of a larger system or to the fulfillment of the key objectives of an acquisition program, including Major Defense Acquisition Programs (MDAPs). A TRA is a normal outgrowth of the systems engineering process and relies on data generated during the course of technology or system development. It frequently uses a maturity scale, technology readiness levels (TRLs), ordered according to the characteristics of the demonstration or testing environment under which a given technology was tested at defined points in time.
The scale consists of nine levels. Each level requires the technology to be demonstrated at incrementally higher fidelity than the previous one in terms of its form, its level of integration with other parts of the system, and its operating environment, until the final level, where the technology is in its final form and proven through successful mission operations. In general, TRLs are measured along a 1-9 scale, starting at level 1 with paper studies of the basic concept, moving to laboratory demonstrations around level 4, and ending at level 9, where the technology is tested and proven, integrated into a product, and successfully operated in its intended environment. The TRA evaluates CTs at specific points in time for integration into a larger system.
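As a minimal sketch, the 1-9 scale can be represented as a simple lookup table; the one-line summaries below paraphrase the TRL definitions given later in this section, and the helper function is illustrative only:

```python
# Minimal sketch of the 9-level TRL scale as a lookup table (short
# paraphrases of the full definitions table later in this section).
TRL_SCALE = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component/breadboard validation in laboratory environment",
    5: "Component/breadboard validation in relevant environment",
    6: "System/subsystem model or prototype demonstrated in relevant environment",
    7: "System prototype demonstrated in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def describe(trl: int) -> str:
    """Return the short definition for a TRL value (1-9)."""
    if trl not in TRL_SCALE:
        raise ValueError("TRL must be an integer from 1 to 9")
    return f"TRL {trl}: {TRL_SCALE[trl]}"
```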
In addition to TRAs, organizations use other types of assessments and reviews to examine the technical aspects of acquisition. For example, systems engineering reviews are used to examine the integration of components into systems, test reports are used to detail the outcomes of developmental tests, and manufacturing readiness assessments are used to examine the maturity of the processes that will be applied to manufacture the product.
Technology Readiness Assessment (TRA)
Acquisition programs and projects in many organizations are broadly divided into phases of technology development, product development, production, and operation activities. These phases may be further divided by decision points or stage gates, with criteria and activities that should be met or completed before committing additional resources to the project. Passing from one decision point to the next requires evidence and documentation, such as test reports, data analysis, and other assessments, to demonstrate that these criteria have been met. During the acquisition life cycle, TRAs can monitor the progress of maturing technologies and determine how ready a technology is to transition from technology development to subsequent phases.
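As a hypothetical illustration of the stage-gate idea (the criterion structure and names here are assumptions, not from DoD guidance), a decision point passes only when every exit criterion is backed by documented evidence:

```python
# Hypothetical sketch of a stage-gate check: a decision point passes
# only when every exit criterion is supported by documented evidence.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    name: str                                     # e.g., "CT-1 demonstrated in relevant environment"
    evidence: list = field(default_factory=list)  # test reports, analyses, assessments

    def is_met(self) -> bool:
        return len(self.evidence) > 0

def gate_decision(criteria: list) -> bool:
    """Commit additional resources only if all exit criteria are met."""
    return all(c.is_met() for c in criteria)
```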
A TRA focuses on the program's "critical" technologies, i.e., those that may pose major technological risk during development, particularly during the Engineering and Manufacturing Development (EMD) phase of acquisition. A Critical Technology Element (CTE) is an enabling technology that is deemed critical to meet the operational performance of the system to be acquired and that also (a) is a technology, or an application of a technology, considered either new or novel, or (b) represents an area that poses significant technological risk during product development (i.e., EMD) (DoD, 2009). Assessing the maturity of a particular technology involves determining how ready it is for operations across a spectrum of environments, or how close it is to transitioning to the user. Application to an acquisition program also includes determining the fitness of a particular technology to meet the customer's requirements and desired outcome for operations.
A technology element is considered a critical technology if it is new or novel, or used in a new or novel way, and it is needed for a system to meet its operational performance requirements within defined cost and schedule parameters. Heritage technologies are technologies that have been used successfully in operation. These technologies may be used in new ways where the form, fit, or function has changed; where the environment to which they will be exposed in their new application differs from the one for which they were originally qualified; or where process changes have been made in their manufacture.
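The two-part criticality test and the heritage-technology conditions above can be paraphrased as simple predicates; this is an illustrative sketch, and the parameter names are assumptions:

```python
# Illustrative screening rules paraphrasing the criticality test above;
# parameter names are assumptions for this sketch.
def is_critical_technology(new_or_novel: bool,
                           used_in_new_way: bool,
                           needed_for_performance: bool) -> bool:
    """Critical if the technology is new/novel (or used in a new/novel
    way) AND the system needs it to meet operational performance
    requirements within cost and schedule."""
    return (new_or_novel or used_in_new_way) and needed_for_performance

def heritage_needs_reassessment(form_fit_function_changed: bool,
                                new_environment: bool,
                                manufacturing_process_changed: bool) -> bool:
    """Heritage technology is treated like new technology when any of
    the conditions above applies."""
    return (form_fit_function_changed
            or new_environment
            or manufacturing_process_changed)
```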
The National Defense Authorization Act (NDAA) of 2006 established statutory law for the Milestone Decision Authority to certify that all critical technologies (i.e., referred to as critical technology elements) have been demonstrated in a relevant environment (i.e., TRL 6) before granting an MDAP approval to enter EMD (NDAA, 2006). With the advent of key legislation and resulting DoD acquisition reform initiatives, weapon systems programs are now required to enforce a technology development strategy that can foster true risk reduction prior to entering systems development.
The TRA is provided to the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) and forms part of the basis upon which the ASD(R&E) advises the Milestone Decision Authority (MDA) at Milestone (MS) B, or at other events designated by the MDA, in determining whether the program's technologies have acceptable levels of risk, based in part on the degree to which they have been demonstrated (including demonstration in a relevant environment), and in supporting risk-mitigation plans prepared by the Program Manager (PM). Thus, it is important to identify all appropriate technologies that bear on that determination. These technologies should be identified in the context of the program's systems engineering process, based on a comprehensive review of the most current system performance and technical requirements and design, and the program's established technical work breakdown structure (WBS).
A TRA is required by Department of Defense Instruction (DoDI) 5000.02 for MDAPs at MS B (or at a subsequent Milestone if there is no MS B). It is also conducted whenever otherwise required by the MDA. It is required for space systems by Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) memorandum Transition of the Defense Space Acquisition Board (DSAB) Into the Defense Acquisition Board, dated March 23, 2009.
TRLs are the most common measure for systematically communicating the readiness of new technologies, or of new applications of existing or heritage technologies, to be incorporated into a system or program. TRLs can serve as a helpful knowledge-based standard and shorthand for evaluating technology maturity, but they must be supplemented with expert professional judgment. NASA originally developed TRLs in the 1970s and 1980s. Other governmental agencies followed in the ensuing decades, including the U.S. Department of Defense (DoD) and Department of Energy (DOE), as well as European and international agencies.
According to DoD (DoD, n.d.; Taylor, 2007) and public law (NDAA, 2006, 2008), technologies at TRL 6 or better are considered to meet the minimum maturity level acceptable to enter system development (i.e., EMD) at Milestone B. When considering a production decision at Milestone C, DoD best practice requires technologies to be at least TRL 7 to be considered mature enough. A similar relationship applies when considering readiness for deployment: technologies not yet at TRL 8 (i.e., fully qualified, specification-compliant, and ready to enter operational test) would not be considered mature enough to enter the capstone Operational Evaluation (OPEVAL). DoD considers TRL 9 the level at which a critical technology can be considered fully mature (i.e., when the system is considered suitable and effective by the user and deployed to the field).
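These maturity floors amount to a simple lookup keyed by decision point. The sketch below encodes the thresholds stated above (TRL 6 at Milestone B, TRL 7 at Milestone C, TRL 8 for OPEVAL entry); the key labels are illustrative:

```python
# Sketch of the DoD best-practice maturity floors described above;
# the dictionary keys are illustrative labels, not official terms.
MATURITY_FLOOR = {
    "Milestone B (enter EMD)": 6,
    "Milestone C (production decision)": 7,
    "OPEVAL entry (deployment readiness)": 8,
}

def meets_floor(trl: int, decision_point: str) -> bool:
    """True if a critical technology's TRL meets the minimum acceptable
    maturity for the given decision point."""
    return trl >= MATURITY_FLOOR[decision_point]

# Example: a TRL 5 technology would not support a Milestone B decision.
assert not meets_floor(5, "Milestone B (enter EMD)")
```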
While a TRA uses TRLs as a key measure for evaluating CTs, an assessment is more than a single number at a single point in time. TRAs are a compilation of lower-level assessments that may span several years, depending on the program schedule and the complexity of the development. Assessments can help gauge the progress of a technology's development, inform project plans, and identify potential concerns for decision makers throughout acquisitions. Conducting TRAs periodically and during the earlier phases of development can identify potential concerns before risks are carried into the later and more expensive stages of system development. TRAs can also facilitate communication between technology developers, program managers, and acquisition officials throughout development and at key decision points by providing a common language for discussing technology readiness and related technical risks. Finally, TRA results can inform other assessments and planning activities, such as cost and schedule estimates, risk assessments, and technology maturation plans.
Product development activities include the continued maturation of technologies, development and refinement of the design including the preparation of detailed design drawings, construction of higher fidelity prototypes of components and systems, integration activities to ensure that the components work together, testing to ensure that performance and reliability expectations can be met, and demonstrations of manufacturing capabilities to show that the product can be consistently produced within cost, schedule, quality, and performance goals.
The primary purpose for using a prototype is to mitigate risk (cost, schedule, or performance) to product development and to the timely delivery of an affordable and compliant end item to the customer. Prototypes focus on high-risk areas considered essential to achieving system performance and important to achieving market or user introduction. The cost and relative complexity of a prototype will vary depending on the need and the significance of the risk being mitigated. Whether a small-scale, relatively simple model for desktop experiments or a larger, more complex full-scale integrated system demonstrator, the primary goal of a prototype is to yield insightful knowledge that can be used to reduce end-item risk. System prototype demonstrations not only validate the state of technology maturity for enabling technologies, but also provide for early mitigation of system/subsystem integration risk.
To reduce the risk associated with entering EMD, DoDI 5000.02 requires Requests for Proposals (RFPs) to incorporate language that prevents the award of an EMD contract if it includes technologies that have not been adequately demonstrated. Certain certifications and determinations are statutorily required before a DoD major acquisition program's Milestone B decision. The Department of Energy (DOE), Office of Environmental Management, similarly requires that TRAs and technology maturation plans (TMPs) be completed for major projects in which new critical technologies are being developed before the corresponding critical decision point.
Subject matter experts (SMEs) assess whether adequate risk reduction to enter EMD (or another contemplated acquisition phase) has been achieved for all technologies under consideration, including, specifically, demonstration in a relevant environment. The assessment should be based on objective evidence gathered during events such as tests, demonstrations, pilots, or physics-based simulations. Based on the requirements, identified capabilities, system architecture, software architecture, concept of operations (CONOPS), and/or concept of employment, the SME team evaluates whether performance in relevant environments and technology maturity have been demonstrated by the objective evidence. If demonstration in a relevant environment has not been achieved, the SMEs review the risk-mitigation steps intended by the PM and determine whether they are sufficient to reduce risk to an acceptable level.
A TRA is conducted by the Program Manager (PM) with the assistance of an independent panel of SMEs, who reconcile the program's CTEs and associated TRLs based on the level and quality of the integrated prototype demonstrations accomplished. DoD's TRA guidance includes a skeletal template for TRAs, comprising a program overview, identification of critical technologies, and an assessment of program technology risks and readiness. It also provides TRL definitions, descriptions, and supporting information.
Technology Readiness Levels (TRL)
There are several versions of the original NASA-developed TRL scale, depending on the application (software, manufacturing, etc.), but all rate a technology based on the amount of development completed, prototyping, and testing, across a range of environments from laboratory (or "breadboard") to operationally relevant.
Technology Readiness Levels are a set of nine graded definitions/descriptions of stages of technology maturity. They were originated by NASA and adapted by the DoD for use in its acquisition system. The definitions are provided below for convenience.
| Technology Readiness Level | Description | Supporting Information |
| --- | --- | --- |
| 1. Basic principles observed and reported. | Lowest level of technology readiness. Scientific research begins to be translated into applied research and development (R&D). Examples might include paper studies of a technology's basic properties. | Published research that identifies the principles that underlie this technology. References to who, where, and when. |
| 2. Technology concept and/or application formulated. | Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative, and there may be no proof or detailed analysis to support the assumptions. Examples are limited to analytic studies. | Publications or other references that outline the application being considered and that provide analysis to support the concept. |
| 3. Analytical and experimental critical function and/or characteristic proof of concept. | Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative. | Results of laboratory tests performed to measure parameters of interest and comparison to analytical predictions for critical subsystems. References to who, where, and when these tests and comparisons were performed. |
| 4. Component and/or breadboard validation in laboratory environment. | Basic technological components are integrated to establish that they will work together. This is relatively "low fidelity" compared to the eventual system. Examples include integration of "ad hoc" hardware in the laboratory. | System concepts that have been considered and results from testing laboratory-scale breadboard(s). References to who did this work and when. Provide an estimate of how breadboard hardware and test results differ from the expected system goals. |
| 5. Component and/or breadboard validation in relevant environment. | Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so the technology can be tested in a simulated environment. Examples include "high fidelity" laboratory integration of components. | Results from testing a laboratory breadboard system integrated with other supporting elements in a simulated operational environment. How does the "relevant environment" differ from the expected operational environment? How do the test results compare with expectations? What problems, if any, were encountered? Was the breadboard system refined to more nearly match the expected system goals? |
| 6. System/subsystem model or prototype demonstration in a relevant environment. | Representative model or prototype system, which is well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology's demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment. | Results from laboratory testing of a prototype system that is near the desired configuration in terms of performance, weight, and volume. How did the test environment differ from the operational environment? Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level? |
| 7. System prototype demonstration in an operational environment. | Prototype near, or at, planned operational system. Represents a major step up from TRL 6, requiring demonstration of an actual system prototype in an operational environment such as an aircraft, vehicle, or space. Examples include testing the prototype in a test bed aircraft. | Results from testing a prototype system in an operational environment. Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level? |
| 8. Actual system completed and qualified through test and demonstration. | Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine whether it meets design specifications. | Results of testing the system in its final configuration under the expected range of environmental conditions in which it will be expected to operate. Assessment of whether it will meet its operational requirements. What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before finalizing the design? |
| 9. Actual system proven through successful mission operations. | Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation (OT&E). Examples include using the system under operational mission conditions. | OT&E reports. |
Definitions
- BREADBOARD: Integrated components that provide a representation of a system/subsystem and which can be used to determine concept feasibility and to develop technical data. Typically configured for laboratory use to demonstrate the technical principles of immediate interest. May resemble final system/subsystem in function only.
- HIGH FIDELITY: Addresses form, fit and function. High fidelity laboratory environment would involve testing with equipment that can simulate and validate all system specifications within a laboratory setting.
- LOW FIDELITY: A representative of the component or system that has limited ability to provide anything but first order information about the end product. Low fidelity assessments are used to provide trend analysis.
- MODEL: A reduced scale, functional form of a system, near or at operational specification. Models will be sufficiently hardened to allow demonstration of the technical and operational capabilities required of the final system.
- OPERATIONAL ENVIRONMENT: Environment that addresses all of the operational requirements and specifications required of the final system to include platform/packaging.
- PROTOTYPE: The first early representation of the system that offers the functionality and performance expected of the final implementation. Prototypes will be sufficiently hardened to allow demonstration of the technical and operational capabilities required of the final system.
- RELEVANT ENVIRONMENT: Testing environment that simulates the key aspects of the operational environment.
- SIMULATED OPERATIONAL ENVIRONMENT: Environment that can simulate all of the operational requirements and specifications required of the final system, or a simulated environment that allows for testing of a virtual prototype to determine whether it meets the operational requirements and specifications of the final system.
TRLs should be tracked over time to ensure that a technology is maturing as expected and, if it is not, to determine whether an alternative technology should be pursued.
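As an illustrative sketch of such tracking (the dates and TRL values below are hypothetical), assessed TRLs can be compared against a planned maturation path to flag lagging technologies:

```python
# Illustrative sketch of tracking assessed TRLs against a planned
# maturation path; periods and values are hypothetical.
planned = {"2021-Q1": 4, "2022-Q1": 5, "2023-Q1": 6}   # target TRL by period
assessed = {"2021-Q1": 4, "2022-Q1": 4, "2023-Q1": 5}  # TRA results

def maturation_gaps(planned: dict, assessed: dict) -> dict:
    """Periods where assessed maturity lags the plan; a persistent gap
    may argue for pursuing an alternative technology."""
    return {period: planned[period] - assessed[period]
            for period in planned
            if assessed.get(period, 0) < planned[period]}

print(maturation_gaps(planned, assessed))  # {'2022-Q1': 1, '2023-Q1': 1}
```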
Technology hype cycle
One way to look at technology maturity is through a Gartner hype cycle: a graphic representation of the maturity, adoption, and business application of specific technologies. Gartner uses hype cycles to characterize the over-enthusiasm, or "hype," and subsequent disappointment that typically follow the introduction of new technologies.
A hype cycle in Gartner’s interpretation has five steps:
Technology Trigger: The first phase of a hype cycle is the “technology trigger” or breakthrough, product launch, or other event that generates significant press and interest.
Peak of Inflated Expectations: In the next phase, a frenzy of publicity typically generates over-enthusiasm and unrealistic expectations. Some technology applications may be successful, but typically more are failures.
Trough of Disillusionment: Technologies enter the “trough of disillusionment” because they fail to meet expectations and quickly become unfashionable. Consequently the press usually abandons the topic and the technology.
Slope of Enlightenment: Although the press may have stopped covering the technology, some businesses continue through the “slope of enlightenment” and experiment to understand the benefits and practical application of the technology.
Plateau of Productivity: Mainstream adoption starts to take off. Criteria for assessing provider viability are more clearly defined. The technology’s broad market applicability and relevance are clearly paying off.
When program stakeholders give significant attention to new research, technologies, or technology development programs or demonstrations, the targeted technology should be objectively evaluated and assessed for maturity as soon as possible before committing any significant program investment funding.
Technology maturity. Technology maturity is generically depicted as an s-curve (see the sketch below). In general, a technology's maturity can be characterized as follows:
- New technology has not reached the first tipping point in the s-curve of technology maturity.
- Improving, or emerging, technology is within the exponential development stage of the curve after the first tipping point and before the second tipping point.
- Mature technology follows the second tipping point before the curve starts down.
- Aging technology is on the downward tail.
The most universally accepted methodology for assessing the upward slope of this curve is the Technology Readiness Level (TRL) scale.
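As a minimal sketch, the s-curve can be approximated with a logistic function; the stage boundaries standing in for the tipping points here are illustrative assumptions:

```python
# Minimal sketch of the maturity s-curve using a logistic function;
# the stage boundaries ("tipping points") are illustrative assumptions.
import math

def s_curve(t: float, midpoint: float = 5.0, steepness: float = 1.0) -> float:
    """Maturity (0..1) as a function of cumulative development effort t."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

def stage(maturity: float) -> str:
    if maturity < 0.1:
        return "new"                  # before the first tipping point
    if maturity < 0.9:
        return "improving/emerging"   # exponential growth segment
    return "mature"                   # past the second tipping point
    # "Aging" lies on the downward tail, which the logistic alone does
    # not model; a declining term would be needed to capture it.

for t in (1, 5, 9):
    m = s_curve(t)
    print(f"t={t}: maturity={m:.2f} -> {stage(m)}")
```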
Selecting technology alternatives
To assess which technology to employ to satisfy requirements, various fitness criteria can be used to select, from the total spectrum of technologies available, the alternative that will best realize the sponsor's desired outcomes. Criteria that consider both the technology and the sponsor's ability to assimilate it are more likely to succeed than those that consider only the technology (as in the use of TRLs alone). Moore identifies sponsor types as innovators, early adopters, early majority, late majority, and laggards.
DoD acquisition programs are required to assess all threshold capabilities in the Capabilities Description Document for maturity; those deemed to be met with immature technology (a TRL of less than 6) will not be considered further as "threshold" and may jeopardize the program milestone decision. Programs structured to inject developing technologies can be more receptive to innovation and to less mature technologies, but in that case the risks involved should be carefully evaluated.
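One common way to apply such fitness criteria is a weighted-scoring comparison across alternatives. The sketch below is a hypothetical example; the criteria, weights, and scores are assumptions, not a DoD standard:

```python
# Hypothetical weighted-scoring sketch for comparing technology
# alternatives; criteria, weights, and scores are assumptions.
CRITERIA_WEIGHTS = {
    "technology_maturity": 0.4,   # e.g., normalized from an assessed TRL
    "sponsor_assimilation": 0.3,  # sponsor's ability to absorb the technology
    "fit_to_requirements": 0.3,
}

def fitness(scores: dict) -> float:
    """Weighted score in 0..1 for one alternative; higher is better."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

alt_a = {"technology_maturity": 0.9, "sponsor_assimilation": 0.5, "fit_to_requirements": 0.8}
alt_b = {"technology_maturity": 0.6, "sponsor_assimilation": 0.9, "fit_to_requirements": 0.7}
print(fitness(alt_a), fitness(alt_b))  # 0.75 0.72
```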
ABC alternatives
Another dimension of the selection criteria considers the capabilities of technology providers. Former Director of the Defense Information Systems Agency, Lt. Gen. Charles Croom, devised a new philosophy for acquisition called ABC. In the ABC concept, “A” stands for adopt existing technology, “B” is buy it, and “C” is create it yourself. Adopt may seem an obvious decision if the technology fits the purpose, but both the technology and the provider should be evaluated for reliability and sustainability. With the buy alternative, vendor responsiveness, capability, and viability are concerns (see the Integrated Logistics Support topic in this section). Create is the choice of last resort, but it may be the best alternative in certain circumstances. When choosing Create, consider the entire systems engineering life cycle, including operations and maintenance, as this will have an impact on life-cycle cost.
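The ABC preference order can be summarized as a simple decision rule; this sketch is illustrative, and the evaluation flags are assumptions:

```python
# Sketch of the ABC decision order described above: prefer Adopt, then
# Buy, and Create only as a last resort; the flags are assumptions.
def abc_decision(adoptable_and_sustainable: bool,
                 viable_vendor_available: bool) -> str:
    if adoptable_and_sustainable:
        return "Adopt"   # existing technology fits and its provider is reliable
    if viable_vendor_available:
        return "Buy"     # vendor responsiveness, capability, and viability vetted
    return "Create"      # weigh the full life cycle, including operations and maintenance
```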