The Importance of Early Prototyping in Defense Research, Engineering, Acquisition, and Sustainment

Summary

The U.S. Department of Defense (DoD) operates across the entire life cycle of capability development, including exploratory science, research and development (R&D), production and deployment, operations and sustainment, and disposal. The processes used by the DoD to acquire systems are based on the ability to plan and budget for long-term, stable mass production of solutions addressing invariable and predictable scenarios. However, current global political conditions, the rate of technological change, and the application of emerging technologies create uncertainty in the future security environment.

Prototyping ideas throughout a system’s life cycle offers an opportunity to continuously keep pace with change and uncertainty. A survey of the literature reveals a desire to prototype early in the development process as a means of generating knowledge while reducing risk. However, budget and legal authorities tie prototyping to technology maturity assessment and to accelerating the delivery of near-final, full-up systems. Missing is a complementary strategy for early prototyping.

This article presents an overview of prototyping in the DoD and promotes consideration of, and funding for, physical and virtual prototyping throughout the life cycle as an effective way to proactively test hypotheses, engage stakeholders early, reduce risk, prune decision paths, and ultimately deliver the right capability faster to Warfighters.

Introduction

Adversaries of the United States are competent, quick, and effective at fielding emerging technologies, which presents an impediment to maintaining overmatch both now and in the future [1–5]. The processes used by the DoD to develop and acquire its capabilities are all based on the ability to plan and budget for long-term, stable mass production of solutions addressing invariable and predictable scenarios [6]. The result is defense systems designed to requirements that presume long-term accuracy and certainty and that take years to design and build because of their inherent complexity. However, current global political conditions, the rate of technological change, and the application of emerging technologies create uncertainty in the future security environment; “future warfare will feature constant myriad technological advances that come at a tempo that disallows mass production” [7]. The decision-making processes that worked for mass-producing solutions to accurately predictable conditions may be incompatible with the conditions under which defense systems are envisioned to operate in the future and the speed with which these systems must be developed and integrated.

An increasingly effective path to ensuring faster fielding of the right solution is through an approach that engages technologists with users to learn their pain points, puts form to function early in the development process, tests hypotheses through research experiments, quickly gets hardware into the hands of a user, and collects feedback to assess whether the product or service solution is on the path to meeting user needs [8–10]. This approach can occur as early as the ideation stage, and the early form-to-function effort can look like a sketch, computer code with only basic functionality, a hardware mockup, a controlled scientific experiment, or some other mechanism to tactilely communicate an idea or concept before proceeding into the next phase of development. Colloquially, this process is called prototyping, and the object of the process is called a prototype. Prototyping generates information that supports a decision; if faster decision-making is the end, then prototyping offers the way and prototypes are the means. A classic example of using nonfunctional prototype hardware to gain information is when the Industrial Design team at Apple made wooden iPhone mock-ups to determine the optimum size of icons for a human finger [11].

Issues surrounding DoD prototyping activities have recently arisen—definitions are inconsistent and budgets for prototyping are largely constrained to the later stages of technology maturity, revealing risk aversion in the acquisition system. Further, although there now exists a middle acquisition tier in which prototyping takes center stage to accelerate delivery of capability to the Warfighter, the Government Accountability Office (GAO) recently found that “[the] DoD has yet to fully determine how it will oversee middle-tier acquisition programs, including what information should be required to ensure informed decisions about program selection and how to measure program performance” [12].

A review of the John S. McCain National Defense Authorization Act (NDAA) for fiscal year 2019 (FY19) [13] shows that only 2 of 11 prototyping programs have been authorized as applied research—$10M to accelerate Army railgun development and prototyping and $160M for Innovative Naval Prototypes (INPs) applied research, where INPs “comprise potentially game-changing or disruptive technologies…developed around anticipated Naval needs rather than in response to established requirements” [14]. All other such FY19 program authorizations ($678M) fall under advanced technology development and advanced component development and prototype budget activities. Overall, there is disagreement across the research and acquisition spectrum on what even constitutes a “prototype.”

What Is a Prototype Anyway?

The word “prototype” originates from the Greek “prōtotupos,” which literally means first impression, mold, or pattern or the first from which all subsequent copies will derive [15]. Within the DoD alone, a multitude of definitions exists, each influenced by a product’s life cycle stage, the level of system hierarchy for which the prototype is built, the type of knowledge the prototype aims to uncover, and the final disposition of the prototype [16–19].

The Defense Acquisition University (DAU), the go-to source for defense acquisition professional training, defines a prototype as “a preliminary type, form, or instance of a system or system element that serves as a model for what comes later. They can be at the system level or can focus on subsystems or components” [20–24].

The DAU’s continuous learning module on Prototyping and Experimentation (CLE082) uses the definition of a prototype model from DAU Acquipedia [20]:

A physical or virtual model used to evaluate the technical or manufacturing feasibility or military utility of a particular technology or process, concept, end item, or system. Prototype models have various types depending [on] the phase of the system life cycle. Prototype models range from the early development phase to production ready as from breadboard, brassboard, engineering development model to production repetitive models.

Note that a breadboard refers to an experimental device used in prototyping electronic circuits where components are “plugged” into the board. A breadboard enables temporary prototypes and circuit design experimentation in the laboratory [22]. A brassboard refers to an experimental or demonstration test model intended for field testing outside the laboratory environment. A brassboard follows the breadboard prototyping stage and contains both the functionality and approximate physical configuration of the final operational product [21].

The DoD Prototyping Guidebook, a living document that consolidates prototyping approaches, best practices, and recommendations into a single guide, offers this definition: “a model (e.g., physical, digital, conceptual, and analytical) built to evaluate and inform its feasibility or usefulness” [25].

With the Other Transaction (OT) Authority contracting mechanism becoming the preferred approach for certain eligible R&D efforts, the Office of the Under Secretary of Defense for Acquisition and Sustainment (OUSD[A&S]) defines a prototype project in the context of an OT in their Other Transactions Guide [26]:

A prototype project addresses a proof of concept, model, reverse engineering to address obsolescence, pilot, novel application of commercial technologies for defense purposes, agile development activity, creation, design, development, demonstration of technical or operational utility, or combinations of the foregoing. A process, including a business process, may be the subject of a prototype project. A prototype may be physical, virtual, or conceptual in nature.

The MITRE Systems Engineering Guide, which is based on MITRE’s application of systems engineering across the federally funded research and development centers (FFRDCs) it operates for the U.S. government, offers a similarly broad definition spanning the life cycle [27]:

Prototyping is a practice in which an early sample or model of a system, capability, or process is built to answer specific questions about, give insight into, or reduce uncertainty or risk in many diverse areas. This includes exploring alternative concepts, technology maturity assessments, requirements discovery or refinement, design alternative assessments, and performance or suitability issues.

Collectively, these definitions cover a wide range of system hierarchy, element fidelity, product realization, and life cycle phase. Yet the conventional mindset in DoD research, development, test, and evaluation (RDT&E) of a “prototype” is not one of permission to proactively assess the realm of the possible. Rather, it is one of either a full-scale working model that closely resembles the final product that will be mass produced or a low-volume, reactive solution to a specific capability need. The root cause of these differing perspectives is unclear.

A search of DoD RDT&E regulations reveals further inconsistency in perspectives. The U.S. Army Regulation on Test and Evaluation Policy defines a prototype as “an article in final form employing standard parts and representative of articles to be produced on a production line with production tooling” [28].

The U.S. Air Force Test and Evaluation Guide defines a prototype as “a model suitable for evaluation of design, performance, and production potential. Note: The Air Force uses prototypes during development of a technology project or acquisition program for verification or demonstration of technical feasibility. Prototypes are not usually representative of the final production item” [29]. While the guide does not apply to science and technology (S&T) programs that operate pre-Milestone A, it recommends that S&T activities follow its intent as much as possible and tailor the application of its principles.

Neither the Navy’s Operational Test Director’s Manual [30] nor the DoD Test and Evaluation Management Guide [31], a generic reference written for all personnel involved in DoD acquisition management, provides a definition.

The Marine Corps’ Integrated Test and Evaluation Handbook also does not provide a definition but refers to using prototypes in pre-production qualification and production prove-out tests [32]. So, across Service RDT&E regulations, prototypes are considered both as near-final articles and a proactive means of evaluating feasibility.

After its independent review of the literature, the OUSD Research and Engineering (R&E) Emerging Capability and Prototyping (EC&P) office concluded that the seemingly endless reasons why S&T, R&D, and acquisition professionals conduct prototyping efforts boil down to generating information that supports a specific decision [25].

History and Law

When trying to understand the scope and complexity of a ubiquitous activity such as prototyping, it is important to map out its origin and history.

In 2009, the Weapon Systems Acquisition Reform Act (WSARA) [33] included a requirement that the Acquisition Strategy of each Major Defense Acquisition Program (MDAP) provide for competitive prototyping prior to Milestone B approval. If a waiver of the competitive prototyping requirement was approved, the program was still required to produce a prototype prior to Milestone B approval, provided the life-cycle benefit exceeded the cost; if full-system prototyping was not feasible, prototyping at the system or critical subsystem level could be required. With WSARA, near-final system prototypes were preferred, but the door was opened to build and evaluate prototypes at the subsystem level.

The prototyping requirements in WSARA initiated discussion on the role of prototyping in defense acquisition. Prototypes were then considered to be a valid means of solving a long-standing problem with the defense acquisition system (e.g., too slow, too cumbersome, too much risk, and too little information) and were acknowledged to have roles across the spectrum of concept generation, technology development, system integration, and test [34]. Further, prototypes were acknowledged to provide multiple opportunities, such as reducing risk early, fostering innovation, inspiring a new generation of designers and engineers, recruiting and retaining technical leaders, and increasing public interest [35].

The competitive prototyping requirements for MDAPs mandated by WSARA and codified as amended at 10 U.S.C. § 2430 were repealed by the NDAA for FY16 [36] and replaced with language referring to prototyping as one of several alternative risk management and mitigation approaches. The revised language is codified as amended at 10 U.S.C. § 2431b, where a comprehensive approach to technical, cost, and schedule risk management and mitigation in an acquisition strategy starts with the technique of “prototyping (including prototyping at the system, subsystem, or component level and competitive prototyping, where appropriate).”

The same public law (NDAA for FY16 [36]) that repealed portions of WSARA also produced Section 804—Middle Tier of Acquisition for Rapid Prototyping and Rapid Fielding—codified as amended at 10 U.S.C. § 2302, as well as Section 815—authority of the DoD to carry out certain prototype projects—codified as amended at 10 U.S.C. § 2371b. The acquisition-focused nature of these authorities requires that system prototypes be demonstrated in an operational environment within 5 years and potentially provide residual operational capability. However, 10 U.S.C. § 2371b gives the Director of the Defense Advanced Research Projects Agency (DARPA), the secretary of a military department, or any other official designated by the Secretary of Defense the authority to carry out prototype projects “that are directly relevant to enhancing the mission effectiveness of military personnel and the supporting platforms, systems, components, or materials proposed to be acquired or developed by the Department of Defense, or to improvement of platforms, systems, components, or materials in use by the armed forces.”

Although no specific guidance is available on how the research, innovation, disruption, and entrepreneur communities of the DoD should plan, fund, or implement prototyping efforts, 10 U.S.C. § 2371b enables application of agency funding to such efforts prior to the initiation of an acquisition program. (For the most relevant current directions and authorities pertaining to prototype projects, see the DoD’s Prototyping Guidebook [25].)

The Role of Prototyping

The act of prototyping has historically been greatly beneficial in terms of risk reduction and concept demonstration prior to system development. Specifically, prototyping has advanced new technologies, enhanced industry workforce skills between major acquisitions, and dissuaded adversaries by showcasing new capabilities [3, 18, 35, 37].

A common perception of the engineering development process is that ideas flow linearly from the laboratory into prototyping, then through engineering and development, and eventually into production and sustainment. This linear model is incompatible with the current global environment, where the desire to emerge from competition as the victor drives both industry and government to a risk-tolerant, “fail early, fail often” approach in every phase of development [38–41].

Not all implementations of prototyping need to be 7-year, $700M [42] or 4-year, $938M [43] efforts performed by teams of defense contractors. There are organizations that employ prototyping as a tool and a strategy while operating outside of, and even independent of, defense acquisition, such as academia, internal R&D departments across private industry, FFRDCs, university-affiliated research centers, DARPA, and corporate laboratories within the Services. These organizations are free to create prototypes and to prototype their ideas before making long-term investment decisions; indeed, this strategy is desirable. Scientists and engineers routinely prototype their ideas to test hypotheses, reduce uncertainty, and explore feasibility [44, 45]. Whether hardware or software, these prototypes assume various names and forms, such as mock-ups, wire frames, breadboards, brassboards, proofs of concept, subscale models, and digital representations.

The DoD Prototyping Guidebook emphasizes prototyping as an enabler across all communities involved in system development, including exploratory S&T, R&D, and acquisition, regardless of whether the prototyping activity is occurring inside of, in support of, or completely independent of a program of record. The DAU course on prototyping and experimentation summarizes benefits and applications that span the development spectrum from exploration, to engineering, to acquisition, and even includes emerging capability shortfalls given the speed at which policies are adapting to the current national security climate. Prototyping is happening across the system’s life cycle and helps all communities understand the problem, come up with alternative solutions, assess the alternatives, learn (through success and failure), and make informed decisions.

Enabling rapid, but disciplined, progression from idea to prototype allows early and continual testing of ideas to screen for promising concepts and iterate to a validated solution before committing to the pursuit of operational viability and full, operational capability.

The purposes of prototyping are well documented. Prototyping:

  • Generates information that supports specific decisions [46].
  • Helps justify subsequent investments made in technology and technology maturation [25].
  • Creates a preliminary version of something to resolve risk and explore operational potential [16].
  • Ensures that new, innovative, and disruptive technologies are available to include in potential future systems and demonstrates the value of new technologies or systems [18].
  • Increases user buy-in and participation, develops a better understanding of the product and its requirements, and reduces risk [47].
  • Gains practical, operational knowledge and experience shared across defense industrial base, appeals to public interest in new technology, and inspires future innovators [35, 48].
  • Helps bridge the gap between research and applications by enabling individuals and organizations with different technical backgrounds to exchange ideas in a common, intuitive, and understandable form [49].

Lauff’s in-depth study of prototyping reveals that prototypes are static objects until they are given meaning through the socially constructed contexts and environments in which they are used; they are a form of design language that enables communication, aids in learning, and informs decision making [50].

Horning et al. [51] promote prototyping as an operational strategy to reduce development time and continually deliver mission-custom solutions by blending mission engineering, digital engineering, early synthetic prototyping, and advanced manufacturing.

Mulenburg and Gundo [52] promote prototypes as a means of quickly assessing design feasibility through trying out ideas. The authors present merits of a design-by-prototype process, applied to small, high-risk projects, using three case studies. In each case, designers and decision makers interacted with low-fidelity, tangible hardware mock-ups that informed final product decisions and led to a successful outcome [52].

Prototyping in research may not have anything to do with a user—neither a product nor a market may yet exist—and will instead focus on testing the feasibility of an idea and reducing uncertainty through a learning process [53]. An example is the transistor prototype created by Bell Telephone Laboratories researchers in late 1947 [54]. After an underpinning theory was verified through experimentation, researchers built a rough prototype device and demonstrated its functionality in the laboratory. A group of top engineers then spent the next 6 months considering applications of the technology. The transistor, a replacement for vacuum tube technology, is considered one of the most important technologies of the twentieth century.

Breaking the Paradigm

Despite the documented flexibility in time and purpose of prototyping, the prevailing mental model in defense systems acquisition appears to be tied to a near-final, full-up system. One reason for this may be the use of the term in the Technology Readiness Level (TRL) scale [55] definitions and in the DoD Financial Management Regulation (FMR) [56] that defines an RDT&E budget activity (BA).

The TRL scale, conceived by the National Aeronautics and Space Administration (NASA) and tailored by each organization that implements the concept, is a measurement system that supports assessments of readiness, or maturity, of a technology on a scale of 1 (least mature) to 9 (most mature). Practical examples of each level of the TRL scale are provided by Grudo [57].

The RDT&E budget activities span from Activity 1 (basic research; systematic study tailored toward greater knowledge or understanding of a scientific principle) to Activity 7 (operational system development and upgrades).

The first appearance of the term “system prototype” in the TRL scale is at TRL 6, while the first appearance of the terms “prototype system” and “high-fidelity operating environment” in the FMR is at BA 6.4. On the surface, this presents a problem for the S&T community when they want to perform prototyping to reduce risk, gain knowledge, and prune decision paths early because S&T funding ends with BA 6.3. Though the TRL scale is the most widely used measure of technology maturity, organizations such as DoD, NASA, and the Department of Energy routinely tailor the definitions to suit their application and serve as a common language [58].

Table 1 summarizes the current DoD paradigm relating technology maturity, funding, technology description [55], environment, and prototype category. This paradigm should be reconsidered, in part because it assumes linear, sequential development. The linear model of innovation, in which development progresses from basic research, to applied research, to development, and to production, originated in the early 20th century and evolved over the next several decades as natural scientists, R&D business industry scholars, and economists built on the original taxonomy [59].

The prototype category column of Table 1 derives from the course Prototyping and Experimentation to Improve Acquisition [17], which subsequently formed the basis for an update to DAU CLE082 [16] and resulted in the following descriptions of the three types of prototypes, clarifying their roles in defense research, engineering, acquisition, and sustainment. Of note is the introduction of the conceptual prototype category, which aligns with the conventional budget and technology maturity categories found in S&T.

Conceptual. Demonstrate the art of the possible, provide evidence of overcoming specific technical risks and barriers, and evaluate S&T with a DoD corporate focus. Conceptual prototypes can also be used to support the analysis of a proof of concept or demonstrate feasibility prior to Milestone A. Conceptual prototype models are often breadboards and may be ready to demonstrate or prove in a laboratory environment.

Developmental. Validate the technical feasibility and explore the operational value of a capability that has already been proven in laboratory and relevant environments during the Technology Maturation & Risk Reduction phase. A developmental prototype will define the form, fit, function, and “ilities” of the system or technology.

Operational. Develop the technology or system so that Warfighters can use it in the field after it has been demonstrated in a realistic operational environment during the Production & Deployment or Operation & Support phases. An operational prototype can rapidly provide a needed capability to the field.

GAO recommends in numerous reports that technologies be demonstrated in a realistic (i.e., operational) environment prior to starting development. However, DoD permits an MDAP to proceed with development once the milestone decision authority certifies that the technology has been demonstrated in a relevant environment. This difference in demonstrated maturity level, commonly referred to as a technology “valley of death,” results in residual risk that a program of record is neither inclined nor budgeted to further reduce. (“Valley of death” is the colloquial phrase for when a technology is unable to cross ownership boundaries, e.g., from S&T into development, development into production, or production into sustainment. Some reasons include higher-than-acceptable residual risk, inadequate funding, and misaligned technological capability.)

From Table 2 [17], it is clear that prototyping has value across live, virtual, and constructive experimentation venues, even when technology maturity is low; at that stage, the primary benefit is feedback on user needs.

The DoD Emerging Capability & Prototyping Office promotes a strategy that spans nearly the entire defense acquisition system and TRL scale, as shown in Figure 1 [60]. There are similarities between Proof of Principle, Pre-Engineering & Manufacturing Development (EMD), and Fieldable prototypes in Figure 1 and Conceptual, Developmental, and Operational prototypes in Table 2, but interchangeability should not be assumed. Instead, the takeaway should be that prototyping has a valid role across the acquisition spectrum—from pre-conceptual feasibility testing through operational qualification and assessment—and helps to build up an understanding of system-level impacts.

Figure 1: Prototyping’s Role in DoD Acquisition (Source: Deputy Assistant Secretary of Defense EC&P).

Moving Forward With a Strategy

With the concept of breaking the linear innovation and technology maturation paradigm now broached, a strategy is needed for how to implement and execute prototyping across the system’s development life cycle. Studies conclude that prototyping should be implemented as both a strategy and a tool to complement experimentation, aid in innovation, and help cross the S&T valley of death [1, 3, 4]. Choosing a strategy to implement prototyping practices in early S&T is not trivial, but the literature provides alternatives for classifying strategies based on purpose, motivation, and expected learning.

Lauff [50] compiles a comprehensive review of literature on prototype frameworks, taxonomies, and strategies and performs three case studies to understand the state of the practice and document findings. A research outcome is a prototyping canvas that guides users to build the simplest prototype possible to quickly gather feedback through a problem statement, assumptions and questions, stakeholder interactions, a testing plan, resource identification, and an approach to a “minimum viable prototype.” With such a tool, the community can navigate ambiguity and reduce wasted resources [61, 62].

In a seminal work on prototyping, Drezner [63] stresses the importance of understanding timing, level in the system integration spectrum, and goals. The portion of Drezner’s proposed taxonomy that best suits prototyping during R&D starts with the purpose of technology viability, which focuses on generating information to reduce technological risk. Technology viability can be assessed outside the normal weapon system acquisition program structure and done without a specified military mission. The two objectives associated with this purpose in the taxonomy are experimentation to demonstrate a new idea, a new technology, or an existing technology in a new application and exploration to evaluate the possible performance envelope.

Carr and Verner [53] present a framework based on their experience in software development and promote prototypes as instruments used strategically throughout the development process. Three approaches are described—exploratory, experimental, and evolutionary. Exploratory prototyping is used to engage with customers early to discover and clarify requirements, elicit value and preferences, and understand how the technology will be used. At this level, a “quick-and-dirty” or “throw-it-away” prototype can serve as a partial implementation of the system and may not look anything like the anticipated final system; the objective is to learn.

Experimental prototyping uses a breadboard approach to assess feasibility of new ideas and features. At this level, the prototype varies from partial implementation to a mock-up of the anticipated final solution. Evolutionary prototyping is used in environments of high uncertainty, e.g., when requirements are not all known ahead of time and when the user may neither know how the technology will be used nor the environment in which it is intended to operate. The evolutionary approach enables adaptation to continuous learning of user needs, external factors, and requirements.

Beynon-Davies et al. [47] develop a taxonomy of information system prototype practices and further promote prototypes as a tool and a technique. Three forms of prototype are described—throwaway, incremental, and evolutionary. Throwaway prototypes are used to gain a specific piece of knowledge and then discarded (exploratory) or continuously test the feasibility of some feature (experimental) offline. An incremental prototype is refined gradually and becomes either a part of the final delivered system or the delivered system itself (e.g., block 1 has limited functionality, block 2 has increased functionality, etc.). An evolutionary prototype is part of a system that is to be delivered in increments (e.g., new features, modules, or plug-ins added over time until full functional capability has been achieved). Incremental and evolutionary prototypes are intended for eventual operational use, whereas throwaway prototypes are intended for offline learning.

Menold et al. [64] compile an extensive literature review related to prototyping frameworks and strategies. Their objective was to help bridge the gap between research and practice by providing designers with a structured set of methods for prototyping activities. The resulting framework is based on three phases—frame, build, and test. In this way, implementers can focus on testing assumptions for gaining knowledge.

Lichter et al. [65] identify a difference between prototyping in practice and prototyping in theory and use case studies to validate a framework that describes prototypes using kinds (e.g., presentation, prototype proper, breadboard, and pilot system), goals (e.g., exploratory, experimental, and evolutionary), construction techniques (e.g., horizontal and vertical), and relationship to application system (e.g., building block, throwaway, and problem clarification). The framework helps describe prototypes and promote them as a basis for discussion via experimentation.

Implementing prototyping as a tool and a strategy shifts the focus from near-final design validation to a purposeful and methodical examination of a manageable number of unknowns and hypotheses across the capability development span. In this way, prototyping becomes a vehicle for early and continuous learning that can guide investments in technologies to close threat-based gaps in a rapidly innovating global environment.

Conclusions and Recommendations

In the current global environment, the aim is to deliver the right capabilities to the Warfighter, quickly and efficiently [66]. There is increased emphasis within the DoD on using prototyping and experimentation to explore new capabilities and reduce technical, cost, and schedule risk prior to entering systems acquisition [16, 18, 67]. Further, prototyping is seen as an enabler for innovation [4, 41], which is important for the DoD as it seeks innovation across operational concepts, organizational structure, business processes, and technology [66].

Prototyping offers an innovative approach to solving technical challenges and a scientific approach to answering research questions, potentially through invention. However, the focus of prototyping in acquisition reform has been on integrating existing technologies to form a new or enhanced system-level capability intended to be fielded from the outset. The entire defense development community can benefit by treating prototyping as a license to explore and test hypotheses, reduce risk, and learn on an object that may not resemble the anticipated final system in form, fit, or function. While the acquisition and sustainment community considers prototyping part of acquisition agility [68], perhaps R&E can think of prototyping as “knowledge agility.”

The conditions are present for prototyping and experimentation to provide early and enduring benefit, from knowledge generation all the way through product sustainment. For example, in the pursuit of informing future concepts vis-à-vis a novel propulsion system, an organization might test basic hypotheses of bearings, lubricants, and transmissions on live, virtual, or constructive mock-ups as opposed to building an entire rotorcraft vehicle. In this case, prototyping at the subsystem and component levels will help reveal where innovation can address a gap and where a scientific breakthrough might be needed to address a lagging subsystem capability, such as power density or mass.

The DoD should consider breaking the paradigm of prototypes aligned with budget activity and assessments of technology maturity. In this way, technology maturity, prototype maturity, and integrated system maturity can proceed independently and potentially complement, rather than compete with, each other. As part of a formal technology readiness assessment of critical technologies, the TRL scale is intended to provide a consistent maturity evaluation standard, not to assess the readiness of a technology based on prototype test results at increasingly higher levels of the system hierarchy. Perhaps the TRL scale could be modified to include a reference to prototypes at every level, 1 through 9.

The anticipated time and cost benefits of prototyping will only be realized through a targeted and intentional strategy based on learning objectives. Various frameworks, taxonomies, and strategies are documented in the literature and readily available for tailoring.

Research shows the importance of prototyping across the research and technology development spectrum and reveals that various communities use prototypes as leverage to generate knowledge independent of technology maturity levels and budget activity. A recommendation is made to relax definitions and policy constraints so that all communities can benefit. A common goal is generating knowledge and unlocking decision paths to get a product into the hands of the Warfighter. Prototyping offers a way to intelligently and methodically evaluate feasibility, resolve risks, refine requirements, gain stakeholder buy-in, and assess military utility. Efficiency and effectiveness in acquisition are the goal, and prototyping ideas early is an enabling solution.


Note From the Editor

DSIAC regularly provides information research capabilities, technical expertise, and a network of subject matter experts to U.S. Army Research Laboratory (ARL) engineers and scientists to enhance their technical efforts. Last year, ARL asked DSIAC to identify documented examples in support of their research effort to use early prototyping for science concepts and ideas in hardware and software. DSIAC was subsequently requested to conduct a technical peer review of a position paper authored by ARL on the span of prototyping in defense research, engineering, acquisition, and sustainment. That discussion paper evolved into the article presented in this journal.

 


References

  1. U.S. Army Science Board. “Improving Transition of Laboratory Programs Into Warfighting Capabilities Through Experimentation.”  Defense Technical Information Center, Fort Belvoir, VA, https://apps.dtic.mil/dtic/tr/fulltext/u2/1063618.pdf, 2017.
  2. Hencke, R. “Prototyping: Increasing the Pace of Innovation.”  Defense AT&L, July–August 2014.
  3. National Research Council. “Assessment to Enhance Air Force and Department of Defense Prototyping for the New Defense Strategy:  A Workshop Summary.”  The National Academies Press, Washington, DC, https://www.nap.edu/catalog/18580/assessment-to-enhance-air-force-and-department-of-defense-prototyping-for-the-new-defense-strategy, 2013.
  4. U.S. Army Science Board. “The Strategic Direction for Army Science and Technology.”  Defense Technical Information Center, Fort Belvoir, VA, https://apps.dtic.mil/dtic/tr/fulltext/u2/a571038.pdf, 2013.
  5. National Academies of Sciences, Engineering, and Medicine. “Creating Capability for Future Air Force Innovation:  Proceedings of a Workshop in Brief.”  The National Academies Press, Washington, DC, https://www.nap.edu/catalog/25220/creating-capability-for-future-air-force-innovation-proceedings-of-a, 2018.
  6. Danzig, R. “Driving in the Dark:  Ten Propositions About Prediction and National Security.”  Center for a New American Security, Washington, DC, 2011.
  7. Leonhard, R. R. The Principles of War for the Information Age.  Ballantine Books, New York, p. 215, 1998.
  8. Doorley, S., S. Holcomb, P. Klebahn, K. Segovia, and J. Utley. “Design Thinking Bootleg.”  Hasso Plattner Institute of Design at Stanford University, Stanford, CA, https://dschool.stanford.edu/s/9wuqfxx68fy8xu67khdiliueusae4i, 2018.
  9. Ries, E. “The Lean Startup.”  Crown Business, New York, NY, 2011.
  10. Holland, T. “How the Army Ought to Write Requirements.”  Military Review, vol. 97, no. 6, pp. 100–105, https://www.armyupress.army.mil/Portals/7/military-review/archives/ENGLISH/November-December-2017-English-book.pdf, November–December 2017.
  11. Merchant, B. “The Secret Origin Story of the iPhone:  An Exclusive Excerpt From the One Device.”  The Verge, https://www.theverge.com/2017/6/13/15782200/one-device-secret-history-iphone-brian-merchant-book-excerpt, 13 June 2017.
  12. GAO.  “Leadership Attention Needed to Effectively Implement Changes to Acquisition Oversight; Report No. GAO-19-439.”  Government Accountability Office, p. 2, Washington, DC, https://www.gao.gov/assets/700/699527.pdf, 2019.
  13. U.S. DoD. “John S. McCain NDAA for FY19.”  Public Law 115-232, https://www.congress.gov/115/plaws/publ232/PLAW-115publ232.pdf, 13 August 2018.
  14. Office of Naval Research. “Innovative Naval Prototype.”  Office of Naval Research Science & Technology, https://www.onr.navy.mil/en/Science-Technology/Departments/Code-35/All-Programs/air-warfare-and-naval-applications-352/innovative-naval-prototype, 29 August 2019.
  15. Merriam-Webster. “Definition of Prototype.”  https://www.merriam-webster.com/dictionary/prototype, 15 July 2019.
  16. DAU. “CLE 082 Prototyping and Experimentation.”  http://icatalog.dau.mil/onlinecatalog/courses.aspx?crs_id=12166, accessed 8 January 2018.
  17. Sciarretta, A., S. Ramberg, J. Lawrence III, A. Gravatt, P. Robinson, and M. Armbruster. “Innovative Prototyping and Rigorous Experimentation (iP&rE):  A One-Week Course to Build Culture and a Cadre.”  Center for Technology and National Security Policy, National Defense University, Institute for National Strategic Studies, Washington, DC, 2017.
  18. GAO. “Prototyping Has Benefited Acquisition Programs, But More Can Be Done to Support Innovation Initiatives.” Report no. GAO-17-309.  Government Accountability Office, Washington, DC, https://www.gao.gov/assets/690/685478.pdf, 2017.
  19. GAO. “Adopting Best Practices Can Improve Innovation Investments.” Report no. GAO-17-499.  Government Accountability Office, Washington, DC, https://www.gao.gov/assets/690/685524.pdf, 2017.
  20. DAU. “DAU Acquipedia Definition of Prototype.”  https://www.dau.mil/acquipedia/pages/articledetails.aspx#!375, accessed 15 July 2019.
  21. DAU. “DAU Glossary Definition of Brassboard.”  https://www.dau.mil/glossary/Pages/Glossary.aspx#!both|B|26935, accessed 15 July 2019.
  22. DAU. “DAU Glossary Definition of Breadboard.”  https://www.dau.mil/glossary/Pages/Glossary.aspx#!both|B|26936, accessed 15 July 2019.
  23. DAU. “DAU Glossary Definition of Prototype.”  https://www.dau.mil/glossary/Pages/Glossary.aspx#!both|P|28295, accessed 15 July 2019.
  24. DAU. “DAU Glossary Definition of System Element.”  https://www.dau.mil/glossary/Pages/Glossary.aspx#!both|S|28602, accessed 8 August 2019.
  25. OUSD[R&E] EC&P. Prototyping Guidebook.  “Emerging Capability and Prototyping, Office of the Under Secretary of Defense for Research and Engineering.”  DoD, Washington, DC, pp. 3, 34–36, 2018.
  26. OUSD[A&S]. Other Transactions Guide.  “Office of the Under Secretary of Defense for Acquisition and Sustainment.”  DoD, Washington, DC, p. 31, 2018.
  27. MITRE. MITRE Systems Engineering Guide:  Competitive Prototyping.  MITRE, https://www.mitre.org/publications/systems-engineering-guide/acquisition-systems-engineering/contractor-evaluation/competitive-prototyping, accessed 29 August 2016.
  28. Headquarters, U.S. Department of the Army. Army Regulation 73-1 – Test and Evaluation; Test and Evaluation Policy.  Washington, DC, p. 91, 2018.
  29. U.S. Air Force. “Air Force Instruction 99-103 – Capabilities-Based Test and Evaluation.”  Arlington, VA, https://static.e-publishing.af.mil/production/1/af_te/publication/afi99-103/afi99-103.pdf, pp. 13–14, 106, 2017.
  30. U.S. Department of the Navy. Operational Test Director’s Manual; Commander Operational Test and Evaluation Force Instruction (COMOPTEVFORINST) 3980.2I.  Norfolk, VA, https://www.public.navy.mil/cotf/OTD/otd%20Manual.pdf, 2019.
  31. DAU. Department of Defense Test & Evaluation Management Guide.  Sixth edition, Defense Acquisition University Press, Fort Belvoir, VA, https://www.dau.mil/tools/Lists/DAUTools/Attachments/148/Test%20and%20Evaluation%20Management%20Guide,%20December%202012,%206th%20Edition%20-v1.pdf, 2012.
  32. U.S. Marine Corps. Integrated Test and Evaluation Handbook.  Marine Corps Systems Command, Deputy Commander for Systems Engineering, Interoperability, Architecture and Technology, Quantico, VA, https://www.hqmc.marines.mil/Portals/61/Docs/MCOTEA/Signed%20USMC%20Integrated%20TE%20Handbook%20Version%201-2.pdf?ver=2012-09-28-111543-427, p. 24, 2010.
  33. WSARA. Public Law 111-23, https://www.congress.gov/111/plaws/publ23/PLAW-111publ23.pdf, 22 May 2009.
  34. Permanent Subcommittee on Investigations. “Defense Acquisition Reform:  Where Do We Go From Here?  A Compendium of Views by Leading Experts.”  U.S. Senate, Committee on Homeland Security and Governmental Affairs, Washington, DC, p. 109, 2014.
  35. Williams, E., and A. R. Shaffer. “The Defense Innovation Initiative:  The Importance of Capability Prototyping.”  Joint Force Quarterly, vol. 77 (2nd Quarter), pp. 34–43, 2015.
  36. NDAA for FY16. Public Law 114-92, https://www.congress.gov/114/plaws/publ92/PLAW-114publ92.pdf, 25 November 2015.
  37. Medlej, M., S. M. Stuban, and J. R. Dever. “Assessing the Likelihood of Achieving Prototyping Benefits in Systems Acquisition.”  Defense Acquisition Review Journal, vol. 24, no. 4, pp. 626–655, https://www.dau.edu/library/arj/ARJ/ARJ83/ARJ83%20Article%202%20-%2071-774%20Medlej.pdf, October 2017.
  38. DVIDS. “At Altitude – Dr. Will Roper.” Audio podcast, https://airman.dodlive.mil/2019/05/07/podcast-dr-will-roper/, 7 May 2019.
  39. Vergun, D. “New Army Futures Command Success Hinges on Relationship Building, Says McCarthy.”  U.S. Army, https://www.army.mil/article/200323/new_army_futures_command_success_hinges_on_relationship_building_says_mccarthy, 8 February 2018.
  40. Basulto, D. “The New #Fail:  Fail Fast, Fail Early and Fail Often.”  The Washington Post, https://www.washingtonpost.com/blogs/innovations/post/the-new-fail-fail-fast-fail-early-and-fail-often/2012/05/30/gJQAKA891U_blog.html?noredirect=on, 30 May 2012.
  41. Dougherty, G. M. “Promoting Disruptive Military Innovation:  Best Practices for DoD Experimentation and Prototyping Programs.”  Defense Acquisition Research Journal, vol. 25, no. 1, pp. 2–29, https://www.dau.edu/library/arj/ARJ/arj84/ARJ84%20Article%201%20-%2017-782%20Dougherty.pdf, January 2017.
  42. Judson, J. “Next-Gen Combat Vehicle Prototyping Kicks Off.”  Defense News, https://www.defensenews.com/digital-show-dailies/ausa/2017/10/10/next-gen-combat-vehicle-prototyping-kicks-off/, 10 October 2017.
  43. General Services Administration. “Future Attack Reconnaissance Aircraft Competitive Prototype (FARA CP).”  FedBizOpps, https://www.fbo.gov/notices/eb262d7ba8990420012ec3bd47de65cd, 13 June 2019.
  44. Defense Systems Information Analysis Center. Response to Technical Inquiry 67426 (“Prototyping Early Science Concepts”).  Belcamp, MD, 25 June 2019.
  45. Deininger, M., S. R. Daly, K. H. Sienko, and J. C. Lee. “Novice Designers’ Use of Prototypes in Engineering Design.”  Design Studies, vol. 51(C), pp. 25–65, https://europepmc.org/article/med/29398740, July 2017.
  46. Drezner, J. A., and M. Huang. “On Prototyping:  Lessons From RAND Research.”  RAND Corporation, National Defense Research Institute, Santa Monica, CA, http://www.rand.org/content/dam/rand/pubs/occasional_papers/2010/RAND_OP267.pdf, 2009.
  47. Beynon-Davies, P., D. Tudhope, and H. Mackay. “Information Systems Prototyping In Practice.”  Journal of Information Technology, vol. 14, no. 1, pp. 107–120, https://link.springer.com/article/10.1080/026839699344782, 1 March 1999.
  48. Young, J. J. “Prototyping and Competition.”  DoD, Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, Washington, DC, 2007.
  49. Dai, F., W. Felger, T. Fruhauf, M. Gobel, D. Reiners, and G. Zachmann. “Virtual Prototyping Examples for Automotive Industries.”  https://www.researchgate.net/publication/2362802_Virtual_Prototyping_Examples_for_Automotive_Industries, 1996.
  50. Lauff, C. A. “Prototyping in the Wild:  The Role of Prototypes in Companies.”  Doctoral Dissertation, University of Colorado at Boulder, https://scholar.colorado.edu/mcen_gradetds/175, 2018.
  51. Horning, M. A., R. E. Smith, and S. Shidfar. “Mission Engineering and Prototype Warfare:  Operationalizing Technology Faster to Stay Ahead of the Threat.”  NDIA Ground Vehicle Systems Engineering and Technology Symposium (GVSETS), https://apps.dtic.mil/dtic/tr/fulltext/u2/1057589.pdf, 2018.
  52. Mulenburg, G. M., and D. P. Gundo. “Design by Prototype:  Examples From the National Aeronautics and Space Administration.”  The 11th International Conference on the Management of Technology, Moffett Field:  NASA Ames Research Center, https://ntrs.nasa.gov/search.jsp?R=20040081092, 2002.
  53. Carr, M., and J. Verner. “Prototyping and Software Development Approaches.”  Department of Information Systems, Hong Kong, China:  City University of Hong Kong, 1997.
  54. Gertner, J. The Idea Factory:  Bell Labs and the Great Age of American Innovation.  New York, NY: The Penguin Press, pp. 92–106, 2012.
  55. NASA. “NPR 7123.1B – NASA Systems Engineering Processes and Requirements (Updated with Change 4).”  Washington, DC, https://nodis3.gsfc.nasa.gov/npg_img/N_PR_7123_001B_/N_PR_7123_001B_.pdf, 2013.
  56. OUSD[C]. “DoD 7000.14-R Financial Management Regulation, Volume 2B, Chapter 5:  Research, Development, Test, and Evaluation Appropriations.”  DoD, Washington, DC, https://comptroller.defense.gov/Portals/45/documents/fmr/current/02b/02b_05.pdf, accessed 14 May 2019.
  57. Grudo, G. “Technology Readiness Levels, Explained.”  Air Force Magazine, pp. 22–23, August 2016.
  58. GAO. “Exposure Draft of Technology Readiness Assessment Guide:  Best Practices for Evaluating the Readiness of Technology for Use in Acquisition Programs and Projects.” Report no. GAO-16-410G, GAO, Washington, DC, https://www.gao.gov/products/GAO-16-410G, pp. 16, 131–136, 2016.
  59. Godin, B. “The Linear Model of Innovation:  The Historical Construction of an Analytical Framework.”  Science, Technology, and Human Values, vol. 31, no. 6, pp. 639–667, https://journals.sagepub.com/doi/10.1177/0162243906291865, 1 November 2006.
  60. ODASD[EC&P]. “Prototyping’s Role in Acquisition.”  USD(A&S), https://www.acq.osd.mil/ecp/OUR_STRATEGY/PrototypingCharacteristics.html, accessed 11 July 2019.
  61. Lauff, C., J. Menold, and K. Wood. “Prototyping Canvas:  Design Tool for Planning Purposeful Prototypes.”  Proceedings of the Design Society:  International Conference on Engineering Design, vol. 1, no. 1, pp. 1563–1572, https://www.cambridge.org/core/journals/proceedings-of-the-international-conference-on-engineering-design/article/prototyping-canvas-design-tool-for-planning-purposeful-prototypes/A535E9D0DFC923B7F41C6288FA0AF3E1, 2019.
  62. SUTD-MIT International Design Centre (IDC). “Prototyping Canvas.”  Design Innovation Course – Prototyping Canvas, https://www.dimodules.com/prototypingcanvas, 2019.
  63. Drezner, J. A. “The Nature and Role of Prototyping in Weapon System Development.”  RAND Corporation, Santa Monica, CA, https://www.rand.org/pubs/reports/R4161.html, 1992.
  64. Menold, J., K. Jablokow, and T. Simpson. “Prototype for X (PFX):  A Holistic Framework for Structuring Prototyping Methods to Support Engineering Design.”  Design Studies, vol. 50, pp. 70–112, https://pennstate.pure.elsevier.com/en/publications/prototype-for-x-pfx-a-holistic-framework-for-structuring-prototyp, 2017.
  65. Lichter, H., M. Schneider-Hufschmidt, and H. Zullighoven. “Prototyping in Industrial Software Projects—Bridging the Gap Between Theory and Practice.”  IEEE Transactions on Software Engineering, vol. 20, no. 11, pp. 825–832, November 1994.
  66. Mattis, J. “Summary of the 2018 National Defense Strategy of The United States of America.”  DoD, Washington, DC, 2018.
  67. U.S. Army Combat Capabilities Development Command (CCDC). “Army Comms R&D: From the Ground to Space and CyberBlitz w/Mr. Michael Monteleone, SES.”  Audio podcast, Aberdeen Proving Ground, MD, https://ccdc.podbean.com/e/army-comms-rd-from-the-ground-to-space-and-cyberblitz/, 2019.
  68. NDAA for FY17. Public Law 114-328, Subtitle B (“Department of Defense Acquisition Agility”), 23 December 2016.

Biographies

ERIC SPERO is a systems engineering and technical lead in the Vehicle Technology Directorate of the CCDC Army Research Laboratory (ARL), where he provides live and virtual air and ground autonomous robotic systems in support of research experimentation. He has over 20 years of industry and government experience in aerospace and defense, including researching, designing, engineering, testing, and project managing complex aerospace systems. He is an Associate Fellow of the American Institute of Aeronautics and Astronautics. Mr. Spero holds a B.S. in chemical engineering from Colorado State University and an M.S. in aerospace engineering, with a concentration in system design and optimization, from the Georgia Institute of Technology.

ZEKE TOPOLOSKY is the Army xTechSearch Program Manager in the Office of Strategy Management of CCDC ARL, where he executes the Army Expeditionary Technology Search competition and supports other technology transition initiatives. Mr. Topolosky holds a B.S. and an M.S. in mechanical engineering from the University of Maryland, College Park.

KARL A. KAPPRA is the Director of the Futures Division in the CCDC ARL, where he serves as principal advisor to the ARL Director on strategic S&T investments and business practices to enable the efficient, timely, and systematic integration of scientific research outcomes and technology forecasts with Army Warfighting concepts and the future operating environment. Mr. Kappra holds a B.S. in mechanical engineering from Villanova University and an M.S. in mechanical engineering from the Johns Hopkins University.



Author(s): Eric Spero, Zeke Topolosky, and Karl A. Kappra