Digital Engineering (DE) is a surprisingly elusive concept. On its surface, it seems straightforward: develop a digital “twin” of a weapons system, or any other kind of military hardware, so that 3D renderings and models are exact representations. And, in addition, develop a digital “thread” of the system so that its history can be understood—such that modifications are visible in the evolution of the digital twin over time.
Yet the true purpose of DE is much larger than a simple electronic record. The goal is not only to understand what exists (that’s important enough), but also to help understand what could be (that’s perhaps even more important).
Think of it this way: Throughout the useful life of a system, design is something that happens repeatedly. The original design ends up being analyzed and modified many times, as new capabilities are added, old deficiencies resolved, or changes made to extend the useful life of the system. From that standpoint, the baseline design is the enabler of everything that is to come. For this reason, digital twins have increasing importance over time.
Nor is DE simply about making the design, engineering, and sustainment process as efficient as possible. It certainly does that, but that is not its greatest virtue. Rather, DE’s true power is as an enabler of creativity and innovation. DE helps engineers think about systems holistically and place those systems in an operational context to evaluate performance long before any “metal is cut.”
Costs mount in an inverse relationship with design flexibility as a program matures. The further a program is to the right on the development spectrum, the more restrictive the trade-space becomes and the more costly it is to fix something that goes wrong. This means that errors and major design tradeoffs must be discovered as early as possible in a program’s life cycle.
If there is a hard part about DE, it’s understanding what is required to make it happen. The US Department of Defense (DOD), for instance, has been working for months with hundreds of people taken from across the enterprise to develop a comprehensive and integrated DE solution. Progress has been made, but not as much as the DOD would like. Part of the problem is that DE is a bit like knowledge management: There is no agreed-upon definition and stakeholders’ needs are quite variable. Yet, confusion aside, there are pragmatic solutions that already exist within the DOD that can be shaped to help overcome the DE challenge.
In 2016, Guidehouse began working with the High Performance Computing Modernization Program (HPCMP), the Office of the Secretary of Defense-directed entity that manages, among other things, the DOD’s supercomputing assets. Guidehouse has been a close partner of HPCMP and has been involved in many facets of its operations and strategic planning. Guidehouse has undertaken numerous HPCMP analyses, including those related to utilization, workload, insider threat, cloud computing, economic modeling, stakeholder outreach, business model development, and research institute development, among others.
Our involvement with HPCMP has made Guidehouse an expert in the high performance computing (HPC) space, and especially areas related to business model development, cloud access strategies, and DE-enablement through the application of HPCMP’s ecosystem of capabilities.
Recently, Guidehouse played an important advisory role to HPCMP on the development of two congressionally mandated reports—the first dealing with the introduction of commercial cloud to the HPCMP computational ecosystem, and the second offering a perspective on how HPCMP can support digital engineering and virtual prototyping and testing. Guidehouse was the lead author of both.
The HPCMP computational ecosystem is technically complex. In addition to the five DOD Supercomputing Resource Centers (DSRCs) and their powerful supercomputers, HPCMP also maintains the DOD Research and Engineering Network (DREN) (at both the unclassified and secret levels), an assortment of world-class, physics-based design and simulation software tools, the cybersecurity framework to protect it all, and the subject matter expertise to help users solve their computational and DE problems.
Scientists and engineers use the ecosystem to better understand how weapons systems interact with the physical and electromagnetic world. However, Guidehouse has long seen the potential of the ecosystem in an expanded mission set (as has Congress). Beginning as early as 2016, Guidehouse suggested applying HPCMP’s capabilities to acquisition engineering workloads, in addition to those associated with science and technology (S&T). Guidehouse had analyzed use-cases and the demand-signal and knew some workloads could be shifted from on-premises resources to commercial cloud, so that critical workloads could be given priority within the DSRCs. We also saw the need to use DE to bring new rigor to the inherently manual processes within acquisition environments, thereby accelerating decision-making.
There have always been examples of clever people using the HPCMP systems in this way. One recent example was the Future Attack Reconnaissance Aircraft (FARA) program, in which four virtual rotorcraft designs were whittled down to just two simply by examining the concepts through the lens of their theoretical performance. The reality is that aircraft must interact with their physical environment, and the laws of physics are immutable. A given design cannot move through the air faster than the laws of physics dictate. This may seem like a silly point, but it is important that the Pentagon be a “smart” buyer of the systems the private sector develops in response to DOD requirements. Part of the benefit is being able to avoid risk and make decisions faster and with more and better information. Government engineers must be able to validate vendor designs before incurring the massive costs associated with building prototypes. Modeling and simulation of this type helps prevent a situation where program managers discover, too late, that the platform does not perform as expected.

The FARA example is an indication of the power of HPC as an enabler of DE and the tremendous opportunity that HPCMP represents. If the DOD can bring greater intelligence to its acquisition programs earlier in the process, it can make more informed decisions, with lower risk, and potentially save billions of dollars.
For those with a more detailed understanding of DE, it will be clear that physics is not the only challenge. In fact, much of the time DE revolves around far more mundane processes, such as access to a secure, collaborative environment where data can be moved and shared, and where users can interact with that data, and one another, while still protecting intellectual property. The environment must also house the necessary software tools to allow engineers to manage workflows rapidly and effectively, and to do the what-if analysis that is so important to developing their designs. Of course, the DOD is vast, and engineers are scattered across hundreds of locations within the US, so a workable DE architecture must also connect all the relevant users in a networked environment. Computational resources are also needed, as is secure access.
HPCMP has all these assets in its existing ecosystem. In fact, in a recent telling example of the effectiveness of pulling these capabilities together, HPCMP worked with the US Air Force and the Information Technology Laboratory (ITL) at the US Army Corps of Engineers, Engineer Research and Development Center (ERDC) to build a DE collaboration architecture tailored to the needs of the B-52 Commercial Engine Replacement Program.
The B-52 needs new engines, but it is not as easy as simply removing the old ones and installing new ones. Engine replacement has many design and engineering implications for things such as wing loading, necessary changes to the engine nacelles, weight and balance considerations, and potentially different ground handling and flight characteristics. The HPCMP-enabled architecture is helping the USAF understand these implications. The B-52 DE architecture, known as the Integrated Digital Environment (IDE), is an excellent example of the power of the HPCMP/ITL solution. A tailored, pragmatic approach to DE-enablement is helping to solve a vexing problem—and this same approach can be replicated across hundreds of programs of record.
Guidehouse has worked to develop a business model for how HPCMP/ITL can take its DE capabilities to market. Beyond the simple articulation of a vision or resource-sharing approach, Guidehouse has developed a detailed business plan for a new entity called the Digital Engineering Services Enterprise (DESE). Guidehouse has defined the services, transactional model, funds flow, and other specifics that would allow HPCMP/ITL to offer its DE capabilities to the broader DOD customer community. As just one example, we have developed detailed views of the customer base, as shown in the graphic below, which will allow DESE to conduct more effective outreach.
A key part of DESE is “productizing” a standard DE architecture, like that used for the B-52, to offer a consistent and robust DE solution to the user community. Guidehouse will be working with HPCMP and ITL to package the capability, as well as to conduct the outreach to DOD Program Executive Officers and Program Managers necessary to build awareness.
In addition, Guidehouse worked closely with HPCMP to roll out a Request for Information (RFI) to the commercial cloud sector to poll industry on how it can add its capabilities to those HPCMP already maintains. The RFI generated 24 responses from major players like Microsoft, Amazon, and Google, as well as from the highly specialized integrators that are critical in this domain. A commercial cloud presence will allow enhanced access, scalability, and flexibility of the HPCMP ecosystem for S&T and DE users. Our workload analysis has helped determine how much cloud capacity should be brought online and over what timeline. We have also collaborated with HPCMP on funding proposals to the General Services Administration to solicit new funds in support of Digital Transformation. Perhaps most importantly, Guidehouse has been laying the groundwork for a pragmatic DE service offering, value proposition, business model, and the implementation planning needed to make it all happen.
Guidehouse has been HPCMP’s business analyst, advisor, and champion throughout this process. Our job is to understand the user community need (demand-signal), technical environment, economic imperatives, and political realities at the core of a major new initiative like DE, so that HPCMP/ITL can shape its solution to be as impactful as possible. In doing so, we will help HPCMP operationalize its mission, drive new capabilities into the DOD, and ultimately solve significant acquisition dilemmas.
Far from an academic exercise, more rigorous upfront design analysis not only solves and prevents problems in terms of system deficiencies, but also helps get superior capabilities into the hands of our warfighters at a time when the US can no longer boast uncontested dominance on the battlefield.
HPCMP has received numerous accolades based on Guidehouse work. The most recent was related to our DE congressional report that has been very well-received and has stimulated broad discussion among the stakeholder community.
Working together, Guidehouse and HPCMP/ITL have developed a practical approach to the resolution of the ongoing DE challenge.