Better Program Managing Through Digital Engineering


Paul Solomon


In 2018, the Department of Defense (DoD) Digital Engineering Strategy (DE Strat) was published to guide the planning, development, and implementation of the DE transformation across the DoD. In 2019, the DoD’s transformational Adaptive Acquisition Framework (AAF) was published. This article provides guidance to unite DE Strat with AAF for better program management of software-intensive major capability acquisitions and other acquisition pathways.

The PM’s Information Needs

Achieving DE Strat’s Goal 2, Provide an Authoritative Source of Truth (ASOT), will enable better program management. An ASOT will provide the program manager (PM) with timely and accurate schedule status and situational awareness of program execution for proactive resolution of issues impacting cost, schedule, and technical achievement of program objectives. It will also provide the PM with situational awareness of the degree of product quality as measured by functional completeness.

Goal 2 elements include:

2.2 Managing policies, procedures, and standards will ensure proper governance of the ASOT and enhance data quality across the life cycle. Executing governance of the ASOT should increase stakeholder confidence in the integrity of the ASOT.

2.3 Using the ASOT as the technical baseline for informed and timely decisions on managing cost, schedule, performance, and risks. For example, contract deliverables should be traced and validated from the ASOT. This will allow stakeholders at various levels to respond knowledgeably to the development … of the system, thereby avoiding technical and management barriers to mission success. …

Using the ASOT to produce digital artifacts, support reviews, and inform decisions.

Why Better?

The most prevalent source of schedule status and situational awareness of program execution, for most software-intensive major capability acquisitions, is the contractor’s earned value management system (EVMS), which must comply with the guidelines of the EVMS standard, EIA (Electronic Industries Alliance)-748. However, compliance with guidelines does not ensure that the contractor-provided data are accurate or reliable. The guidelines are silent on the technical baseline or “product scope,” progress against requirements, requirements traceability, risk management, and risk mitigation. Even the use of technical performance measures (TPM) is optional.

Three reports to Congress have similar assessments of the veracity or integrity of EVMS reports. First, per a DoD report in 2009, the “utility of EVM has declined to a level where it does not serve its intended purpose.” Contractors “keep EVM metrics favorable and problems hidden. If good technical performance measures are not used, programs could report 100 percent of EV even though behind schedule in validating requirements, completing the preliminary design, meeting the weight targets, or delivering software.”

The report added that “the PM should ensure that the EVM process measures the quality and technical maturity of technical work products instead of just the quantity of work performed.” The report stated that EVM can be an effective PM tool only if the EVM processes are augmented with a rigorous systems engineering (SE) process and SE products are costed and included in EVM tracking.

In 2018, the Section 809 Report of the Advisory Panel on Streamlining and Codifying Acquisition Regulations, Volume 2, stated that “another substantial shortcoming of EVM is that it does not measure product quality. A program could perform ahead of schedule and under cost according to EVM metrics but deliver a capability that is unusable by the customer. … Traditional measurement using EVM provides less value to a program than an Agile process in which the end user continuously verifies that the product meets the requirement.” It concluded that “EVM has been required on most large software programs but has not prevented cost, schedule, or performance issues.” It also stated that “The current system focuses on process, not product. This focus takes PMs’ attention away from the fundamentals of cost, schedule, and performance, and is one of the major contributors to negative acquisition outcomes.”

If the DE Strat is implemented as intended, reported schedule performance would be product-oriented, based on technical performance, and based on the completed digital artifacts in the ASOT. This is in sharp contrast with the EIA-748 process discussion that “EV is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes.”

It is recommended that PMs and the Defense Contract Management Agency obtain the status of completed digital artifacts in the ASOT, compare it with planned completions at any point in time to derive schedule performance, and then investigate significant differences between that measure and the schedule performance reported by contractors.
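A minimal sketch of that comparison is below, assuming the ASOT can be queried for planned and actual artifact completion dates. The record fields, the 10 percent variance threshold, and the sample artifacts are illustrative, not prescribed by the DE Strat or EIA-748.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Artifact:
    """One digital artifact as it might be recorded in the ASOT (fields are illustrative)."""
    name: str
    planned_completion: date                    # from the time-phased plan
    actual_completion: Optional[date] = None    # None until the artifact is completed

def artifact_spi(artifacts: list[Artifact], status_date: date) -> float:
    """Schedule performance derived from artifact counts: completed-to-date / planned-to-date."""
    planned = sum(1 for a in artifacts if a.planned_completion <= status_date)
    completed = sum(1 for a in artifacts
                    if a.actual_completion and a.actual_completion <= status_date)
    return completed / planned if planned else 1.0

def flag_variance(asot_spi: float, reported_spi: float, threshold: float = 0.10) -> bool:
    """True when reported performance differs from the ASOT-derived value by more than the threshold."""
    return abs(asot_spi - reported_spi) > threshold

# Illustrative data: three artifacts planned by the status date, two completed.
artifacts = [
    Artifact("Allocated baseline requirement definitions", date(2022, 3, 1), date(2022, 3, 5)),
    Artifact("Trade study report", date(2022, 4, 1), date(2022, 4, 1)),
    Artifact("Test cases for Build 1", date(2022, 5, 1)),
]
asot_spi = artifact_spi(artifacts, status_date=date(2022, 5, 31))  # 2/3, about 0.67
if flag_variance(asot_spi, reported_spi=0.98):
    print(f"Investigate: ASOT-derived schedule performance {asot_spi:.2f} vs. reported 0.98")
```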

DE Strategy Supports AAF Policies

A successful DE Strategy would support the AAF policies included in Table 1.

Table 1. Successful DE Strategy Supports AAF Policies
SUBJECT SECTION EXCERPT
Technical Performance DoD Directive (DoDD) 5000.01 1.2.g.(2) Program goals for cost, schedule, and performance parameters (or alternative quantitative management controls) will describe the program over its life cycle. Approved program baseline parameters will serve as control objectives.
Technical Performance DoDD 5000.01 1.2.k Employ Performance-Based Acquisition Strategies
“Performance-based strategy” means a strategy that supports an acquisition approach structured around the results to be achieved as opposed to the manner by which the work is to be performed.
Technical Performance DoDD 5000.01 1.2.o Conduct Integrated Test and Evaluation (T&E)
(1) T&E will be integrated throughout the defense acquisition process. Test and evaluation will be structured to provide essential information to decision makers, assess attainment of technical performance parameters, and determine whether systems are operationally effective, suitable, survivable, and safe for its intended use.
(2) The conduct of T&E, integrated with modeling and simulation, will:
... (b) Assess technology maturity and interoperability.
... (d) Confirm performance against documented capability needs and adversary capabilities.
Technical Performance DoD Instruction (DoDI) 5000.02 4.1.b.(6) Establish a risk management program to ensure program cost, schedule, and performance objectives are achieved, and to communicate the process for managing program uncertainty.
Technical Baseline DoDI 5000.02 4.1.b.(7) When consistent with pathway requirements, develop engineering plans and processes applicable to the pathways to mature technology, conduct necessary systems engineering trade-offs, and produce and manage appropriate technical baselines through the use of systems engineering technical reviews.
Technical Performance DoDI 5000.85 3.c.3 Management activities will be designed to achieve the cost, schedule, and performance parameters specified in the MDA (Milestone Decision Authority)-approved acquisition program baseline and will include product support considerations.
Technical Baseline DoDI 5000.85 3.11.b.(1) A critical design review assesses design maturity, design build-to or code-to documentation, and remaining risks, and establishes the initial technical baseline.
Technical Baseline, Technical Performance, Requirements Traceability DoDI 5000.88 3.4 Program Technical Planning and Management, a. Systems Engineering Plan (SEP)
(3) … the SEP will contain these elements, unless waived by the SEP approval authority:
… (b) The engineering management approach to include technical baseline management; requirements traceability; configuration management; risk, issue and opportunity management; and technical trades and evaluation criteria.
… (c) The software development approach to include architecture design considerations; software unique risks; software obsolescence; inclusion of software in technical reviews; identification, tracking, and reporting of metrics for software technical performance, process, progress, and quality; software system safety and security considerations; and software development resources.
… (g) Specific technical performance measures and metrics, and SE leading indicators to provide insight into the system technical maturation relative to a baseline plan. Include the maturation strategy, assumptions, reporting methodology, and maturation plans for each metric with traceability of each performance metric to system requirements and mission capability characteristics.
… (k) The timing, conduct, and entry and exit criteria for technical reviews.
… (l) A description of technical baselines (e.g., concept, functional, allocated, and product), baseline content, and the technical baseline management process.
Technical Baseline, Technical Performance DoDI 5000.88 3.4.c Configuration and Change Management (3) Provide for traceability of mission capability to system requirements to performance and execution metrics.
Authoritative Sources of Models, Data, and Test Artifacts DoDI 5000.89 3.1.i As part of the DE Strategy … tools … must provide authoritative sources of models, data, and test artifacts (e.g., test cases, plans, deficiencies, and results).
Technical Performance DoD Software Modernization Strategy 3 Unifying Principles
Resilient software must be defined first by … quality. These attributes can be achieved at speed by aggressively adopting modern software development practices that effectively integrate performance throughout the software development life cycle.

Digital Artifacts

DoD published the Systems Engineering Guidebook in February 2022. The guidebook “provides guidance and recommended best practices for defense acquisition programs.”

Typical artifacts that should be the base measures of schedule performance are outputs from the measurement and verification processes in that guidebook. These outputs are ASOTs for PMs. 

When DE is employed, the digital versions of these artifacts should be automatically transferred from the engineering to the program management organizations. Typical DE artifacts per that guidebook are illustrated in Table 2.

Another source of ASOTs is Capability Maturity Model Integration (CMMI). Typical work products from CMMI processes are shown in Table 3. The digital versions of these artifacts should also be automatically transferred from the engineering to the program management organizations.

DE Metrics and Artifacts

It is recommended that DoD develop and publish metrics specifications for DE that support the information needs of PMs. The metrics specifications should be used as digital ASOTs for three PM responsibilities. First, the PM should develop the time-phased schedule to complete the requirements definitions; that plan should reside in an automatically linked scheduling system. Second, the PM should assess the schedule progress of defining and completing requirements; that progress should also reside in an automatically linked scheduling system.

Third, the PM should use digital artifacts from the ASOT as base measures of DE metrics. These digital artifacts are the authoritative evidence that SE work products have been completed, such as requirement definitions (including approved TPMs, verification methods, and completion criteria) in the functional and allocated baselines; trade studies; completed products in the product baseline (including the Minimum Viable Product and Minimum Viable Capability Release baselines, if applicable); and test artifacts (e.g., test cases, plans, deficiencies, and results).
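To make the idea concrete, the sketch below shows how one such metrics specification might be expressed against ASOT records: a minimal requirement entry carrying its TPM approval, verification method, and completion criteria, and a candidate metric that reports how much of a baseline is fully defined. The field names and the metric are assumptions for illustration only; DoD has not published these specifications.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RequirementRecord:
    """One requirement as it might appear in the ASOT (fields are illustrative)."""
    req_id: str
    baseline: str                          # "functional", "allocated", or "product"
    tpm_approved: bool                     # approved technical performance measure
    verification_method: Optional[str]     # e.g., "test", "analysis", "demonstration", "inspection"
    completion_criteria: Optional[str]

def definition_completeness(records: list[RequirementRecord], baseline: str) -> float:
    """Candidate DE metric: share of a baseline's requirements that are fully defined."""
    in_baseline = [r for r in records if r.baseline == baseline]
    if not in_baseline:
        return 0.0
    defined = [r for r in in_baseline
               if r.tpm_approved and r.verification_method and r.completion_criteria]
    return len(defined) / len(in_baseline)

records = [
    RequirementRecord("SYS-001", "allocated", True, "test", "All linked test cases pass"),
    RequirementRecord("SYS-002", "allocated", True, None, None),   # definition still incomplete
]
print(f"Allocated baseline definition completeness: {definition_completeness(records, 'allocated'):.0%}")
```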

Cost Effective

Per DE Strat, “the exchange of information between … organizations should take place via automated … transformations.” If the exchange of schedule performance information between engineering and program management is automated, then costs will be reduced by eliminating or reducing manual entry. Also, the automation supports DoDD 5000.01’s policy of adopting innovative practices, including best commercial practices and electronic business solutions that reduce cycle time and cost while encouraging teamwork.
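One way the automated exchange could work is sketched below: the engineering ASOT exports artifact status to a flat file that the PM's scheduling tool imports, removing the manual-entry step. The file layout, field names, and records are assumptions for illustration; they are not specified by the DE Strat.

```python
import csv
from datetime import date

# Artifact status as it might be exported from the engineering ASOT (illustrative records).
asot_status = [
    {"artifact": "Allocated baseline requirement definitions",
     "planned": date(2022, 3, 1), "actual": date(2022, 3, 5)},
    {"artifact": "Test cases for Build 1",
     "planned": date(2022, 5, 1), "actual": None},
]

# Write a flat file the PM's scheduling tool could ingest instead of relying on manual entry.
with open("artifact_status.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["artifact", "planned", "actual"])
    writer.writeheader()
    for row in asot_status:
        writer.writerow({
            "artifact": row["artifact"],
            "planned": row["planned"].isoformat(),
            "actual": row["actual"].isoformat() if row["actual"] else "",
        })
```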

Table 2. Typical DE Artifacts
6.3.7.4 MEASUREMENT PROCESS OUTPUTS
… c) Measurement data with the following attributes:
1) Provides data on established TPMs [technical performance measures] for use in project assessment and control to support the assessment of the system technical performance, and for an assessment of risk in achieving the measures of effectiveness or measures of performance and associated operational requirements.
NOTE—TPMs are a subset of measures that evaluate technical progress (i.e., product maturity) and support evidence-based decisions at key decision points such as technical reviews or milestone decisions.
2) Provides technical project measurement data for use in project assessment and control to support the assessment of technical progress toward fulfilling system requirements.
6.4.9.4 VERIFICATION PROCESS OUTPUTS
a) Planned system verification with the following attributes:
1) Quantitatively verifies that each system product … meets all of its requirements and design constraints in accordance with the verification method for each requirement or constraint in the allocated baseline.
b) Verification results that:
1) Verify required performance of all critical characteristics by demonstration or test.
2) Verify risks identified in the Risk Management process are mitigated to levels acceptable for continued development of the system as planned.
… d) Acceptance verification data that:
1) Verifies that each delivered hardware product, each constituent product of a delivered hardware product, and each system product that is used to manufacture, verify, integrate, or deploy end products that are to be delivered meets each of its requirements … in the maintained, allocated, or product baselines in accordance with the applicable verification method or verification requirements.

Successful Application (Before Digital Artifacts)

More than 20 years ago, I supported the B-2 bomber upgrade programs, Joint Standoff Weapon/Generic Weapon Interface System (JSOW/GWIS) and Link-16, as an EVM surveillance monitor. I also was a member of a process improvement team formed to increase our CMMI maturity level. We selected engineering artifacts that became base measures of EV. The resultant schedule performance measurement processes and new base measures of EV replaced processes that had failed to provide accurate information to the PM.

The schedule performance information resided in requirements traceability matrices. Instead of the percentage of source lines of code (SLOC) or drawings completed, EV was based on requirements status, such as requirements that had been defined and allocated to software components, allocated to test cases, and successfully tested. For test status, the measure evaluated whether the required functionality had been demonstrated against the specified requirements. We accounted for deferred functionality when a software build or engineering design was released despite falling short of its baseline requirements. The percentage of SLOC or drawings completed had no relationship to requirements completed, and the percentage of work completed is not a true indicator of the status of validating requirements, completing the preliminary design, meeting the weight targets, or delivering software. We also accounted for rework, both in developing the performance measurement baseline and in determining schedule performance and the estimate at completion.
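A simplified sketch of that crediting scheme appears below. The milestone weights, requirement states, and deferral handling are illustrative only; they are not the actual values used on the B-2 programs.

```python
# Illustrative cumulative EV credit as a requirement advances through its milestones.
WEIGHTS = {
    "defined": 0.25,
    "allocated_to_components": 0.50,
    "allocated_to_test_cases": 0.75,
    "successfully_tested": 1.00,
}

def earned_value(requirements: list[dict], budget_per_requirement: float = 1.0) -> float:
    """EV credited from requirements status; deferred requirements earn nothing in this build."""
    ev = 0.0
    for req in requirements:
        if req.get("deferred"):
            # Functionality deferred out of the build: its EV credit is withheld.
            continue
        ev += WEIGHTS.get(req["status"], 0.0) * budget_per_requirement
    return ev

requirements = [
    {"id": "R1", "status": "successfully_tested"},
    {"id": "R2", "status": "allocated_to_test_cases"},
    {"id": "R3", "status": "defined", "deferred": True},   # slipped to a later build
]
print(f"EV = {earned_value(requirements):.2f} of 3.00 budgeted")   # 1.75, not 2.00
```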

These practices improved our management effectiveness and increased customer satisfaction. My article in Aerospace Acquisition 2000 cited our success as follows:

“The B-2 Spirit Stealth Bomber Program implemented several innovative process improvements using EVM. These include integrating EV with systems engineering processes and defining improved software engineering metrics to support EVM.

“These changes paid off during upgrades of the B-2 weapon system. One of those upgrades was the development of the JSOW/GWIS [Joint Standoff Weapon/Generic Weapon Interface System], a software-intensive effort. The new metrics helped to make it a very successful program. The PBEV [Performance-Based Earned Value] methodology was used to ensure that the warfighter received the most functionality from software development efforts. On JSOW, we provided 85 percent more than originally planned.”

The IBM Engineering Requirements Management tool, DOORS, was the ASOT for planning and tracking the status of requirements on the Link-16 upgrade program and was used for contractual reporting. I published an article that described the practices, “Practical, Performance-Based Earned Value,” in the May 2006 issue of CrossTalk, The Journal of Defense Software Engineering. These practices were presented at numerous SE and software engineering conferences. The last presentation was at the Naval Postgraduate School in 2020. Attendees confirmed that the practices were never utilized by defense contractors. The following is an excerpt from the CrossTalk article:

“Example 3 demonstrates a method for measuring progress of the SE effort to perform requirements management, traceability, and verification. Typical activities include: Define the requirement, validate the requirement, determine the verification method, allocate the requirement, document the verification procedure, and verify that the requirement has been met. The RTM [Requirements Traceability Matrix] should be used to record the status of each requirement as it progresses through this cycle. A time-phased schedule for the planned completion of these activities is the basis for the Performance Measurement Baseline. A measure of the status of the system or subsystem requirements in the RTM should be a base measure of EV” (or schedule performance).
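The sketch below illustrates the method in the excerpt: each requirement's activities carry planned completion dates in the RTM, the time-phased sum of planned activity budgets forms the performance measurement baseline (BCWS), and completed activities earn value (BCWP). The activity names follow the excerpt; the even budget split and the dates are assumptions for illustration.

```python
from datetime import date

# The requirements management activities named in the excerpt, each carrying an equal
# share of the requirement's budget in this sketch (the even split is an assumption).
ACTIVITIES = ["define", "validate", "verification_method",
              "allocate", "verification_procedure", "verify"]

# RTM rows: activity -> (planned completion, actual completion or None). Dates are illustrative.
rtm = {
    "SYS-001": {
        "define": (date(2022, 1, 15), date(2022, 1, 20)),
        "validate": (date(2022, 2, 15), date(2022, 2, 10)),
        "verification_method": (date(2022, 3, 15), None),
        "allocate": (date(2022, 4, 15), None),
        "verification_procedure": (date(2022, 5, 15), None),
        "verify": (date(2022, 6, 15), None),
    },
}

def bcws_bcwp(rtm: dict, status_date: date, budget_per_requirement: float = 6.0):
    """Planned value (BCWS) and earned value (BCWP) from RTM activity status at a status date."""
    share = budget_per_requirement / len(ACTIVITIES)
    bcws = bcwp = 0.0
    for activities in rtm.values():
        for planned, actual in activities.values():
            if planned <= status_date:
                bcws += share          # time-phased performance measurement baseline
            if actual is not None and actual <= status_date:
                bcwp += share          # EV from completed RTM activities
    return bcws, bcwp

bcws, bcwp = bcws_bcwp(rtm, status_date=date(2022, 3, 31))
print(f"BCWS={bcws:.1f}, BCWP={bcwp:.1f}, SPI={bcwp / bcws:.2f}")   # 3.0, 2.0, 0.67
```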

Table 3. Typical Systems Engineering Work Products/Artifacts in Capability Maturity Model Integration (CMMI)
CMMI PROCESS AREA TYPICAL WORK PRODUCTS/ARTIFACTS
Requirements Development
  • Customer requirements
  • Derived requirements
  • Product requirements
  • Product-component requirements
  • Interface requirements
  • Functional architectures
  • Activity diagrams and use cases
  • Technical performance measures
  • Results of requirements validation
Technical Solution
  • Product component operational concepts, scenarios, and environments
  • Documented relationships between requirements and product components
  • Product architectures
  • Product-component designs
  • Allocated requirements
  • Key product characteristics
  • Required physical characteristics and constraints
  • Interface requirements
  • Material requirements
  • Verification criteria used to ensure that requirements have been achieved
  • Conditions of use (environments) and operating/usage scenarios, modes, and states for operations, support, training, and verifications throughout the life cycle
  • Interface design specifications
  • Interface control documents
  • Implemented design
Validation
  • Validation results
Verification
  • Exit and entry criteria for work products
  • Verification results
Measurement and Analysis
  • Specifications of base and derived measures
Decision Analysis and Resolution
  • Results of evaluating alternate solutions

Source: CMMI Institute, a subsidiary of ISACA

CMMI and NAVAIR References

To publicize the processes and measures used on the B-2 program, I authored or co-authored two publications that are relevant today. I was a visiting scientist at the Carnegie Mellon University/Software Engineering Institute and published Technical Note CMU/SEI-2002-TN-016, October 2002, “Using CMMI to Improve Earned Value Management.” I was a contributor to the Naval Air Systems Command (NAVAIR) handbook, Using Software Metrics and Measurements for Earned Value Toolkit, October 2004. Both documents are useful tools in implementing the DE Strat.

Conclusion

If the DE Strat is successfully implemented, and if the status of the digital artifacts in the ASOT is used to inform the PM of schedule performance and the degree of product quality, the PM will be able to take corrective actions more quickly. If the schedule performance data are automatically transferred to the PM’s scheduling system instead of being manually entered, program costs will be reduced and the accuracy of those data will increase.



SOLOMON received the David Packard Excellence in Acquisition Award in 1998 for his effort on the Integrated Program Management Initiative Joint Team that developed the first EVMS standard. He also received a letter of appreciation from Sen. John McCain for his “continued efforts in working to improve our acquisitions process.”

The author can be contacted at [email protected].


The views expressed in this article are those of the author alone and not the Department of Defense. Reproduction or reposting of articles from Defense Acquisition magazine should credit the author and the magazine.

