Better Program Managing Through Digital Engineering (June 29, 2022)<div class="ExternalClass8E2ABE21B59747308924D2BE6F0529DD">In 2018, the Department of Defense (DoD) Digital Engineering Strategy (DE Strat) was published to guide the planning, development, and implementation of the DE transformation across the DoD. In 2019, the DoD’s transformational Adaptive Acquisition Framework (AAF) was published. This article provides guidance to unite DE Strat with AAF for better program management of software-intensive major capability acquisitions and other acquisition pathways. <h2>The PM’s Information Needs</h2> Achieving DE Strat’s Goal 2, Provide an Authoritative Source of Truth (ASOT), will enable better program management. An ASOT will provide the program manager (PM) with timely and accurate schedule status and situational awareness of program execution for proactive resolution of issues impacting cost, schedule, and technical achievement of program objectives. It will also provide the PM with situational awareness of the degree of product quality as measured by functional completeness.<br> <br> <strong>Goal 2 elements include: </strong><br> 2.2 Managing policies, procedures, and standards will ensure proper governance of the ASOT and enhance data quality across the life cycle. Executing governance of the ASOT should increase stakeholder confidence in the integrity of the ASOT.<br> <br> 2.3 Using the ASOT as the technical baseline for informed and timely decisions on managing cost, schedule, performance, and risks. For example, contract deliverables should be traced and validated from the ASOT. This will allow stakeholders at various levels to respond knowledgeably to the development … of the system, thereby avoiding technical and management barriers to mission success. …<br> <br> Using the ASOT to produce digital artifacts, support reviews, and inform decisions. 
<h2>Why Better?</h2> The most prevalent source of schedule status and situational awareness of program execution, for most software-intensive major capability acquisitions, is the contractor’s earned value management system (EVMS), which must comply with the guidelines of the EVMS standard, EIA (Electronic Industries Alliance)-748. However, compliance with guidelines does not ensure that the contractor-provided data are accurate or reliable. The guidelines are silent on the technical baseline or “product scope,” progress against requirements, requirements traceability, risk management, and risk mitigation. Even the use of technical performance measures (TPM) is optional.<br> <br> Three reports to Congress have similar assessments of the veracity or integrity of EVMS reports. First, per a DoD report in 2009, the “utility of EVM has declined to a level where it does not serve its intended purpose.” Contractors “keep EVM metrics favorable and problems hidden. If good technical performance measures are not used, programs could report 100 percent of EV even though behind schedule in validating requirements, completing the preliminary design, meeting the weight targets, or delivering software.”<br> The report added that “the PM should ensure that the EVM process measures the quality and technical maturity of technical work products instead of just the quantity of work performed.” The report stated that EVM can be an effective PM tool only if the EVM processes are augmented with a rigorous systems engineering (SE) process and SE products are costed and included in EVM tracking.<br> <br> In 2018, the Section 809 Report of the Advisory Panel on Streamlining and Codifying Acquisition Regulations, Volume 2, stated that “another substantial shortcoming of EVM is that it does not measure product quality. A program could perform ahead of schedule and under cost according to EVM metrics but deliver a capability that is unusable by the customer. 
… Traditional measurement using EVM provides less value to a program than an Agile process in which the end user continuously verifies that the product meets the requirement.” It concluded that “EVM has been required on most large software programs but has not prevented cost, schedule, or performance issues.” It also stated that “The current system focuses on process, not product. This focus takes PMs’ attention away from the fundamentals of cost, schedule, and performance, and is one of the major contributors to negative acquisition outcomes.”<br> <br> If the DE Strat is implemented as intended, reported schedule performance would be product-oriented, based on technical performance, and based on the completed digital artifacts in the ASOT. This is in sharp contrast with the EIA-748 process discussion that “EV is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes.”<br> <br> It is recommended that PMs and the Defense Contract Management Agency obtain information about completed digital artifacts in the ASOT and compare it with planned completions at any point in time to derive schedule performance. Then they should investigate significant differences with the schedule performance reported by contractors. <h2>DE Strategy Supports AAF Policies</h2> A successful DE Strategy would support the AAF policies included in Table 1. <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption>Table 1. Successful DE Strategy Supports AAF Policies</caption> <thead> <tr> <th scope="col">SUBJECT</th> <th scope="col">SECTION</th> <th scope="col">EXCERPT</th> </tr> </thead> <tbody> <tr> <td style="text-align:center;"><strong>Technical Performance</strong></td> <td>DoD Directive (DoDD) 5000.01 1.2.g.(2)</td> <td>Program goals for cost, schedule, and performance parameters (or alternative quantitative management controls) will describe the program over its life cycle. 
Approved program baseline parameters will serve as control objectives.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Performance</strong></td> <td>DoDD 5000.01 1.2.k</td> <td>Employ Performance-Based Acquisition Strategies<br> “Performance-based strategy” means a strategy that supports an acquisition approach structured around the results to be achieved as opposed to the manner by which the work is to be performed.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Performance</strong></td> <td>DoDD 5000.01<br> 1.2.o</td> <td>Conduct Integrated Test and Evaluation (T&E)<br> (1) T&E will be integrated throughout the defense acquisition process. Test and evaluation will be structured to provide essential information to decision makers, assess attainment of technical performance parameters, and determine whether systems are operationally effective, suitable, survivable, and safe for its intended use.<br> (2) The conduct of T&E, integrated with modeling and simulation, will:<br> ... (b) Assess technology maturity and interoperability.<br> ... 
(d) Confirm performance against documented capability needs and adversary capabilities.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Performance</strong></td> <td>DoDD 5000.02 4.1.b.(6)</td> <td>Establish a risk management program to ensure program cost, schedule, and performance objectives are achieved, and to communicate the process for managing program uncertainty.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Baseline</strong></td> <td>DoDD 5000.02 4.1.b.(7)</td> <td>When consistent with pathway requirements, develop engineering plans and processes applicable to the pathways to mature technology, conduct necessary systems engineering trade-offs, and produce and manage appropriate technical baselines through the use of systems engineering technical reviews.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Performance</strong></td> <td>DoDD 5000.85 3.c.3</td> <td>Management activities will be designed to achieve the cost, schedule, and performance parameters specified in the MDA (Milestone Decision Authority)-approved acquisition program baseline and will include product support considerations.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Baseline</strong></td> <td>DoDD 5000.85 3.11.b.(1)</td> <td>A critical design review assesses design maturity, design build-to or code-to documentation, and remaining risks, and establishes the initial technical baseline.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Baseline<br> Technical Performance<br> Requirements Traceability</strong></td> <td>DoD Instruction (DoDI) 5000.88<br> 3.4 Program Technical Planning and Management<br> a. 
Systems Engineering Plan (SEP)</td> <td>(3) … the SEP will contain these elements, unless waived by the SEP approval authority:<br> … (b) The engineering management approach to include technical baseline management; requirements traceability; configuration management; risk, issue and opportunity management; and technical trades and evaluation criteria.<br> … (c) The software development approach to include architecture design considerations; software unique risks; software obsolescence; inclusion of software in technical reviews; identification, tracking, and reporting of metrics for software technical performance, process, progress, and quality; software system safety and security considerations; and software development resources.<br> … (g) Specific technical performance measures and metrics, and SE leading indicators to provide insight into the system technical maturation relative to a baseline plan. Include the maturation strategy, assumptions, reporting methodology, and maturation plans for each metric with traceability of each performance metric to system requirements and mission capability characteristics.<br> … (k) The timing, conduct, and entry and exit criteria for technical reviews.<br> … (l) A description of technical baselines (e.g., concept, functional, allocated, and product), baseline content, and the technical baseline management process.</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Baseline<br> Technical Performance</strong></td> <td>DoDI 5000.88 3.4.c Configuration and Change Management</td> <td>(3) Provide for traceability of mission capability to system requirements to performance and execution metrics.</td> </tr> <tr> <td style="text-align:center;"><strong>Authoritative Sources of Models, Data, and Test Artifacts</strong></td> <td>DoDI 5000.89 3.1.i</td> <td>As part of the DE Strategy ... 
tools … must provide authoritative sources of models, data, and test artifacts (e.g., test cases, plans, deficiencies, and results).</td> </tr> <tr> <td style="text-align:center;"><strong>Technical Performance</strong></td> <td style="text-align:center;">DoD Software Modernization Strategy<br> 3 Unifying Principles</td> <td>Resilient software must be defined first by … quality. These attributes can be achieved at speed by aggressively adopting modern software development practices that effectively integrate performance throughout the software development life cycle.</td> </tr> </tbody> </table> <h2><img alt="see guidebook here" src="/library/defense-atl/DATLFiles/May-June2022/seeguidebookhere.jpg" style="margin-left:3px;margin-right:3px;float:left;width:25%;" />Digital Artifacts</h2> <div>DoD published the Systems Engineering Guidebook in February 2022. The guidebook “provides guidance and recommended best practices for defense acquisition programs.”<br> <br> Typical artifacts that should be the base measures of schedule performance are outputs from the measurement and verification processes in that guidebook. These outputs are ASOTs for PMs.<br> <br> When DE is employed, the digital versions of these artifacts should be automatically transferred from the engineering to the program management organizations. Typical DE artifacts per that guidebook are illustrated in Table 2.<br> <br> Another source of ASOTs is Capability Maturity Model Integration (CMMI). Typical work products from CMMI processes are shown in Table 3. The digital versions of these artifacts should also be automatically transferred from the engineering to the program management organizations.</div> <h2>DE Metrics and Artifacts</h2> <div>It is recommended that DoD develop and publish metrics specifications for DE that support the information needs of PMs. 
The metrics specifications should be used as digital ASOTs for three PM responsibilities. First, the PM should develop the time-phased schedule to complete the requirements definitions. The time-phased plan should reside in an automatically linked scheduling system. Second, the PM should assess the schedule progress of defining and completing requirements. Schedule progress should also reside in an automatically linked scheduling system.<br> <br> Third, the PM should use digital artifacts from the ASOT as base measures of DE metrics. These digital artifacts provide authoritative evidence that SE work products are completed, such as requirements definitions (including approved TPMs, verification methods, and completion criteria in the functional and allocated baselines), trade studies, completed products in the product baseline (including the Minimum Viable Product and Minimum Viable Capability Release baselines, if applicable), and test artifacts (e.g., test cases, plans, deficiencies, and results).</div> <h2>Cost Effective</h2> <div>Per DE Strat, “the exchange of information between … organizations should take place via automated … transformations.” If the exchange of schedule performance information between engineering and program management is automated, then costs will be reduced by eliminating or reducing manual entry. Also, the automation supports DoDD 5000.01’s policy of adopting innovative practices, including best commercial practices and electronic business solutions that reduce cycle time and cost while encouraging teamwork. <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption>Table 2. 
Typical DE Artifacts</caption> <tbody> <tr> <td><strong> MEASUREMENT PROCESS OUTPUTS</strong></td> </tr> <tr> <td>… c) Measurement data with the following attributes:<br> 1) Provides data on established TPMs [technical performance measures] for use in project assessment and control to support the assessment of the system technical performance, and for an assessment of risk in achieving the measures of effectiveness or measures of performance and associated operational requirements.<br> NOTE—TPMs are a subset of measures that evaluate technical progress (i.e., product maturity) and support evidence-based decisions at key decision points such as technical reviews or milestone decisions.<br> 2) Provides technical project measurement data for use in project assessment and control to support the assessment of technical progress toward fulfilling system requirements.</td> </tr> <tr> <td><strong> VERIFICATION PROCESS OUTPUTS</strong></td> </tr> <tr> <td>a) Planned system verification with the following attributes:<br> 1) Quantitatively verifies that each system product … meets all of its requirements and design constraints in accordance with the verification method for each requirement or constraint in the allocated baseline.<br> b) Verification results that:<br> 1) Verify required performance of all critical characteristics by demonstration or test.<br> 2) Verify risks identified in the Risk Management process are mitigated to levels acceptable for continued development of the system as planned.<br> … d) Acceptance verification data that:<br> 1) Verifies that each delivered hardware product, each constituent product of a delivered hardware product, and each system product that is used to manufacture, verify, integrate, or deploy end products that are to be delivered meets each of its requirements … in the maintained, allocated, or product baselines in accordance with the applicable verification method or verification requirements.</td> </tr> </tbody> </table> 
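The artifact-based comparison recommended earlier, checking completed digital artifacts in the ASOT against planned completions as of a status date, can be sketched in a few lines of code. This is an illustrative sketch only, not any DoD or contractor tool; the `Artifact` class, field names, and sample data are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Artifact:
    """A digital artifact in the ASOT (e.g., a verified requirement or passed test case)."""
    name: str
    planned: date              # baseline (planned) completion date
    completed: Optional[date]  # actual completion date, or None if still open

def schedule_performance(artifacts, status_date):
    """Compare completed artifacts against planned completions as of status_date.

    Returns (planned_count, completed_count, index), where index is the ratio
    of completed to planned artifacts: an artifact-based analogue of a
    schedule performance index.
    """
    planned = [a for a in artifacts if a.planned <= status_date]
    completed = [a for a in artifacts
                 if a.completed is not None and a.completed <= status_date]
    index = len(completed) / len(planned) if planned else 1.0
    return len(planned), len(completed), index

# Hypothetical sample data for illustration.
artifacts = [
    Artifact("REQ-001 verified", date(2022, 3, 1), date(2022, 3, 1)),
    Artifact("REQ-002 verified", date(2022, 4, 1), date(2022, 5, 15)),
    Artifact("TC-017 passed",    date(2022, 4, 15), None),
    Artifact("REQ-003 verified", date(2022, 6, 1), None),
]
p, c, spi = schedule_performance(artifacts, date(2022, 5, 1))
# As of May 1: 3 artifacts planned, 1 completed, index = 1/3
```

A significant gap between this artifact-derived index and the contractor-reported schedule performance would be the trigger, per the recommendation above, for further investigation.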
<h2>Successful Application (Before Digital Artifacts)</h2> More than 20 years ago, I supported the B-2 bomber upgrade programs, Joint Standoff Weapon/Generic Weapon Interface System (JSOW-GWIS) and Link-16, as an EVM surveillance monitor. I also was a member of a process improvement team formed to increase our CMMI maturity level. We selected engineering artifacts that became base measures of EV. The resultant schedule performance measurement processes and new base measures of EV replaced processes that had failed to provide accurate information to the PM.<br> <br> The schedule performance information resided in requirements traceability matrices. Instead of the percentage of source lines of code (SLOC) or drawings completed, EV was based on requirements status, such as requirements that had been defined and allocated to software components, allocated to test cases, and successfully tested. On test status, the measure was used to evaluate whether the required functionality had been demonstrated against the specified requirements. We accounted for deferred functionality when a software build or engineering design was released despite falling short of its baseline requirements. The percentage of completed SLOC or drawings had no relationship to requirements completed. The percentage of work completed is not a true indicator of the status of validating requirements, completion of the preliminary design, conformity to the weight targets, or delivery of software. We also accounted for rework, both in developing the performance measurement baseline and in determining schedule performance and the estimate at completion.<br> <br> These practices improved our management effectiveness and increased customer satisfaction. My article in Aerospace Acquisition 2000 cited our success as follows:<br> “The B-2 Spirit Stealth Bomber Program implemented several innovative process improvements using EVM. 
These include integrating EV with systems engineering processes and defining improved software engineering metrics to support EVM.<br> <br> “These changes paid off during upgrades of the B-2 weapon system. One of those upgrades was the development of the JSOW/GWIS [Joint Standoff Weapon/ Generic Weapon Interface System], a software-intensive effort. The new metrics helped to make it a very successful program. The PBEV [Performance-Based Earned Value] methodology was used to ensure that the warfighter received the most functionality from software development efforts. On JSOW, we provided 85 percent more than originally planned.”<br> IBM’s requirements management tool, DOORS, was the ASOT for planning and tracking the status of requirements on the Link-16 upgrade program and was used for contractual reporting. I published an article that described the practices, “Practical, Performance-Based Earned Value,” in the May 2006 issue of CrossTalk, The Journal of Defense Software Engineering. These practices were presented at numerous SE and software engineering conferences. The last presentation was at the Naval Postgraduate School in 2020. Attendees confirmed that the practices were never utilized by defense contractors. The following is an excerpt from the CrossTalk article:<br> <br> “Example 3 demonstrates a method for measuring progress of the SE effort to perform requirements management, traceability, and verification. Typical activities include: Define the requirement, validate the requirement, determine the verification method, allocate the requirement, document the verification procedure, and verify that the requirement has been met. The RTM [Requirements Traceability Matrix] should be used to record the status of each requirement as it progresses through this cycle. A time-phased schedule for the planned completion of these activities is the basis for the Performance Measurement Baseline. 
A measure of the status of the system or subsystem requirements in the RTM should be a base measure of EV” (or schedule performance). <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption>Table 3. Typical Systems Engineering Work Products/Artifacts in Capability Maturity Model Integration (CMMI)</caption> <thead> <tr> <th scope="col">CMMI PROCESS AREA</th> <th scope="col">TYPICAL WORK PRODUCTS/ARTIFACTS</th> </tr> </thead> <tbody> <tr> <td style="text-align:center;"><strong>Requirements Development</strong></td> <td> <ul> <li>Customer requirements</li> <li>Derived requirements</li> <li>Product requirements</li> <li>Product-component requirements</li> <li>Interface requirements</li> <li>Functional architectures</li> <li>Activity diagrams and use cases</li> <li>Technical performance measures</li> <li>Results of requirements validation</li> </ul> </td> </tr> <tr> <td style="text-align:center;"><strong>Technical Solution</strong></td> <td> <ul> <li>Product component operational concepts, scenarios, and environments</li> <li>Documented relationships between requirements and product components</li> <li>Product architectures</li> <li>Product-component designs</li> <li>Allocated requirements</li> <li>Key product characteristics</li> <li>Required physical characteristics and constraints</li> <li>Interface requirements</li> <li>Material requirements</li> <li>Verification criteria used to ensure that requirements have been achieved</li> <li>Conditions of use (environments) and operating/usage scenarios, modes, and states for operations, support, training, and verifications throughout the life cycle</li> <li>Interface design specifications</li> <li>Interface control documents</li> <li>Implemented design</li> </ul> </td> </tr> <tr> <td style="text-align:center;"><strong>Validation</strong></td> <td> <ul> <li>Validation results</li> </ul> </td> </tr> <tr> <td style="text-align:center;"><strong>Verification</strong></td> <td> <ul> <li>Exit and entry 
criteria for work products</li> <li>Verification results</li> </ul> </td> </tr> <tr> <td style="text-align:center;"><strong>Measurement and Analysis</strong></td> <td> <ul> <li>Specifications of base and derived measures</li> </ul> </td> </tr> <tr> <td style="text-align:center;"><strong>Decision Analysis and Resolution</strong></td> <td> <ul> <li>Results of evaluating alternate solutions</li> </ul> </td> </tr> </tbody> </table> <h5><em>Source</em>: CMMI Institute, a subsidiary of ISACA<br> </h5> <h2><img alt="a pc on a desk" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article6_image01.jpg" style="margin-left:6px;margin-right:6px;float:left;width:50%;" />CMMI and NAVAIR References</h2> <div>To publicize the processes and measures used on the B-2 program, I authored or co-authored two publications that are relevant today. I was a visiting scientist at the Carnegie Mellon University/Software Engineering Institute and published Technical Note CMU/SEI-2002-TN-016, October 2002, “Using CMMI to Improve Earned Value Management.” I was a contributor to the Naval Air Systems Command (NAVAIR) handbook, Using Software Metrics and Measurements for Earned Value Toolkit, October 2004. Both documents are useful tools in implementing the DE Strat.</div> <h2>Conclusion</h2> <div>If the DE Strat is successfully implemented, and if the status of the digital artifacts in the ASOT is used to inform the PM of schedule performance and the degree of product quality, the PM will be able to take corrective actions more quickly. If the schedule performance data is automatically transferred to the PM’s scheduling system instead of being manually entered, program costs will be reduced and the accuracy of that data will increase. <hr />SOLOMON received the David Packard Excellence in Acquisition Award in 1998 for his effort on the Integrated Program Management Initiative Joint Team that developed the first EVMS standard. 
He also received a letter of appreciation from Sen. John McCain for his “continued efforts in working to improve our acquisitions process.”<br> <br> The author can be contacted at <a class="ak-cke-href" href=""></a>. <h6>The views expressed in this article are those of the author alone and not the Department of Defense. Reproduction or reposting of articles from Defense Acquisition magazine should credit the author and the magazine.</h6> </div> </div></div>
Data Rights Marking Sleuths (June 15, 2022)<div class="ExternalClass633B6800AAD347D7A0BF5FC573B9BF3D"><h2>The Case Begins</h2> We would like to explain a very curious and difficult detective case that we accepted with the Department of Defense (DoD). On one unusually hot and sticky day in San Diego, we received a phone call from a DoD customer asking us to help solve a case regarding data rights markings on noncommercial technical data and computer software source code (henceforth referred to as data). Initially, this case seemed too boring and convoluted, but since business was slow and the offer was good, we decided to take the case.<br> <br> First things first, we have resolved a few cases with improper data rights markings in the past. We have even acquired a cool computer software tool to search and identify improper data rights markings. Before we get into the specifics of some of the complicated cases we undertook, we would like to provide some clues regarding data rights markings that result in legends being placed on data that is delivered to the DoD.<br> <br> Under copyright law, the author, identified in a simple copyright marking, holds the copyright to the material. If that material was created under a government contract, the copyright would be subject to a government license under that contract.<br> <br> For DoD contracts, generally the originator (contractor) of the data has sole ownership of that data and protects that intellectual property, since it is the lifeblood of the contractor’s company and a major source of revenue and profit. Contractors protect their intellectual property by placing the proper data rights markings on all data delivered under a procurement contract. These markings thus restrict the ability of the DoD to use or share the data. 
The Defense Federal Acquisition Regulation Supplement (DFARS) is the authoritative document that provides the exact format and language on how to mark specific types of data.<br> <br> Authorized markings are provided in the following sections of the Code of Federal Regulations:<br> <br> <strong>48 CFR § 252.227-7013–Rights in technical data–Noncommercial items.<br> 48 CFR § 252.227-7014–Rights in noncommercial computer software and noncommercial computer software documentation.<br> 48 CFR § 252.227-7018–Rights in noncommercial technical data and computer software–Small Business Innovation Research Program.</strong><br> <br> DFARS defines two types of improper data rights markings—nonconforming and unjustified:<br> <br> When a contractor places a restrictive data rights marking on noncommercial technical data or noncommercial computer software that it delivers or otherwise furnishes to the DoD, and that is based on a data rights restriction authorized by the contract, but not in the format authorized by the contract, it is deemed to be a “nonconforming” marking. Refer to DFARS 252.227–7013(h) (2) and DFARS 252.227–7014(h) and Table 1.<br> <br> An “unjustified” data rights marking is a marking that does not accurately depict restrictions applicable to the government’s use, modification, reproduction, release, performance, display, or disclosure of the marked technical data or computer software. Refer to DFARS 227.7103-12(b)(1) and DFARS 227.7203-12(b)(1) and Table 1. <table border="1" cellpadding="1" cellspacing="1" style="width:75%;"> <caption>Table 1. 
Improper Data Rights Markings</caption> <thead> <tr> <th scope="col">NONCONFORMING</th> <th scope="col">UNJUSTIFIED</th> </tr> </thead> <tbody> <tr> <td>A marking that does not match the exact wording and formatting for noncommercial technical data/noncommercial computer software as specified by DFARS.</td> <td>If the contractor or subcontractor includes a data rights marking on a deliverable that is noncommercial technical data or noncommercial computer software that is in the correct format but inappropriately restricts the government’s use, it is an unjustified marking.</td> </tr> <tr> <td>Markings such as “Proprietary” or “Company Confidential” are not markings/legends authorized by DFARS.</td> <td>If the contractor is entitled to assert Government Purpose Rights in a deliverable that is noncommercial computer software source code, perhaps because it was developed with mixed funding, and the contractor includes the Restricted Rights marking on that noncommercial computer software source code, this is an unjustified marking.</td> </tr> </tbody> </table> <h2>The Investigation</h2> A fundamental part of solving any mystery is careful investigation, including gathering and examining clues. In data rights markings cases, clues can include contracts, data rights assertion lists, the marked documents, emails, records of conversations, and licenses.<br> <br> <img alt="Figure 01" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article5_figure1.jpg" style="width:100%;" /> <h5><em>Source of figure and tables:</em> The authors</h5> <br> A key part of conducting any investigation is having the right tools. Part of the reason that we have been able to solve so many data rights markings cases in a relatively short time is our unique software search tool, IpScan. As shown in Figure 1, the tool’s operator prompts it to run a scan through the desired set of files (e.g., a group of source code files). 
In a matter of minutes the tool can read tens of thousands of software source code files. It can also handle documents in most well-known formats like Word, PDF, and text files. While reading software source code files or documents, it checks for all potential data rights markings and then outlines in a red box any of those that are deemed as cautionary (e.g., “Restricted Rights” or “Proprietary”). Then it organizes the markings for each file or document into an easy-to-read report where the tool operator can preview each potential data rights marking and click on the file in question. It also groups the potential data rights markings so that the operator can see which markings appear together—(e.g., if a particular company copyright always appears with a Restricted Rights marking, the report will group those files together).<br> <br> Without this tool we could still solve data rights cases, but significantly more time would be required to sift through the various markings using just a traditional file searching tool at the risk of reduced accuracy. In the case at hand, we used IpScan to check the computer software source code files and immediately found the nonconforming (and potentially unjustified) Restricted Rights marking in Table 2 on 229 of the files.<br> <br> <img alt="a board with post it notes" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article5_image01.jpg" style="margin-left:6px;margin-right:6px;float:left;width:50%;" />This was problematic because our client stored its data in the Department of Defense Information Repository (DoD IR), and the client needed to release those files to authorized users. These Restricted Rights markings would prevent such release. 
Per DFARS 252.227-7014(b)(3), the terms of Restricted Rights in noncommercial computer software mean that the government has no right to redistribute or share the software source code with other contractors (which does not include covered contractors and subcontractors).<br> <br> In a perfect world, a copy of the contract and assertion list would clarify from the beginning exactly what data rights the government holds in the software source code files. But in an imperfect world, the contracts and assertion lists are not always available to help resolve these issues. In some cases, the contracts are just too old and we cannot locate the Contracting Officers. In other cases, the contracts exist but are extremely difficult to obtain, particularly for clients like the DoD IR that are not part of a program office. In these cases where we must proceed without a copy of the contract or assertion list, the data rights must be deduced from the markings.<br> <br> Another key part of investigations is that detectives must never lose sight of the big picture. Clues must always be looked at as pieces of a bigger, unknown puzzle. In this case, the IpScan also found two contradictory data rights markings on a file named “License.txt” as shown in Table 3: the same nonconforming Restricted Rights marking from Table 2 along with a nonconforming Government Purpose Rights (GPR) marking.<br> <br> This new clue told us that we needed to further investigate all of those Restricted Rights markings to see if they were also unjustified. Because of that single GPR marking, we went back to the program office, showed them our findings, and asked them to help us get to the bottom of these strange markings. They came back a few weeks later with a memorandum from the prime contractor affirming that the government had GPR to all of the files that the subcontractor had marked as Restricted Rights. <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption>Table 2. 
Nonconforming Restricted Rights Marking</caption> <thead> <tr> <th scope="col">Restricted Rights</th> </tr> </thead> <tbody> <tr> <td>#!bin/bash</td> </tr> <tr> <td># © 2019</td> </tr> <tr> <td># Data Rights (LICENSE.txt): RESTRICTED RIGHTS</td> </tr> <tr> <td># Notices (NOTICES.txt): ITAR</td> </tr> <tr> <td># Security Classification: UNCLASSIFIED</td> </tr> </tbody> </table> <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption>Table 3. Contradictory Data Rights Markings</caption> <thead> <tr> <th scope="col">Restricted Rights</th> </tr> </thead> <tbody> <tr> <td>#!bin/bash</td> </tr> <tr> <td># © 2019</td> </tr> <tr> <td># Data Rights (LICENSE.txt): RESTRICTED RIGHTS</td> </tr> <tr> <td># Notices (NOTICES.txt): ITAR</td> </tr> <tr> <td># Security Classification: UNCLASSIFIED</td> </tr> <tr> <td><strong><span style="background-color:#D3D3D3;">DoD Contractor</span></strong></td> </tr> <tr> <td>GOVERNMENT PURPOSE RIGHTS</td> </tr> <tr> <td>Contract No:</td> </tr> <tr> <td>Contractor Name:</td> </tr> <tr> <td>Contractor Address:</td> </tr> <tr> <td>The government’s rights to use, modify, reproduce, release, perform, display, or disclose these technical data are restricted by paragraph (b)(2) of the Rights in Technical Data—Noncommercial Items clause contained in the above identified contract. No restrictions apply after the expiration date shown above. Any reproduction of technical data or portions thereof marked with this legend must also reproduce the markings.</td> </tr> </tbody> </table> <br> Our client will be extremely pleased because this means that they can publish 100 percent of the source code in the DoD IR. Otherwise, they would have been able to publish only 75 percent of it. And to think that we could have missed this essential clue had it not been for our trusty scanning tool, which scanned all 829 source code files. 
Just imagine a case where hundreds of thousands of files need to be manually reviewed!<br> <br> To close this case, we are still working to get a memorandum from the subcontractor affirming the GPR marking and stating that they incorrectly marked the files as “Restricted Rights.” In addition, we are also still trying to get a copy of the contract to confirm our findings with the assertion list and clear up this mystery once and for all. Stay tuned. <h2>The Case of the MELPe Files</h2> Our customer then directed us to perform the same investigation on the Mixed Excitation Linear Prediction enhanced (MELPe) software source code. Again, as usual, we started with IpScan. This time we scanned a set of 1,711 computer software source code files, measuring 45,135 source lines of code (SLOC), that were part of a reference development platform critical to developing and testing tactical radios and waveforms. The scan revealed four restrictive markings, occurring repeatedly in 119 source code files. Those markings included both DFARS and commercial markings; a useful feature of the IpScan tool is that it finds not just DFARS data rights markings but also commercial copyright notices (Table 4).<br> <br> The data rights markings concerned MELPe, a voice-encoding technology that allows human speech to traverse the narrow radio frequency (RF) communications channels used by the military. It was developed from an earlier technology, MELP. Military RF links must deal with signal dropouts induced by noise, weather, and unpredictable motion of radios in helicopters, trucks, and infantry packs, without affecting voice quality. Specified for key tactical waveforms, MELPe aims at near cellphone quality in a military environment.<br> <br> In Table 4, the Company X Copyright (Marking 1) and the Company Y Copyright (Marking 2) give the first impression of typical commercial copyright legends, not of a technology produced under a government contract. 
Of course, the files could have been delivered outside the contract deliverables or could be third-party copyrighted materials inserted into a deliverable. Marking 3 appears to be a DFARS marking, but no contract was identified and no expiration date was given. Copyright markings 1 and 2 are acceptable for commercial computer software source code, but not for computer software produced under a government contract, which we had thought was the origin of MELPe.<br> <br> <img alt="magnifying glass" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article5_image02.jpg" style="margin-left:6px;margin-right:6px;float:left;width:50%;" />Standard practice is to protect the data rights of the owner of the data, even if the markings are improper. Absent valid DFARS markings, the case for government rights to the data becomes questionable and the case for company intellectual property becomes plausible. Other developers could not be provided these files and could not be put under contract, possibly locking the government into a sole-source situation.<br> <br> Normally, when unjustified or nonconforming markings are found, the standard DFARS procedure is for the contracting officer to require the contractor to correct the markings. In this case, we had expected to see a government contract identified, but it wasn’t there. There was no contracting officer to contact in order to challenge the marking.<br> <br> This set of data rights markings provided no assurance that the government had obtained GPR or equivalent data rights to the newer MELPe computer software, and this did not align with expectations for a technology developed for widespread government use.<br> <br> To help determine whether the government had data rights in the newer MELPe software, we contacted government and industry experts in the fields of tactical waveforms and speech coding. 
In most data rights cases, the contracting officer will determine the course of action, and in all cases we recommend that legal counsel be fully engaged. The actions shown in Table 5 were taken over seven arduous months of sleuthing, and they revealed a series of clues concerning the timelines and circumstances behind development of the MELPe technology. <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption><strong>Table 4. Markings in MELPe Source Code</strong></caption> <tbody> <tr> <td><strong>Company X<br> Copyright (Marking 1)</strong></td> <td>Speechcoder ANSI-C Source Code<br> SC1200 1200 bps speech coder<br> Fixed Point Implementation Version 7.0<br> Copyright © 2000,<br> All rights reserved.</td> </tr> <tr> <td><strong>Company Y<br> Copyright (Marking 2)</strong></td> <td>2.4 kbps MELP Proposed Federal Standard speech coder<br> Fixed-point C code, version 1.0<br> Copyright © 1998,<br> has intellectual property right on the MELP algorithm. The contact for licensing issues for commercial and non-government use is , Director, Government Contracts, (phone )</td> </tr> <tr> <td><strong>Marking 3</strong></td> <td>Contract No: N/A<br> Contractor Name:<br> Contractor Address:<br> Expiration Date: None<br> The government’s rights to use, modify, reproduce, release, perform, display, or disclose these technical data are restricted by paragraph (b)(2) of the Rights in Technical Data—Noncommercial Items clause contained in the above identified contract. No restrictions apply after the expiration date shown above. Any reproduction of technical data or portions thereof marked with this legend must also reproduce the markings.</td> </tr> <tr> <td><strong>Marking 4</strong></td> <td>** Copyright Status: This material is declared a work of the U.S. Government and is not subject to copyright protection in the United States. 
Distribution and handling may be limited from public disclosure.</td> </tr> </tbody> </table> <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption><strong>Table 5. Intellectual Property/Data Rights Detective Checklist</strong></caption> <tbody> <tr> <td style="text-align:center;"><strong>✓</strong></td> <td><strong>Obtain NATO Standardization Agreement (STANAG) 4591 - MELPe</strong></td> <td>Publicly declares U.S. Government paid for intellectual property rights in MELPe developed by Company Y (but does not address Company X intellectual property).</td> </tr> <tr> <td style="text-align:center;"><strong>✓</strong></td> <td><strong>Contact at U.S. Government Lab</strong></td> <td>Obtained key scientific publications dated 2000 and 2002 by authors associated with various U.S. companies including Company Y and Company X, and the U.S. Government.</td> </tr> <tr> <td style="text-align:center;"><strong>✓</strong></td> <td><strong>Contact at University</strong></td> <td>Learned that MELPe technology relies on MELP as its basis. Learned that the government paid Company Y for rights in MELP in the late 1990s in order to put the MELPe work on contract. Obtained government memo conveying MELPe data rights purchased from the company to NATO governments.</td> </tr> <tr> <td style="text-align:center;"><strong>✓</strong></td> <td><strong>Contact at University</strong></td> <td>Learned that Company X bought the company that was awarded the government’s MELPe contract, under which inventions were made and patented. Obtained MELPe Patent Numbers.</td> </tr> <tr> <td style="text-align:center;"><strong>✓</strong></td> <td><strong>Research Patent and Trademark Office</strong></td> <td>Learned patents for MELPe were assigned to Company X. 
Each patent declared that the invention was made under DoD contract MDA904-98-C-A857 and that the government had rights to the invention.</td> </tr> </tbody> </table> <br> After figuring out the circumstances around development of the MELPe technology, the markings on the 119 files could be addressed (Table 6).<br> <br> Because we discovered that the patents declare government rights in the MELPe invention, DoD Legal Counsel phoned Company X and presented the problem: although there is ample evidence that MELPe was developed under contract, the markings reference neither the contract nor the government’s rights to the files. The inference is that proper data rights markings were never put into files developed under a government contract. The GPR marking we expected to accompany Marking 1 (Company X Copyright) was simply missing. <table border="1" cellpadding="3" cellspacing="3" style="width:75%;"> <caption>Table 6. Addressing Markings</caption> <thead> <tr> <th scope="col">Marking</th> <th scope="col">Justified</th> <th scope="col">Conforming</th> <th scope="col">Notes</th> </tr> </thead> <tbody> <tr> <td><strong>Company X<br> Copyright (Marking 1)</strong></td> <td style="text-align:center;"><strong>✓</strong></td> <td style="text-align:center;">No</td> <td>Company X can legally mark its copyright under DFARS, but there is no declaration of Government Purpose Rights and no mention of DoD Contract MDA904-98-C-A857.</td> </tr> <tr> <td><strong>Company Y<br> Copyright (Marking 2)</strong></td> <td style="text-align:center;"><strong>✓</strong></td> <td style="text-align:center;">No</td> <td>Company Y’s marking does not contest government/DoD use, and other sources provide ample evidence of Government Rights to the IP, but there should be a better indication of government data rights in the legend.</td> </tr> <tr> <td><strong>Marking 3</strong></td> <td style="text-align:center;">No</td> <td style="text-align:center;">No</td> <td>No contract mentioned, 
no expiration date, unjustified.</td> </tr> <tr> <td><strong>Marking 4</strong></td> <td style="text-align:center;">No</td> <td style="text-align:center;">No</td> <td>Contradicts Company X Copyright (Marking 1), unjustified.</td> </tr> </tbody> </table> <br> Seven months after the files arrived and were scanned, Company X responded to us with an email acknowledging that the government may use, modify, and release the source code for government purposes. This was a major break in the case. Because the contractor did not put the proper data rights markings into the source code, it took us seven months to obtain evidence sufficient to allow government use of the files. Based on our findings, the government signed an Intellectual Property Rights memo, now included with the MELPe source code, to let authorized companies know that they may use it for government purposes. Thanks to our efforts, our customer can now allow authorized contractors to use, modify, and upgrade the source code for MELPe, removing barriers to competition and reducing sustainment costs for this important military technology. The case also illustrates that markings for contract deliverables should be checked for compliance with DFARS clauses before acceptance. <h2><img alt="case closed" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article5_image03.jpg" style="margin-left:6px;margin-right:6px;float:left;width:50%;" />Case Closed</h2> Once again, our trusty IpScan helped us to close another case. Even though this last case took nearly seven months to wrap up, it could have taken a great deal longer without our tool to search through the 45,135 SLOC and reveal the data rights markings discussed above. Without this tool, far more effort would have been spent simply obtaining the markings, leaving less time to devote to the investigation. 
As illustrated, to be an effective data rights marking sleuth, you need both the right tools and the right knowledge and skills, including a comprehensive understanding of the DFARS and the ability to conduct a proper investigation. Both cases demonstrate how an effective data rights search tool, combined with that skill set, can resolve data rights marking mysteries.<br> <br> While we wrote this article as “independent detectives,” we actually work for the Joint Tactical Network Center (JTNC) in San Diego, California, where we do this type of detective work on data rights markings every day. We have an experienced team and the tools to assist your program with data rights markings and ensure their proper resolution.<br> <br> For more information on obtaining a copy of the IpScan tool, contact the JTNC at <a class="ak-cke-href" href=""></a>. This JTNC-developed tool is freely available from the DoD IR and can be used by DoD personnel and contractors. <hr />FRANK is a computer engineer in San Diego at the JTNC with 30 years’ experience in software development and operations, information assurance, and radio frequency communications. He has a Bachelor of Science degree in Electrical Engineering from the University of Kansas.<br> <br> HARRIS is a professor of Program Management at DAU in San Diego, California. He has more than 20 years of experience in defense acquisition. He holds a Master of Science degree in Systems Engineering from Johns Hopkins University as well as a Master’s in Business from the Florida Institute of Technology.<br> <br> SANTIAGO is a Data Rights Analyst in San Diego for the JTNC. She is experienced in the practice of law and data rights. 
She has a Bachelor of Arts in English from the University of San Diego and a Juris Doctor from the Thomas Jefferson School of Law in San Diego.<br> <br> The authors can be contacted at <a class="ak-cke-href" href=""></a>, <a class="ak-cke-href" href=""></a>, and <a class="ak-cke-href" href=""></a>. <h6>The views expressed in this article are those of the authors alone and not the Department of Defense. Reproduction or reposting of articles from Defense Acquisition magazine should credit the authors and the magazine.</h6></div>
Is It Time for the Iron Square? (June 1, 2022)<div class="ExternalClass848F787DCD914F239AA3B476B52A3596">The one-star program director’s steel-blue eyes pierced into this then-young captain who was in his first program management assignment. Then, in a raspy voice, the brigadier general said, “Dave, do you know the secret of successfully managing a program? It’s the Iron Triangle: cost, schedule, and performance. If you balance those three, you’ll do well in this business.”<br> <br> That was a lifetime ago. However, I fondly remember that moment like it was yesterday. And for nearly 30 years, that is exactly the approach that I have followed and preached. Until now. No longer will we, or can we, guarantee war-winning capability by simply balancing the Iron Triangle. There’s a fourth dimension that deserves equal billing: security. If we fail to focus on it from Day One of a program’s life cycle, it could render our efforts fruitless.<br> <br> Is it time for the Iron Square? Yes! Cost, schedule, performance, and security.<br> <br> Speed is important. Speed is critical. Congress passed and the President signed the Fiscal Year (FY) 2016 National Defense Authorization Act (NDAA) with its expanded use of urgent capacity authority (Sec. 803), expanded use of Other Transaction Authority (Sec. 815), and, most important, the “Middle Tier of Acquisition” (MTA) authority (Sec. 804) for reducing bureaucracy and accelerating prototyping and fielding. Since then, the defense acquisition community has increasingly become aware of and is pursuing new, innovative technologies at the “speed of relevance.” MTA programs take advantage of these precepts to prototype and field war-winning technologies faster for our Warfighters. But what happens if all that great technology is usurped by our near-peer competitors? It’s like buying state-of-the-art entertainment electronics for your home but not buying a lock for the front door. 
In time, someone else will have that great equipment.<br> <br> <img alt="security iconography" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article4_image01.jpg" style="margin-left:6px;margin-right:6px;float:left;width:25%;" />The importance of security is prominently noted in the Department of Defense Instructions (DoDI) for our traditional acquisition pathway, now known as Major Capability Acquisition. DoDI 5000.85’s policy section notes that the “DoD will prioritize speed of delivery, security, continuous adaptation, and frequent modular upgrades to ensure a highly effective and lethal force.” “Yes,” to speed of delivery and adding the latest innovative technologies, but also a huge “YES” to baking in security! The Iron Square will be a tough balance, as a keen eye on security may add cost and schedule for development and testing. However, finding that balance is critical.<br> <br> While security can mean different things to different people, let’s focus on two known threats from our near-peer competitors, specifically using China as our example: intellectual property (IP) theft and cybersecurity. IP theft has been a known issue with the People’s Republic of China (PRC) for many years from a commercial perspective. It was a major factor in the imposition of additional tariffs by the Trump administration, and it remains a concern for the Biden administration. While the efficacy of certain counter tactics, such as tariffs, can be debated, there is a national acknowledgment that the IP theft problem exists.<br> <br> As early as 2012, the then director of the U.S. National Security Agency, Gen. Keith Alexander, described the Chinese IP theft issue as the “greatest transfer of wealth in history.” More recently, former Secretary of Defense Mark Esper called China’s practices “the greatest intellectual property theft in human history.” A number of sources estimate that the private sector loses between $225 billion and $600 billion per year due to IP theft. 
William Evanina, former Director of the National Counterintelligence and Security Center, has said, “That’s like taking $4,000 to $6,000 annually from every family of four in America.”<br> <br> Chinese courts have recently issued injunctions that globally block U.S. companies from suing for patent violations, even providing for fines of roughly $1 million per week to be imposed on U.S. companies that don’t withdraw their IP theft lawsuits. How does that affect the DoD and the defense industry? Two major concerns are commercial-military transfer and counterfeit parts.<br> <br> The expanded Other Transaction Authority (OTA) provisions (Sec. 815) and the MTA authorization (Sec. 804) in the FY 2016 NDAA have rapidly increased the use of OTAs to encourage innovative, non-traditional companies (i.e., private, commercial firms) to provide war-winning technology for our Warfighters. While we continue seeing benefits from these innovative acquisition strategies and increased non-traditional company participation, barriers remain that continue to discourage commercial companies’ engagement with the DoD, including concerns about profit margins and even anti-military sentiments on the part of their employees. For example, Google refused to continue artificial intelligence work on Project Maven after 4,000 Google employees signed a petition opposing work with the DoD. This is America, and private companies can make that decision. Not so in China. <blockquote> <p style="text-align:center;">IF YOU KNOW THE ENEMY AND KNOW YOURSELF, YOU NEED NOT FEAR THE RESULT OF A HUNDRED BATTLES.<br> —Sun Tzu</p> </blockquote> <br> The Chinese Communist Party instituted an aggressive, national strategy in 2015 known as Military-Civil Fusion (MCF) to enable the PRC to develop the most technologically advanced military by eliminating barriers between China’s civilian research and commercial sectors and its military and defense industry. 
China’s President Xi Jinping in 2017 said, “We should ensure that efforts to make our country prosperous and efforts to make our military strong go hand in hand. We will strengthen unified leadership, top-level design, reform, and innovation. We will speed up implementation of major projects, deepen reform of defense-related science, technology, and industry … and build integrated national strategies and strategic capabilities.”<br> <br> While those goals remain aspirational, China’s authoritarian government dictates the implementation of top-down policies, compliance, and state resources for long-term industrial planning and investments, which gives China a strategic path for its MCF initiatives.<br> <br> China already has the distinct advantage that its top defense companies are heavily involved in commercial enterprises (only between 20 percent and 38 percent of their revenue is generated by defense), whereas U.S. defense industry leaders, such as Lockheed Martin, gain most of their revenues from defense work (between 56 percent and 96 percent). This disparity provides China a comparatively easier transition of advanced commercial technology into defense products than is the case for the United States.<br> <br> So, what is the security concern about the combined threat of Military-Civil Fusion and the current heavily leveraged commercial duality within the PRC’s top defense firms? Add IP theft to the equation and you have a triple-headed monster. Without a solution to IP theft, any leading-edge, war-winning technologies usurped by China will directly and necessarily increase the lethal power of the People’s Liberation Army (PLA). 
The solution to IP theft must also cover our commercial firms, which, until recently, were not required to report such thefts and volunteered only the information they chose to share.<br> <br> Vital Signs 2022, the National Defense Industrial Association’s annual report on the health and readiness of the defense industry, states: “The Defense Industrial Base (DIB) faces sustained and increasing threats of intellectual property theft, economic espionage, and ransomware hacks among other security breaches.” The DIB relies on its intellectual property for its profitability. Lack of IP protection can negatively influence industry’s willingness to invest in research and development (R&D) and to venture into commercial activities.<br> <br> The good news reported in Vital Signs is that the number of new FBI investigations into IP theft has steadily declined since 2011. These investigations include counterfeiting, a prominent concern due to America’s global supply chain. The U.S. Intellectual Property Commission, citing U.S. Customs and Border Protection data, reported in 2017 that 87 percent of the counterfeit goods seized by Customs officials originated in China. The DoD has been aggressively combating counterfeit parts for nearly a decade. In 2012, the Assistant Secretary of Defense for Sustainment published a memorandum titled “Overarching DoD Counterfeit Prevention Guidance,” acknowledging that “counterfeit items are a serious threat to the safety and operational effectiveness of DoD systems.” While threats to IP remain a concern for the defense industry, the greater contributor to industry’s failing security score of 50 (as reported in Vital Signs) is the threat to information security (i.e., cybersecurity), which scored a dismal 20.<br> <br> The cybersecurity threat cannot be overstated. 
DoDI 5000.90, “Cybersecurity for Acquisition Decision Authorities and Program Managers,” notes that “the Department must inculcate cyber security into all aspects of the DAS [Defense Acquisition System] and operations.” The bold visual of cybersecurity’s expansion across the Adaptive Acquisition Framework in DoD instructions underscores our need to focus on this area of critical vulnerability. Almost daily, news accounts illuminate the depth and breadth of attacks on commercial institutions. We have seen how aggressively Russia used cyber attacks prior to its invasion of Ukraine. Our defense industry and government agencies are not immune. Vital Signs 2022 reports, “Known cybersecurity vulnerabilities continue to rise at a very high rate. New cybersecurity vulnerabilities have seen a 263 percent increase since 2016.”<br> <br> According to the National Security Agency, “One of the greatest threats to U.S. National Security Systems (NSS), the U.S. Defense Industrial Base (DIB), and DoD information networks is Chinese state-sponsored malicious cyber activity.” Chinese state-sponsored cyber attackers aggressively target a wide range of industries, academia, and medical institutions to advance China’s long-term economic and military development objectives. 
In addition to theft of intellectual property, another critical advantage our near-peer competitors derive from cyber attacks is the ability to collect intelligence on our defense systems and capabilities.<br> <br> In the roughly 2,500-year-old words of Sun Tzu, “If you know the enemy and know yourself, you need not fear the result of a hundred battles.” Vulnerabilities in our cybersecurity can lead to just such a situation, and the easiest way for our near-peer competitors, like China and Russia, or even a rogue nation such as Iran or North Korea, to gain insight into our future innovations is by attacking our most vulnerable partners: small and medium businesses.<br> <br> As former Under Secretary of Defense for Acquisition and Sustainment Ellen Lord stated, “We know that the adversary looks at our most vulnerable link, which is usually six, seven, eight levels down in the supply chain.” One recent DoD initiative, the Cybersecurity Maturity Model Certification (CMMC) 2.0 program, aims to reduce that risk. The CMMC framework institutes three levels of certification that indicate the maturity and reliability of each defense company’s cybersecurity infrastructure to protect our critical war-winning information.<br> <a href="/event/Think%20Differently%20Speed%20and%20Security" target="_blank"><img alt="also see DAU webcast" src="/library/defense-atl/DATLFiles/May-June2022/SEEDAUWEBCAST.jpg" style="float:left;width:25%;border-width:0px;border-style:solid;" /></a><br> CMMC is intended to be a starting place and should provide the foundation for our defense industry partners to provide continual monitoring and agile responses to evolving threats. 
As DoDI 5000.90 explains, “Designs and architectures must address technology and cyber threat evolution to maintain mission effectiveness beyond the near term.” Although the goal of CMMC is to be cost-effective and affordable for small business, you can be sure that achieving security will be a balancing act.<br> <br> Security must move above its current standing as an annex of the Program Protection Plan; it is foundational to maintaining our Warfighters’ technological edge. It must become part of an Iron Square of cost, schedule, performance, and security—an equal concern to be balanced in pursuit of a successful acquisition program.<br> <br> Is it time for program managers to value security as much as cost, schedule, and performance? Is it time for America’s near-peer competitors to be thwarted in their efforts to usurp our innovative, war-winning technologies? The answer is a resounding YES! It is time for the Iron Square! <hr />Riel is a DAU Professor of Acquisition Management who instructs future program managers. He previously worked with the U.S. Air Force and industry over a 25-year period.<br> <br> The author can be contacted at <a class="ak-cke-href" href=""></a>. <h6>The views expressed in this article are those of the author alone and not the Department of Defense. Reproduction or reposting of articles from Defense Acquisition magazine should credit the author and the magazine.</h6></div>
An Acquisition Speed Manifesto (May 29, 2022)<div class="ExternalClassDF28B13CEC534CA28143B4402D1DA34A">We hear it all the time: We need to speed up our DoD acquisition processes and get capabilities fielded more rapidly. With great power competition, the urgency to act increases significantly. The Adaptive Acquisition Framework provides the Department of Defense (DoD) with new authorities, pathways, and flexibilities that can enable greater speed. However, we have significant work to do, as evidenced by an eroding technological edge over our potential adversaries. <br> <br> DoD can learn valuable speed lessons from the commercial sector. Investment in new commercial technology dwarfs the amount that DoD spends each year, including the investments from traditional defense companies. With these new commercial technologies come new methods and techniques that may be relevant in our domain. <br> <br> We saw this with Agile DevSecOps, which combines software development (Dev), security (Sec), and operations (Ops). After several successful pilot programs, DoD incorporated the software pathway into the acquisition system. As the accompanying quote from Dave Girouard suggests, industry understands the importance of speed because it provides a competitive edge. The value of speed also applies to acquisition, and we must incorporate it or face significant risks in future conflicts. Perhaps a manifesto can help generate greater speed. <br> <br> What is the purpose of a manifesto? A manifesto can be a very powerful tool to articulate a new vision. For example, 17 software developers met in 2001 to discuss software development methods. The software experts were pursuing an alternative to the existing software practices that they regarded as ineffective. The group produced a document known as the Agile Manifesto. It identifies four key values and 12 principles that its authors believe software developers should use to guide their work. 
<br> <br> That manifesto has had an incredible impact throughout the software development community over the last two decades. Many believe that it accelerated an industry paradigm shift in software development that continues to evolve. If you read the manifesto, you will see that it is still very relevant. <br> <br> The following is the proposed “Acquisition Speed Manifesto.” It follows a format similar to that of the Agile Manifesto, with over-arching values followed by principles. The manifesto does not provide the “how to do it” because it is not intended to be that prescriptive. As in the case of the Agile Manifesto, if we adopt the values and implement the principles, the results will follow. <blockquote> <p style="text-align:center;">Speed is the ultimate weapon in business. All else being equal, the fastest company in any market will win. Speed is a defining characteristic—if not the defining characteristic—of the leader in virtually every industry you look at. <br> —DAVE GIROUARD<br> CEO OF UPSTART (A PERSONAL FINANCE STARTUP)<br> AND FORMER PRESIDENT OF GOOGLE ENTERPRISE APPS</p> </blockquote> <h2>We value:</h2> <h3>Capabilities over programs.</h3> <p>Many have advocated for a portfolio-management-based acquisition model, including the Section 809 Panel. As we look at mission portfolios and evolving operational needs, we should prioritize investments that fill mission gaps at the expense of individual programs. This highlights the importance of more effective portfolio management versus the current incentives associated with a program-centric acquisition model.</p> <h3>Trust over doubt.</h3> <p>The importance of trust as it relates to speed is not a new revelation. In 1943, Kelly Johnson, chief engineer at Lockheed, developed a jet fighter to counter the growing threat from Germany in World War II. He led a small team that designed and built from scratch the XP-80 Shooting Star in just 143 days. 
One of his management rules was “Mutual trust between the military and the contractor with very close cooperation and liaison on a day-to-day basis.”<br> <br> Those familiar with Stephen M.R. Covey’s book, The Speed of Trust: The One Thing That Changes Everything, may recall that a lack of trust results in slow bureaucratic processes and higher costs. On the other hand, providing and sustaining trust can be a game changer for many organizations, both in industry and in the public sector. Trusting others in a compliance-oriented system requires a culture change at all levels. Individuals and organizations earn trust based on their expertise and credibility. We should accord trust to those who have demonstrated that they are worthy of it. </p> <h3>Informed opportunity management over risk aversion. </h3> <p>We must value risk-taking but be sure to make informed decisions and manage the risks. Think of the behaviors of start-up companies and experienced commercial-sector innovators. They often must take high risks in order to succeed. We can view risks as opportunities because accepting risk is what creates the potential for a future return on investment. </p> <h3>Meaningful performance over cost.</h3> <p>We must not sacrifice high-quality systems, and we must be willing to pay a premium where speed and performance are priorities. Speed should not lead to poor quality or degraded performance. Faster work will require greater discipline. But we do not seek enhanced performance that provides little or no operational benefit. </p> <h3>Empowerment over bureaucracy.</h3> <p>Empowerment must be an integral element of our culture and processes. It does not mean delegating everything and reducing leadership involvement. On the contrary, it may involve greater coaching, support, and investment from leaders. There is no sustained speed without empowerment. </p> <h2>We adhere to these principles:</h2> <h3>1. 
Acquisition strategies must consider and implement alternatives that incorporate greater speed. </h3> <p>Speed must be a key element of a program’s strategy. Acquisition strategies determine priorities, incentives, risks and opportunities, business arrangements, pathways, and other key factors. If going faster is the priority, then this over-arching consideration should drive acquisition strategies. Program managers have several tools and techniques to drive faster contracting and capability delivery. Some examples include schedule concurrency, greater use of commercial off-the-shelf or government off-the-shelf items, rapid prototyping, streamlined contracting procedures, flexible business arrangements, and use of authorities within the Adaptive Acquisition Framework pathways. Not every program strategy needs to incorporate speed, but all should consider the possibilities! </p> <h3>2. Leaders must create and sustain acquisition execution and decision support systems based on a foundation of trust.</h3> <p>Trust enables speed and can lower costs. Think of all the systems and processes in acquisition that exist solely because of a lack of trust. These checks slow programs and communicate distrust throughout the enterprise. Just like the concept of designing in quality at the beginning of a product life cycle, we must incorporate trust behaviors in our acquisition decision systems from start to finish. These include processes such as governance, decision reviews, financial management, and requirements. </p> <h3>3. We will implement robust and continuous opportunity management.</h3> <p>Industry can teach us about the value of opportunity management. Capitalizing on high payoff opportunities is not a random occurrence. Industry allocates significant effort to manage and sometimes even create new opportunities. 
While we have policy guidance on opportunity management included in the DoD Risk, Issue, and Opportunity (RIO) Management Guide of January 2017, how many programs actually implement an effective opportunity management process and make it a priority? </p> <h3>4. The requirements process must emphasize speed and flexibility in all “swim lanes and pathways.”</h3> <p>It is no longer realistic to lock down requirements and expect them to remain stable over several years. We know that requirements will change. Therefore, we must develop and manage requirements processes that can accommodate rapid changes and accelerated delivery. </p> <h3>5. Speed will be included as a key objective in strategic plans.</h3> <p>Organizations should recognize that significant change takes time and effort. Greater speed can involve modifying processes, techniques, and organizational goals, which should be part of a strategic plan. Leaders should establish clear objectives that measure and track speed in all processes.</p> <h3>6. We must obtain financial management flexibility that enables rapid pivots to new capabilities.</h3> <p>Lack of funding flexibility is a big issue because existing financial rules severely limit the rapid resource shifts needed to respond to new threats and opportunities. The acquisition environment has witnessed unprecedented change in recent years, and our rules and procedures must change if we want to maintain a decisive edge on the battlefield.</p> <h3>7. People are our greatest asset. They will deliver if we provide them the necessary vision, tools, and support.</h3> <p>Leaders at all levels can inspire their people to meet the new imperative. However, it will not happen unless we give them the opportunity to succeed. 
<br> <br> <img alt="overhead photo of people on a street" src="/library/defense-atl/DATLFiles/May-June2022/DefAcqMag_May-June22_article3_image01.jpg" style="width:90%;" /></p> <h2>The Way Ahead</h2> <p>There is no easy solution for speed, and we have many obstacles to overcome: cumbersome processes, lack of financial management flexibility, a program-centric (rather than portfolio-based) model, risk aversion, inadequate strategies, poor planning, and other issues. We can overcome these obstacles with a commitment to adopting the right values and principles. <br> <br> The intent of this notional manifesto is to start a conversation. What is missing? What is included that does not belong? What should we change? <br> <br> As we embark on this journey, we should consider the example of an athlete in training. The athlete prepares for future events by devoting significant training time to fundamentals and various drills to build skill, strength, and endurance. This training may start with simple movements such as stretching and short exercises. Over time, the athlete gains strength, greater skill, and increased speed. If the athlete pushes too hard, too soon, an injury becomes more likely and could wipe out the benefit of all previous effort. <br> <br> DoD acquisition can follow a similar model, starting with small fundamental steps such as pilot programs and then growing to more advanced sprints! Since we did not start yesterday, the time to start the journey is now. </p> <hr /> <p>SCHULTZ is a professor of Program Management and an executive coach in DAU’s Capital and Northeast Region at Fort Belvoir, Virginia. <br> <br> The author can be contacted at <a class="ak-cke-href" href=""></a>. </p> <h6>The views expressed in this article are those of the author alone and not the Department of Defense. 
Reproduction or reposting of articles from Defense Acquisition magazine should credit the author and the magazine.</h6></div>
