
A Portfolio Management-Based Acquisition Model?


Defense acquisition defines success based on cost, schedule and performance (c/s/p) goals as part of a program-centric model. Program managers (PMs) commit to achieving these c/s/p parameters when they sign the acquisition program baseline, similar to a management “contract” with the program executive officer and the milestone decision authority. This program-centric paradigm has been in place for decades and drives the behaviors in all aspects of defense acquisition, including requirements, funding, policy and acquisition decisions. 



Is this still the right model in today’s environment? Are we investing in the right things and reacting fast enough to change? The short answer is no, and it is imperative that we change to a portfolio-based acquisition model! As part of this change, we also should define what success looks like and how to measure it.

Before examining this portfolio management model, we should review some background. Efforts to reform acquisition have continued for the last several decades with numerous studies, panels, boards and legislative actions. Despite the many criticisms and attempts at reform, the defense acquisition system has delivered technologically superior weapon systems since the end of the Vietnam War. Our defense systems’ capabilities were a key factor in the successful conclusion of the Cold War in the early 1990s. Weapons superiority deters adversaries and enables war-winning capabilities when deterrence fails.

Attempts to measure the overall performance of the defense acquisition system concluded that performance in terms of c/s/p was improving based on the annual Performance of the Defense Acquisition System reports from 2013 to 2016. The 2016 report stated the 5-year moving average of cost growth on our largest and highest-risk programs was at a 30-year low. While this appears to be great news, the 2016 report also highlights how the primary outcome of defense acquisition is the value of operational capabilities. The report goes on to explain how it is difficult to measure value to the Warfighter so the focus of the study defaults back to the more easily quantified measures of c/s/p. 

Recently, senior leaders have expressed concerns about our eroding technological edge. The costs of decades of Middle East conflict have limited our ability to recapitalize aging fleets and develop new systems. Meanwhile, our near-peer competitors have been investing in new capabilities and deploying new systems without the burden of prolonged conflicts and global operations. Moreover, the availability of commercial-based technology and intellectual property theft have enabled new and emergent threats. The consensus is that, in support of our new national security priorities and major power competition, defense acquisition must go faster and deliver warfighting capability with greater value. In order to achieve faster cycle times and deliver better outcomes, some observers, including the Section 809 Panel on DoD market-based adaptability, have recommended a major acquisition reform known as portfolio management. 



Portfolio management is a strategic management process that starts with an enterprise-level identification of needs and opportunities. These needs, or capability gaps for the Department of Defense (DoD), are then prioritized based on urgency, funding and other constraints. When a prioritized list is available, portfolio managers develop business cases for alternative product ideas to exploit each of the highest-priority needs and/or opportunities. Each alternative product proposal gets a thorough review via a gated process. At each successive gate review, managers review product proposals against others in the portfolio, weighing them against resources, criteria and the objectives of the enterprise. As alternatives pass through the gate reviews, only those proposals with the highest potential to succeed and pay off make the cut into the overall portfolio. Thus, portfolio management shifts the paradigm from optimizing individual programs to optimizing a portfolio of investments that show the greatest value in meeting enterprise objectives.
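A gate review of this kind can be sketched in code. Everything below — the proposal names, value scores, cost figures and screening rules — is a purely illustrative assumption for demonstration, not an actual DoD process or dataset:

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    name: str
    cost: float   # assumed cost estimate, $M (illustrative)
    value: float  # assumed mission-value score, 0-100 (illustrative)

def gate_review(proposals, min_value, budget):
    """Sketch of one gate: screen out low-value proposals, then fund
    the strongest survivors (by value per dollar) within the budget."""
    survivors = [p for p in proposals if p.value >= min_value]
    survivors.sort(key=lambda p: p.value / p.cost, reverse=True)
    selected, spent = [], 0.0
    for p in survivors:
        if spent + p.cost <= budget:
            selected.append(p)
            spent += p.cost
    return selected

candidates = [
    Proposal("sensor upgrade", cost=40, value=80),
    Proposal("new platform", cost=120, value=90),
    Proposal("legacy mod", cost=30, value=20),  # screened out at the gate
]
portfolio = gate_review(candidates, min_value=50, budget=200)
```

The point of the sketch is the paradigm: proposals compete against one another for a shared budget at each gate, rather than each program being optimized in isolation.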

The concept of portfolio management is not new to DoD, but it is not fully integrated across the DoD acquisition decision support systems. The Government Accountability Office’s Report 07-388 in 2007 highlighted the differences between industry and DoD investment decision making. The report also recommended that the DoD adopt several of the commercial best practices for portfolio management. The DoD agreed with the majority of the recommendations and identified several pilot initiatives undertaken to address them. The GAO report also mentioned how the “[military S]ervices fight together as a joint force but separately identify investment needs and allocate resources using fragmented processes that do not support an integrated, portfolio management approach.” More recently, GAO Report GAO-15-466, issued in 2015, continued to recommend a stronger portfolio management model. DoD partially concurred with the recommendations but took issue with the suggested implementation steps. Finally, the Section 809 Panel report of January 2019 identified portfolio management as a priority for reform, recommending not only a change in investment processes but a shift away from the decades-old program-centric acquisition model.

The Section 809 Panel report recommended that the portfolio management approach include a new organizational construct for requirements, funding and acquisition management responsibilities. The report provided significant content on how to implement this new model, which may seem like a radical reform for some, given its wide-ranging ramifications to an entrenched bureaucracy. 

We should recognize that the Joint Capabilities Integration Development System (JCIDS) process for requirements attempts to use some elements of portfolio management across Functional Capability Boards in broad mission areas. We also see portfolio management used in the information technology domain. However, implementing this broader change across acquisition decision support systems (requirements, funding and acquisition) would arguably constitute the most significant acquisition reform ever attempted. The change is so dramatic that the risk of severe and unintended consequences is significant enough to warrant some baby steps before a wider implementation. The portfolio model change would also require significant legislative authorizations, which might be difficult to sell to Congress. Thus, the 809 Panel recommended establishment of a pilot program in each military Service that would take responsibility for a portfolio of programs.

Piloting portfolio management within one Service does not fit well with the overall concept. If we link portfolio management to mission capabilities, then a better pilot program approach would be one that includes appropriate Joint Service participation. This would enable the portfolio manager to optimize a set of capabilities for mission effectiveness, as opposed to optimizing individual Service program portfolios. 

The Section 809 Report suggests establishing a smaller, less visible portfolio manager for requirements, funding and acquisition decisions within the assigned pilot portfolio. Appropriate legislative action will be crucial because current funding statutes and rules constrain the movement of funding from one program to another. These funding rules are a significant obstacle to implementing a portfolio-based model since rapid resource shifts will be necessary in order to optimize the assigned portfolio and exploit opportunities. Without this expanded authority to shift funding and priorities, portfolio management cannot work in today’s dynamic environment. These funding shifts, also known as reprogramming actions, are subject to several fiscal-law and regulatory constraints that must be reviewed and changed to support a real portfolio management model. The 809 Report addresses other details, including new organizational structures, proposed congressional language and new decision processes. However, one area in the report that lacks much discussion is defining and measuring portfolio-management success.

Earlier, we touched on the c/s/p metrics that gauge success. There are three fundamental issues with continued use of c/s/p in a program-centric model. First, c/s/p metrics do not allow investment managers to optimize a portfolio of investments that contribute to mission capabilities. Optimizing individual programs for c/s/p can be detrimental to the larger mission portfolio, which may require a rapid shift in resources at the expense of an individual program. Second, as mentioned before, c/s/p metrics do not provide insight into the value the program delivers to operators. A program could meet all of its c/s/p metrics but provide little or even negative value in light of changing requirements. Finally, c/s/p metrics do not support rapid shifts in acquisition because the system incentivizes continued, stable funding and avoids new requirements in order to avoid a baseline breach. The acquisition program baseline is one of the fundamental management tools in program-centric acquisition, locking down requirements, cost estimates and schedule dates. PMs must report breaches up the chain, even to Congress in some cases when significant or critical deviations occur. These breaches often indicate poor management performance or technical problems that could put future funding for the program at risk.
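As an illustration of how cost deviations are classified, the sketch below encodes the statutory Nunn-McCurdy unit-cost thresholds (roughly 15 percent growth over the current baseline or 30 percent over the original baseline for a significant breach, and 25/50 percent for a critical breach). The function itself is a simplified teaching example, not an official reporting tool:

```python
def breach_severity(current_growth, original_growth):
    """Classify unit-cost growth against the Nunn-McCurdy thresholds.
    Growth is expressed as a fraction of the baseline estimate
    (e.g., 0.15 means 15% growth). Simplified illustration only."""
    if current_growth >= 0.25 or original_growth >= 0.50:
        return "critical"     # triggers congressional reporting and review
    if current_growth >= 0.15 or original_growth >= 0.30:
        return "significant"  # reportable up the chain
    return "none"
```

Note how the thresholds are framed entirely in c/s/p terms: a program can avoid any breach while still delivering little operational value, which is exactly the gap the author identifies.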

Adopting portfolio management can help overcome these issues, but the magnitude of the change will be dramatic. It also will involve a cultural change since leadership would need to adopt new metrics and incentives.

So if c/s/p goes away, what measures replace them? The measures we choose obviously matter a great deal; as the saying goes, “what gets measured gets done” (attributed to many sources). Establishing the right metrics will drive the right behaviors and also help decision makers determine adjustments and investment priorities. Measures of success that focus on outputs (e.g., c/s/p) may indicate some level of efficiency but not the value provided. For defense acquisition, the default measure should start with the value of operational capabilities delivered. Thus, the measure of success for portfolio performance should be value metrics in support of operational mission improvements. The good news is that we now have the tools and technology to make it happen.

Mission engineering, joint simulations, virtual operations and digital engineering are examples of tools that can enable greater insights into the value of acquisition deliverables. Mission engineering, also addressed in the Section 809 Panel report, provides an integrated view of missions and supporting capabilities that can enlighten acquisition decisions within an operational context. An enterprise-mission architecture can provide the foundation for future solution architectures and fielded capabilities. The DoD also has adopted a digital engineering strategy that will formalize the development, integration and use of models to inform enterprise and program decision making. These models can help future portfolio managers look beyond individual systems and Service-specific portfolios for broader mission capability and mission gap assessments, ultimately leading to delivery of capabilities that provide greater operational value. Figure 1 provides a summary of the paradigm shift from the current acquisition model to portfolio management.

In order for mission engineering and other tools to support the portfolio approach, the requirements and acquisition communities must collaborate and work in a more integrated manner. Acquisition expertise provides the technical, engineering and business know-how, while requirements managers provide the mission expertise. We might even consider integrating the two communities into the new portfolio-management structure. Rather than focus on specific program requirements, portfolio managers will develop capability solutions that fill gaps in mission scenarios. This mission enterprise view across system and mission domains can enable portfolio managers to minimize or eliminate redundancies and low-value efforts.

The software development community provides interesting insights on value-based metrics. Organizations that use agile software development and deliver software to users in short, consistent cycles are dropping the program-centric approach. While there are several variations of agile methods, one of the common threads is discovery of the biggest pain (or value) points. Addressing the big pain points can provide users with the greatest benefit so that they receive the highest priority and team focus, putting other needs in backlog for future consideration. Portfolio management can enable identification of mission pain points through multiple tools and activities, including robust mission engineering. 

Programs that employ DevSecOps for software also measure performance with both efficiency and outcome-based measures. Efficiency measures include software deployment frequency, deployment speed, backlog velocity, mean time to recover, and cyber intrusion detection and prevention rates, to name a few. Efficiency measures provide useful insights, but leaders should ensure their use in conjunction with value-based metrics. Otherwise, we might incentivize teams to become efficient but lose sight of the results that efficiency produces. Examples of outcome-based metrics include customer satisfaction, user acceptance or rejection rate, user productivity improvements, mission effectiveness enhancements, and many others that relate to value and return on investment. Metrics associated with rapid prototyping might include time to deliver early knowledge points, cycle time to build virtual prototypes, number of failures and lessons learned, and time to mature prototypes into fieldable capabilities.
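One simple way to keep efficiency measures "in conjunction with" value-based metrics is a weighted blend of the two. Everything in the sketch below — the metric names, target values and the 70/30 weighting toward outcomes — is an assumed example, not an established DoD or DevSecOps scoring scheme; for simplicity, all metrics here are of the "higher is better" kind:

```python
def achievement(metrics, targets):
    """Average fraction of target achieved across metrics, capped at 1.0.
    Assumes every metric improves as it increases."""
    return sum(min(metrics[k] / targets[k], 1.0) for k in targets) / len(targets)

def portfolio_score(efficiency, eff_targets, outcomes, out_targets,
                    outcome_weight=0.7):
    """Blend efficiency and outcome scores, weighting delivered value
    (outcomes) above raw process efficiency."""
    return ((1 - outcome_weight) * achievement(efficiency, eff_targets)
            + outcome_weight * achievement(outcomes, out_targets))

# Hypothetical measurements and targets, for illustration only.
eff = {"deploys_per_month": 8, "backlog_velocity": 50}
eff_targets = {"deploys_per_month": 10, "backlog_velocity": 40}
out = {"user_acceptance_rate": 0.95, "mission_effectiveness_gain": 0.05}
out_targets = {"user_acceptance_rate": 0.95, "mission_effectiveness_gain": 0.10}

score = portfolio_score(eff, eff_targets, out, out_targets)
```

The design point is the weighting: a team that is highly efficient but short on mission outcomes scores lower than one that delivers value, which is the incentive shift the paragraph above argues for.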

Final Thoughts

Some of us who have been working in acquisition for decades can relate to Dangerfield’s quote about the frequent inability to bring about change. There is, however, renewed optimism that change is not only possible but is beginning to occur. Even some of us old-timers are excited about the possibilities and opportunities. The DoD acquisition community has demonstrated great proficiency at optimizing programs and we can be even better at optimizing portfolios. The time is ripe for a portfolio-management acquisition model. Let’s get the pilot effort rolling! 


Schultz is a professor of Program Management and an executive coach in the Defense Acquisition University’s Capital and Northeast Region at Fort Belvoir, Virginia.

The author can be reached at [email protected].
