Performance Based Logistics (PBL) Management


Alternate Definition

PBL Management is facilitated by the use of the 12-step Product Support Strategy Process Model, primarily in relation to Step 12, Implement and Assess. The model encompasses the major activities required to implement, manage, evaluate, and refine product support over the life cycle. It is not a one-time process, but rather a continuing, iterative process in which the sustainment of a system is adapted and evolved to optimally support the needs and requirements of the Warfighter in an effective and affordable manner. The DoD PBL Guidebook provides information on the specific activities within each of the model's 12 steps, and focuses on the "how" of PBL arrangement development and execution.

General Information

PBL Definition
PBL is synonymous with performance-based life cycle product support, where outcomes are acquired through performance-based arrangements that deliver Warfighter requirements and incentivize product support providers to reduce costs through innovation. These arrangements are contracts with industry or intra-governmental agreements.

A PBL arrangement is not synonymous with Contractor Logistics Support (CLS). CLS signifies the "who" of providing support, not the "how" of the business model. CLS is support provided by a contractor, whether the arrangement is structured around Warfighter outcomes with associated incentives or not. PBL arrangements, on the other hand, are tied to Warfighter outcomes and integrate the various Integrated Product Support (IPS) Elements into a cohesive product support package with appropriate incentives and metrics. In addition, PBL focuses on combining the best practices of both Government and industry.

Product Support Business Model
The DoD Product Support Business Model (PSBM) was developed to assist the Program Manager (PM) and Product Support Manager (PSM), who must be tightly aligned, with the numerous supportability considerations and trade-offs that take place during the development and fielding of a weapon system. The PSBM, which is described in detail in the PSM Guidebook, defines the hierarchical framework and methodology through which the planning, development, implementation, management, and execution of product support for a weapon system component, subsystem, or platform will be accomplished over the life cycle. The model seeks to balance weapon system availability with the most affordable and predictable total ownership cost. PBL arrangements are a mechanism for accomplishing this task in a manner that shares performance risk between a Government buying activity and Government or commercial Product Support Integrator (PSI) and/or Product Support Provider (PSP). A properly designed PBL arrangement will align the provider's and Government's goals through the proper application of incentives.

PBL Management 
The 12-step Product Support Strategy Process Model shown below is used to implement system-, subsystem-, or component-level PBL arrangements. The steps may be performed in a different order, repeated, or deleted depending on the life cycle phase and program requirements. PBL management factors arise primarily in relation to Step 12 of the model, Implement and Assess, and are discussed below.

Step 12 - Implement and Assess
Tracking performance is a critical part of PBL management, so PBLs cannot be a "fire and forget" endeavor. As discussed above, the two tenets of PBL management are establishing and aligning top level desired outcomes, and performance reporting and continuous improvement focus.

Tenet #1 - Establish and Align Top Level Desired Outcomes
Focusing on the desired outcome is the core of PBL. Performance-based arrangements are measured against their ability to directly meet or support planned Materiel Availability (Am), Reliability, O&S Cost, and other sustainment metrics.

During the acquisition and sustainment phases of a program, metric use - specifically the Sustainment Key Performance Parameter (KPP) - is spelled out in multiple documents, including but not limited to the following:

The life cycle metrics called out in the above policy documents include:

As programs progress from acquisition and fielding to sustainment, these basic metrics are used to monitor and measure performance. In addition to these mandated metrics, the PSM may also adopt a variety of metrics to provide insight into specific performance areas.

The recently published update to DoDI 3110.05, Sustainment Health Metrics in Support of Materiel Availability, identifies materiel availability (Am) and operational availability (Ao) as the two superordinate metrics that assess the effectiveness of the DoD sustainment enterprise. DoDI 3110.05 also introduces a new metric, Cost Per Day of Availability (C/DA), which is described as the superordinate metric that assesses the efficiency of the DoD sustainment enterprise. The calculations for Am and Ao differ slightly from the JCIDS Manual, but as additional policy and guidance updates are published, the differences between these formulas should be resolved. The new C/DA metric is related to, but not identical to, other O&S Cost metrics and applies to the population of end items associated with Ao, termed the Primary Mission Asset Inventory.
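Assuming the common JCIDS-style formulations (the exact DoDI 3110.05 calculations differ slightly, as noted above), the availability metrics can be sketched as follows. The function names and the C/DA form are illustrative assumptions, not official definitions:

```python
def operational_availability(uptime_hours: float, downtime_hours: float) -> float:
    """Ao: fraction of time a fielded system is ready for tasking.
    Common formulation: Uptime / (Uptime + Downtime)."""
    return uptime_hours / (uptime_hours + downtime_hours)

def materiel_availability(capable_end_items: int, total_end_items: int) -> float:
    """Am: operationally capable end items over the total end item population."""
    return capable_end_items / total_end_items

def cost_per_day_of_availability(os_cost: float, available_days: float) -> float:
    """C/DA sketch: O&S cost divided by the days of availability delivered
    (illustrative arithmetic only, not the official DoDI 3110.05 formula)."""
    return os_cost / available_days

# Example: 170 of 200 aircraft mission capable; 900 hours up, 100 hours down
am = materiel_availability(170, 200)          # 0.85
ao = operational_availability(900.0, 100.0)   # 0.9
```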

While the government and the support provider should work as a team to determine the required performance outcomes, the DoD is very specific about measuring and reporting on these performance outcomes.

What Should You Measure?
With literally hundreds of different things that can be measured, how do you know which ones you should use for your PBL program? Appendix F of the PBL Guidebook includes a listing of approximately 100 metrics that are commonly included in PBL arrangements. MIL-STD 260, Reference Data for Logistics Metrics, is another good resource. The listing includes product support metrics focused on the potential operational outcome, but also includes metrics aligned with the 12 IPS Elements and suitability attributes captured within sustaining engineering. The PSM Guidebook also provides some commonly used measures. Whatever metrics you select, always remember: smart managers don't just measure things because they can—they pick the factors that will have the biggest impact on improving a program's performance.

Metrics assigned to a support provider are measured by Key Performance Indicators (KPIs). To use them to help reach desired goals, the metrics must be well-defined and quantifiable, and they should be selected or constructed to encourage performance improvement, effectiveness, efficiency, and innovation. The KPIs must be appropriate to the metric assigned. There is no perfect metric or KPI, but selecting appropriate, complementary ones will promote the desired behavior and outcome while minimizing unintended consequences. Effective metrics ensure PSI and PSP activities are aligned with the Warfighter mission, contribute to meeting Warfighter requirements, deliver an on-time, quality product, and reduce (or avoid) cost.

Rule of 5
When determining top-level desired outcomes, adopting "the rule of 5" is a good general guideline to limit the number of top-level metrics to 5 or fewer. The reason? Focus. Having too many metrics makes it hard for the support provider to focus on what is truly important.

Metrics are clearly aligned to desired outcomes 
It's not enough simply to have a few critical metrics; those metrics need to be focused on achieving the right things. It is tough to run a business without clear objectives. As such, the second best practice in selecting PBL metrics is to ensure that the metrics are clearly aligned to the desired outcomes. As Yogi Berra put it, "If you don't have a goal, any road will get you there." Picking metrics that are not aligned to your desired outcomes will take you down the wrong road.

Ideally, the metrics that the government and the support provider select should be focused on achieving Warfighter requirements. However, there is more to it than that. The metrics need to be tied to requirements and to the operational role of the system, and must be synchronized with the scope of the support provider's responsibilities. The support provider should not be held contractually liable for achieving performance against a metric over which they have no control. For example, an overall platform-level metric of Am is not appropriate when the support provider only controls the airframe and not the engine or other key subsystems. Nor should a support provider be held accountable for overall availability when there is a workshare agreement with a depot, unless the provider has an enforceable contractual relationship with the depot for their portion of the work.

While metrics need to align with the scope of the support provider's responsibilities, it is still important to measure them at the system level to provide insight into overall performance. This multi-tiered metric and measurement approach should help provide the PM and PSM with end-to-end performance insights for management purposes.

Data sources are accurate and timely
Another best practice when selecting metrics is ensuring that data sources are accurate and timely. If a metric is right for the program, but accurate or timely data is not readily available, the team might still want to use it, especially if it will be possible to collect the data in the future. The government may be able to identify the data needs within the contract or intra-governmental agreement, such that the support provider will develop, collect, and provide the required data downstream.

Achievement of metrics validated by a mutually agreed Quality Assurance (QA) Approach 
It is also important to be able to assess selected metrics with a QA Surveillance Plan (QASP) or other method of QA monitoring. The QA approach must be mutually agreed upon, and often includes sampling or audit requirements. The support provider is responsible for ensuring the quality of all work performed and the government is responsible for surveillance and monitoring. A typical QA approach addresses:

  • What gets measured, when, and by whom
  • Processes to identify and address quality issues
  • QA monitor(s)

QA is a continuous activity designed to determine if the work being performed meets or exceeds the quality performance standards. The goal is to prevent substandard work, rather than catch it after the fact. The rigor of the QA process should match the needs of the program; it should be a major element in program management and control, focusing on insight rather than oversight, and the QA monitor(s) should be independent of the work being measured.

Five QA approaches can be used to validate achievement of desired outcomes. These are:

  • Random Sampling
  • Periodic Sampling
  • Trend Analysis
  • Customer Feedback
  • Third Party Audits

The program needs to ensure it has the resources to monitor the reporting management process because simply reporting on these measurements will not ensure the quality standards are being maintained. President Ronald Reagan's signature phrase is appropriate here: the program must "trust but verify" QA results.
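Two of the sampling approaches listed above lend themselves to a short sketch. The function names and record format are illustrative assumptions, not drawn from any QASP:

```python
import random

def random_sample(records, k, seed=None):
    """Random sampling: audit k records chosen uniformly at random."""
    return random.Random(seed).sample(records, k)

def periodic_sample(records, interval):
    """Periodic sampling: audit every Nth record."""
    return records[::interval]

# Example: select a month of delivery lines for QA audit
deliveries = [f"line-{i:03d}" for i in range(100)]
spot_checks = random_sample(deliveries, 5, seed=42)   # 5 randomly chosen lines
cycle_checks = periodic_sample(deliveries, 20)        # line-000, line-020, ...
```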

Clear understanding of financial impact of metrics 
The best practice approach is to tie the provider's incentives directly to their delivery of the desired performance. This can be done with various types of pricing models or incentives.

Tenet #2 - Performance Reporting and Continuous Improvement Focus
While most organizations agree performance measurement is important, performance reporting is vital to the success of a PBL arrangement. Support providers need to understand their performance so they can proactively manage it and make cost/profit tradeoffs, while the government needs to understand the program's performance to determine if it is fulfilling its mission. One important area to measure is supply chain performance against key performance indicators such as Materiel Availability and cost. It is imperative that management adopt solid performance management practices.

Regular review cycles 
In addition to the formal QA metrics and reports required by the government, a good PBL program will establish formal periodic reviews with the PBL stakeholders. Reporting is often relegated to the QA monitor or contracting officer's representative. Ideally, however, reviews should include all key stakeholders, including the program office, key "customer" representatives, and representatives from the contracting community. Likewise, the support provider should include a cross-functional mix of stakeholders in the review as well.

While a support provider should only be contractually liable to measure what is in the scope of their control, best practice programs use cross-functional reviews to measure the end-to-end performance. This means there should be reporting from all program participants, industry and government alike, in order to capture the big picture and enable sound government management. The best practice approach puts together all the pieces of the puzzle in order to obtain a landscape view of total performance.

Metrics reports are used as part of regular review meetings
Best practice organizations develop performance and cost reports regularly and frequently to help them proactively manage their programs. At a minimum, a best practice PBL team should have internal, formal performance metrics reviews monthly, and should execute working level reviews as often as necessary to exercise adequate oversight of critical operational metrics.

Drill down capability 
Metrics should be used to drill down to find the root causes of performance and to allow targeted process change. In the case of PBL, it is imperative to make sure that the Warfighter requirements are translated into "shop-floor" metrics. Contributions to key performance metrics need to be identified at the working level, such that the support provider can effect positive change against those metrics. You are far more likely to experience successful process improvement if the people doing the work feel like they have ownership and become stakeholders in improving the process.

Metrics are posted and communicated to entire team 
Another best practice is posting metrics reports so that all team members can read them. Performance against the support provider's KPIs should be visible throughout the organization to foster awareness and ownership of requirements. Some examples of best practices include posting performance against KPIs on the company intranet and shared team spaces. This keeps employees in the loop and cognizant of performance status and fosters a sense of accountability.

Fully automated dashboards 
A fully automated dashboard helps promote drill down functionality and root cause analysis. While root cause analysis can be done without an automated metrics reporting tool or dashboard, more and more support providers invest in automated solutions to help them collect, manage, and report on metrics.

One tool that is emerging among best practice PBL programs is a "scorecard" to show performance toward the achievement of goals. As a reporting tool, a scorecard can be an important part of an overall performance management system. The concept of scorecards has been around since the 1890s when French process engineers invented the concept with their "Tableau de Bord" (dashboard). However, few companies utilized the concept until Kaplan and Norton popularized the idea with their "Balanced Scorecard" in the 1990s. Since then, a host of technology companies have made automated scorecards a reality, and some of the more progressive PBL organizations use automated dashboards to help them manage their business - with some programs even linking to key suppliers and the depot to mine data for tracking overall program performance.

Continuous Improvement 
The philosophy of PBL is to develop a business model that promotes performance improvement—and that means instituting a solid performance management approach aimed at achieving the desired outcomes. It's one thing to have good performance metrics—but if you don't have a culture to rigorously drive performance improvements, it is likely your PBL will only get so-so results.

Continuous improvement philosophies have been around for decades. Walter Shewhart, who coined the term "Statistical Process Control (SPC)," laid the foundations of continuous improvement in the 1920s, striving to make these techniques accessible to first-level operators. SPC is an industry-standard methodology for measuring and controlling quality during the manufacturing process: quality data in the form of product or process measurements are obtained in real time during manufacturing. Today, continuous improvement programs take many forms, ranging from rigorous approaches - such as Lean Six Sigma, which relies on a collaborative team effort to improve performance by systematically removing waste and reducing variation - to less demanding validation and verification approaches.
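A minimal sketch of the Shewhart control-chart idea behind SPC, assuming the conventional three-sigma limits (illustrative only):

```python
from statistics import mean, pstdev

def control_limits(samples):
    """Shewhart-style limits: center line plus/minus 3 standard deviations."""
    center = mean(samples)
    sigma = pstdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples):
    """Flag measurements falling outside the control limits."""
    lcl, _, ucl = control_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]

# Example: a stable process with one anomalous measurement
readings = [10.0] * 20 + [20.0]
print(out_of_control(readings))  # [20.0]
```

Points flagged this way are candidates for root cause analysis, tying SPC directly to the drill-down practice described earlier.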

Regardless of which continuous improvement philosophy your organization chooses, some best practices are common across all good programs. These best practices are addressed below.

Supplier is clearly incentivized and authorized to use continuous improvement 
The culture of an organization is a key element in determining the effectiveness of its performance management and continuous improvement program. Organizations that encourage ownership and facilitate change and improvement succeed, while companies that punish for non-performance or mistakes encourage an atmosphere of "tell management or the customer what they want to hear." PBL programs need to encourage and give the support provider the authority to plan for and implement continuous product and process improvements.

Government PMs should recognize that a key step to a sound PBL program is the creation of a positive environment where change and improvement are rewarded. In some PBL programs, the PBL arrangement incentivizes the supplier to undertake formal continuous improvement or to bring proactive ideas that would improve performance or costs. Many best practice PBL programs have used incentives linked to cost savings. For example, some contracts have a 50/50 cost share split: when the industry support provider realized cost savings for their customer through continuous improvement efforts, they were rewarded with an incentive fee equal to 50% of the savings. While not all PBL programs can or should provide this type of incentive, all best practice PBL programs should encourage and give the support provider the authority to plan for and implement some type of continuous product and process improvement effort.
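A 50/50 cost-share incentive like the one described above reduces to simple arithmetic; the function and dollar figures below are illustrative, not drawn from any specific contract:

```python
def incentive_fee(baseline_cost: float, actual_cost: float,
                  provider_share: float = 0.5) -> float:
    """Provider keeps its share of realized savings; no fee if costs overrun."""
    savings = max(baseline_cost - actual_cost, 0.0)
    return savings * provider_share

# $10M baseline, $9.2M actual: $800K savings, $400K fee to the provider
print(incentive_fee(10_000_000, 9_200_000))  # 400000.0
```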

A formal continuous improvement program 
All PBL support providers should have some form of formal continuous improvement effort that effectively drives improvements against the top-level desired outcomes. For some organizations, detailed process improvements such as SPC or Lean Six Sigma may be too formal or complex or include unattainable goals. For example, within a distribution center environment, a typical Six Sigma metric would be 3.4 late lines per million lines scheduled. This may require years of shipments without a single error—a goal so ambitious that it may not be appropriate for the order fulfillment process.

On the other hand, the overarching goals of these methods can often provide value. In the case of SPC, the goal is to analyze a process or its outputs so as to take appropriate actions to achieve and maintain a state of statistical control and to improve the process capability, while Lean Six Sigma is focused on optimizing processes by eliminating waste and reducing variation. The challenge and benefit is finding the right goals and metrics to incentivize improvement. 
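The 3.4-defects-per-million figure cited above can be checked with simple arithmetic; the monthly shipment volume is an illustrative assumption:

```python
def dpmo(defects: int, opportunities: int) -> float:
    """Defects per million opportunities, the standard Six Sigma rate."""
    return defects / opportunities * 1_000_000

# At 1,000 scheduled lines per month, 3.4 DPMO allows roughly one late
# line every 294 months (about 24.5 years): 1_000_000 / 3.4 / 1_000 ~ 294
print(round(dpmo(1, 294_000), 1))  # 3.4
```

This is why the text notes the goal may require years of error-free shipments: at realistic volumes, the allowed defect count per year rounds to zero.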

Continuous improvement plan supported by investment plan 
Continuous improvement efforts should be supported with a formal process for investing in the improvements that are identified. There is nothing less motivating to employees than asking them to come up with improvements that never get implemented. Not all improvement opportunities can be implemented due to time and cost constraints so continuous improvement plans should be supported by a formal investment planning process. This allows the best ideas to be prioritized and funded on a regular basis so that the program can effectively realize the improvement potentials. Improvements could be anything with a positive return on investment, for example, meeting targets in process or product efficiencies such as increased reliability.

Metrics Aligned to Suppliers 
As mentioned above, DoDD 5000.01 states that the PM shall work with the users to document performance and support requirements in performance agreements that specify objective outcomes, measures, resource commitments, and stakeholder responsibilities. Performance arrangements can take the form of contracts with industry or intragovernmental arrangements such as Memorandums of Agreement, Memorandums of Understanding, and Service Level Agreements, as appropriate.

Best practice dictates that formal arrangements be established with all of the key PBL organizations. They serve two primary purposes:

  1. To capture the details of the business relationship(s) prior to getting into the formal contracting process
  2. To capture relationship details when a contract will not apply – such as for arrangements with the Defense Logistics Agency (DLA) or a depot

If the government is the PSI, then the role of aligning the various entities becomes part of the government's management responsibilities. However, if the PSI is a contractor, the role of aligning the various PSPs falls on the shoulders of the contractor.

Research has revealed that very few support providers attempt to cascade or flow down top level outcomes to sub-suppliers and various PSPs. Typically, the sub-supplier contracts are transaction-based business models whereby payments are made based on consumed resources or specific activities performed by the supplier. Additionally, support providers often have arrangements with other government agencies (i.e., DLA or a depot) that do not specify alignment with the overall performance requirements. In short, alignment of the PSI and PSPs is very important, but not yet an institutionalized practice. Developing outcome based PSAs with all PBL participants would enhance end-to-end requirements flow down and performance measurement efforts.

Performance management should not be treated as something nice to have: it is an essential component of a support provider's management process and fundamental to the success of a PBL program. A good PBL program should adopt best management practices at all tiers of the PSBM and within all associated arrangements in order to foster success at all levels.


Click here to view a video on PBL Best Practices, featuring perspectives by Ms Lisa P. Smith, DASD(PS), and two Service practitioners describing successful PBL arrangements at the subsystem/component and system (platform) level.