
Functional Configuration Audit (FCA)

AETM 017


The formal examination of functional characteristics of a configuration item, or system, to verify that the item has achieved the requirements specified in its functional and/or allocated configuration documentation.

Alternate Definition

An audit that formally examines whether each configuration item meets the functional characteristics stated in its item performance specification. For the system as a whole, the audit examines whether the system's functional characteristics achieve the requirements in the allocated baseline.

General Information

The following text is from the 2022 DoD Systems Engineering (SE) Guidebook, Section 3.6 System Verification Review / Functional Configuration Audit. The SVR and FCA can be conducted together, hence their treatment together in the guidebook and here.

The Functional Configuration Audit (FCA) is the technical audit during which the actual performance of a system element is verified and documented to meet the requirements in the system element performance specification in the allocated baseline. Further information on FCA can be found in MIL-HDBK-61 (Configuration Management Guidance). The System Verification Review (SVR) is the technical assessment point at which the actual system performance is verified to meet the requirements in the system performance specification and is documented in the functional baseline. SVR and FCA are sometimes used synonymously when the FCA is at the system level.

When a full system prototype is not part of the program’s Acquisition Strategy, the FCA is used to validate system element functionality. Other system-level analysis is then used to ascertain whether the program risk warrants proceeding to system initial production for Initial Operational Test and Evaluation (IOT&E). Verification of system performance is later accomplished on a production system.

An SVR/FCA is mandatory per DoDI 5000.88, Section 3.5.a. A successful SVR/FCA reduces the risk when proceeding into initial production for the system to be used in IOT&E. The SVR/FCA is used to:

  • Assess whether system development is satisfactorily completed.
  • Review the completed documentation or digital artifacts of the Verification Process for completeness and adequacy.
  • Assess the results of developmental test to provide evidence of verification and readiness to proceed to the next phase and IOT&E with acceptable risk (see T&E Enterprise Guidebook).
  • Confirm that the product baseline meets the requirements of the functional baseline and therefore has a high likelihood of meeting the warfighter requirements documented in the CDD or equivalent requirements documentation.

Roles and Responsibilities

The unique PM responsibilities associated with an SVR/FCA include:

  • Approving, funding, and staffing the SVR/FCA as planned in the Systems Engineering Plan (SEP) developed by the Systems Engineer.
  • Establishing the plan through the Production Readiness Review (PRR) in applicable contract documents, including the Systems Engineering Management Plan (SEMP), Integrated Master Schedule (IMS), and Integrated Master Plan (IMP).
  • Ensuring the SEP includes Subject Matter Experts (SMEs) to participate in each technical review/audit.
  • Continuing to control appropriate changes to the product baseline (see Section 4.1.6 Configuration Management Process of the DoD SE Guidebook).

The unique Systems Engineer responsibilities associated with an SVR/FCA include:

  • Developing and executing the SVR/FCA plans with established quantifiable review criteria, carefully tailored to satisfy program objectives.
  • Ensuring the pre-established technical review/audit criteria have been met.
  • Ensuring all requirements in the system performance specification have been verified through the appropriate verification method and have been appropriately documented.
  • Verifying configuration items (CIs) and software CIs have achieved the requirements in their specifications.
  • Verifying that cybersecurity controls have been implemented as defined in the Security Technical Implementation Guides (STIGs).
  • Ensuring technical risk items associated with the verified product baseline are identified and analyzed, and mitigation plans are in place.
  • Monitoring and controlling the execution of the SVR/FCA closure plans.
  • Ensuring adequate plans and resources are in place to accomplish the necessary technical activities between SVR, PRR, and Physical Configuration Audit (PCA); these plans should allow for contingencies.

Inputs and Review Criteria

The SVR/FCA criteria are developed to best support the program’s technical scope and risk and are documented in the program’s SEP. Table 3-6 identifies the products and associated review criteria normally seen as part of the SVR/FCA. The Systems Engineer should review this table and tailor the criteria for the program. The system-level SVR/FCA review should not begin until the criteria, identified by the Systems Engineer and documented in the SEP, are met and any prior technical reviews are complete and their action items closed. A resource for SVR preparation is IEEE 15288.2, "Standard for Technical Reviews and Audits on Defense Programs," which reflects best practices for this review.

Table 3-6. SVR/FCA Products and Criteria


System Verification Review (SVR) / Functional Configuration Audit (FCA) Criteria

Technical Baseline Documentation/Digital Artifacts (Functional and/or Allocated) Verification
  • Documented achievement of functional and/or allocated baseline requirements through the appropriate documented verification method (analysis, demonstration, examination, testing, or any combination thereof) is reviewed and verified. (Note: verification testing may include developmental and operational testing, e.g., Early Operational Assessments, Operational Assessments, and/or live-fire testing.)
  • Assessment that the documented product baseline for the initial production system has an acceptable risk of operational test failure during operational test and evaluation (OT&E)
  • Reliability and maintainability (R&M) performance meets the contractual specification requirements
  • Capability Development Document (CDD) R&M requirements are likely to be met

Risk Assessment
  • Identified and documented risks (including human systems integration (HSI), cybersecurity, and environment, safety, and occupational health (ESOH)) have been accepted at the appropriate management level before initial production for the system to be used in OT&E

Technical Plans
  • Detailed plan and schedule have been established and sufficiently resourced to continue development

Outputs and Products

The Technical Review Chair determines when the review is complete. Once the products have been reviewed and approved in SVR/FCA, they provide a sound technical basis for proceeding into initial production for the system to be used in OT&E.