Verification

About

The Verification process provides the evidence that the system or system element performs its intended functions and meets all performance requirements listed in the system performance specification and functional and allocated baselines. Verification answers the question, "Did you build the system correctly?" Verification is a key risk-reduction activity in the implementation and integration of a system and enables the program to catch defects in system elements before integration at the next level, thereby preventing costly troubleshooting and rework.

Role of the PM and SE

The Program Manager (PM) and Systems Engineer, in coordination with the Chief Developmental Tester, manage verification activities and methods as defined in the functional and allocated baselines and review the results of verification. Guidance for managing and coordinating integrated testing activities can be found in DAG CH 8–3.3. and in DoDI 5000.02, Enc 5, sec. 11.a.

Activities

Verification begins during Requirements Analysis, when top-level stakeholder performance requirements are decomposed and eventually allocated to system elements in the initial system performance specification and interface control specifications. During this process, the program determines how and when each requirement should be verified, the tasks required to do so, and the necessary resources (e.g., test equipment, range time, personnel). The resulting verification matrix and supporting documentation become part of the functional and allocated baselines.

Verification may be accomplished by any combination of the following methods:

  • Demonstration. Demonstration is the performance of operations at the system or system element level where visual observations are the primary means of verification. Demonstration is used when quantitative assurance is not required for the verification of the requirements.
  • Examination. Examination is the visual inspection of equipment and the evaluation of drawings and other pertinent design data and processes. It is used to verify conformance with physical characteristics, materials, part and product marking, and workmanship.
  • Analysis. Analysis is the use of recognized analytic techniques (including computer models) to interpret or explain the behavior/performance of the system element. Analysis of test data or review and analysis of design data should be used as appropriate to verify requirements.
  • Test. Test is an activity designed to provide data on functional features and equipment operation under fully controlled and traceable conditions. The data are subsequently used to evaluate quantitative characteristics.

Designs are verified at all levels of the physical architecture through a cost-effective combination of these methods, all of which can be aided by modeling and simulation.
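
To make the verification matrix concrete, the sketch below records a single requirement together with its verification methods drawn from the list above, its success criteria, and the resources needed. This is a minimal illustration in Python with hypothetical field names and requirement identifiers, not a prescribed DoD or program format.

    from dataclasses import dataclass, field
    from enum import Enum

    class Method(Enum):
        """The four verification methods described above."""
        DEMONSTRATION = "demonstration"
        EXAMINATION = "examination"
        ANALYSIS = "analysis"
        TEST = "test"

    @dataclass
    class VerificationMatrixRow:
        """One requirement's entry in a verification matrix (illustrative fields only)."""
        requirement_id: str      # e.g., a paragraph of the system performance specification
        requirement_text: str
        methods: list[Method]    # any combination of the four methods may apply
        success_criteria: str    # evidence needed to close out the requirement
        resources: list[str] = field(default_factory=list)  # test equipment, range time, personnel

    # Example: a range requirement verified by a combination of analysis and test.
    row = VerificationMatrixRow(
        requirement_id="SPS-3.2.1",
        requirement_text="The vehicle shall achieve a range of at least 300 km at full payload.",
        methods=[Method.ANALYSIS, Method.TEST],
        success_criteria="Measured range of 300 km or more on three instrumented runs",
        resources=["instrumented test range", "fuel metering equipment", "test crew"],
    )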

Verification activities and results are documented among the artifacts for Functional Configuration Audits (FCA) and the System Verification Review (SVR) (see DAG CH 3–3.3.6. System Verification Review/Functional Configuration Audit). When possible, verification should stress the system or system elements under realistic conditions representative of intended use.

The individual system elements provided by the Implementation process are verified through developmental test and evaluation (DT&E), acceptance testing or qualification testing. During the Integration process, the successively higher-level system elements may be verified before they move on to the next level of integration. Verification of the system as a whole occurs when integration is complete. As design changes occur, each change should be assessed for potential impact to the qualified baseline. This may include a need to repeat portions of verification in order to mitigate risk of performance degradation.
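
One way to picture the change-impact step above is to keep traceability from each verification task to the system elements it covers, then select previously passed tasks for repetition when a design change touches those elements. The Python sketch below is a hedged illustration with hypothetical task identifiers, not a mandated process:

    # Hypothetical traceability: each verification task covers certain system elements.
    verification_tasks = {
        "VT-101": {"elements": {"power supply"}, "status": "passed"},
        "VT-102": {"elements": {"power supply", "controller"}, "status": "passed"},
        "VT-103": {"elements": {"airframe"}, "status": "passed"},
    }

    def tasks_to_repeat(changed_elements: set[str]) -> list[str]:
        """Select passed verification tasks whose covered elements were changed."""
        return [
            task_id
            for task_id, task in verification_tasks.items()
            if task["status"] == "passed" and task["elements"] & changed_elements
        ]

    # A design change to the power supply triggers re-verification of two tasks.
    print(tasks_to_repeat({"power supply"}))  # ['VT-101', 'VT-102']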

Outputs

The output of the Verification process is a verified production-representative article with documentation to support Initial Operational Test and Evaluation (IOT&E). The SVR provides a determination of the extent to which the system meets the system performance specification.


Resources

Key Terms

Verification

Source: DAU Glossary

DAU Communities of Practice

DAU Test and Evaluation Community of Practice

Products and Tasks

AWQI 7-1-1: Design and implement a testing process to compare actual performance with required system/item performance
  1. Identify system technical and operational requirements.
  2. Identify modeling and simulation system verification documents as applicable.
  3. Identify developmental test and evaluation documents as applicable.
  4. Identify operational test and evaluation documents as applicable.
  5. Review existing test and modeling and simulation data, analyses, and reports for program applicability.
  6. Design and implement a testing process to compare actual performance with required system/item performance.
AWQI 7-2-1: Verify system compliance with defined physical architecture
  1. Identify system build-to specifications.
  2. Examine production representative article and compare with build-to specifications.
  3. Assess system compliance with defined physical architecture.
  4. Identify any deficiencies between physical architecture and build-to specifications.
  5. Correct deficiencies and document compliance with defined physical architecture.

Source: AWQI eWorkbook
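
As a loose illustration of task 6 in AWQI 7-1-1 above, the comparison step reduces to checking measured performance against required values. The parameter names and thresholds below are hypothetical, and real DT&E comparisons involve tolerances, test conditions, and statistical treatment that this sketch omits:

    # Hypothetical required vs. measured performance values (higher is better here).
    required = {"range_km": 300.0, "top_speed_kmh": 90.0}
    measured = {"range_km": 312.5, "top_speed_kmh": 87.2}

    def compare_performance(required: dict[str, float],
                            measured: dict[str, float]) -> dict[str, bool]:
        """Mark each parameter as meeting (True) or missing (False) its required value."""
        return {name: measured[name] >= threshold for name, threshold in required.items()}

    print(compare_performance(required, measured))
    # {'range_km': True, 'top_speed_kmh': False}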
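
In the same spirit, the compliance assessment in AWQI 7-2-1 can be pictured as an attribute-by-attribute comparison of the production-representative article against its build-to specification. The attributes below are hypothetical, and real specifications carry tolerances rather than the exact-match test used here to keep the sketch short:

    # Hypothetical build-to specification vs. as-built measurements.
    build_to = {"mass_kg": 1200, "material": "Al 7075", "fastener_count": 48}
    as_built = {"mass_kg": 1215, "material": "Al 7075", "fastener_count": 48}

    # Deficiencies are attributes where the article departs from the specification.
    deficiencies = {
        attr: {"specified": spec, "as_built": as_built.get(attr)}
        for attr, spec in build_to.items()
        if as_built.get(attr) != spec
    }
    print(deficiencies)  # {'mass_kg': {'specified': 1200, 'as_built': 1215}}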