ARJ 71
Vol. 21, No. 4
Issue 71: January 2014
The Defense Acquisition Research Journal (ARJ) is a scholarly peer-reviewed journal published by the Defense Acquisition University (DAU). All submissions receive a blind review to ensure impartial evaluation. Articles represent the views of the authors and do not necessarily reflect the opinion of the DAU or the Department of Defense.
4 Articles in This Journal
Compressing Test and Evaluation by Using Flow Data for Scalable Network Traffic Analysis
The specialized nature of technology-based programs creates unprecedented volumes of data, complicating the test and evaluation phase of acquisition. This article provides a practical solution for reducing network traffic analysis data while expediting test and evaluation. From small lab tests to full integration test events, quality of service and other key metrics of military systems and networks are evaluated. Network data captured in standard flow formats enable scalable approaches to producing network traffic analyses. Because flow data represent network traffic compactly, they naturally scale well. Some analyses require deep packet inspection, but many can be calculated or approximated quickly with flow data, including quality-of-service metrics such as completion rate and speed of service.
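As an illustration of how flow records support quick quality-of-service calculations, the minimal sketch below (not drawn from the article) computes a completion rate and a mean speed of service from NetFlow-style records. The field names and the completion criterion (a FIN flag observed) are illustrative assumptions, not the article's definitions.

```python
# A minimal sketch (assumptions, not the article's method) of computing
# quality-of-service metrics from flow records rather than full packet
# captures. Field names follow common NetFlow-style exports; the
# completion criterion (FIN observed) is illustrative only.

from dataclasses import dataclass

@dataclass
class FlowRecord:
    src: str
    dst: str
    bytes: int
    duration_s: float   # flow duration in seconds
    tcp_flags: str      # cumulative flags seen, e.g. "SAF" (SYN, ACK, FIN)

def completion_rate(flows):
    """Fraction of flows that terminated normally (FIN observed)."""
    if not flows:
        return 0.0
    completed = sum(1 for f in flows if "F" in f.tcp_flags)
    return completed / len(flows)

def mean_speed_of_service(flows):
    """Average delivery time (seconds) across completed flows."""
    durations = [f.duration_s for f in flows if "F" in f.tcp_flags]
    return sum(durations) / len(durations) if durations else float("nan")

if __name__ == "__main__":
    flows = [
        FlowRecord("10.0.0.1", "10.0.0.2", 48_000, 1.2, "SAF"),
        FlowRecord("10.0.0.1", "10.0.0.3", 12_000, 0.4, "SAF"),
        FlowRecord("10.0.0.4", "10.0.0.2", 6_000, 3.0, "SA"),  # never completed
    ]
    print(f"Completion rate: {completion_rate(flows):.2f}")
    print(f"Mean speed of service: {mean_speed_of_service(flows):.2f} s")
```

Because each flow record summarizes many packets, metrics like these can be computed over large test events without retaining or inspecting full packet captures.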
The use of statistical methods is critical to providing defensible test data to the Department of Defense Test and Evaluation (T&E) enterprise. This article investigates statistical tolerance intervals in designed experiments for the T&E technical community. Tolerance intervals are scarcely discussed in the extant literature compared with confidence and prediction intervals. These lesser-known intervals can ensure that a stated proportion of the population is captured in the design space, and they can map the regions of the design space where factors can be reliably tested. Further, the article investigates several two-sided approximate tolerance factors estimated by Monte Carlo simulation and compares them to the exact method. Finally, the applicability of tolerance intervals to the defense T&E community is presented using a simple case study.
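To make the Monte Carlo comparison concrete, the sketch below (not drawn from the article) shows one common way to estimate a two-sided normal tolerance factor by simulation: it searches for the multiplier k such that the interval x̄ ± ks captures at least proportion p of the population with confidence γ. The sample size, coverage, and confidence values are illustrative assumptions.

```python
# A minimal sketch of estimating a two-sided normal tolerance factor k by
# Monte Carlo, so that the interval xbar +/- k*s captures at least
# proportion p of the population with confidence gamma. Values of n, p,
# and gamma are illustrative, not taken from the article.

import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def mc_coverage(k, n, p, n_sims=20000):
    """Fraction of simulated size-n samples whose interval xbar +/- k*s
    contains at least proportion p of a standard normal population."""
    rng = np.random.default_rng(0)           # fixed seed keeps the search deterministic
    samples = rng.standard_normal((n_sims, n))
    xbar = samples.mean(axis=1)
    s = samples.std(axis=1, ddof=1)
    content = norm.cdf(xbar + k * s) - norm.cdf(xbar - k * s)
    return np.mean(content >= p)

def mc_tolerance_factor(n, p=0.90, gamma=0.95):
    """Solve for k such that the simulated confidence equals gamma."""
    return brentq(lambda k: mc_coverage(k, n, p) - gamma, 1.0, 10.0)

if __name__ == "__main__":
    k = mc_tolerance_factor(n=20, p=0.90, gamma=0.95)
    print(f"Approximate two-sided tolerance factor for n=20: {k:.3f}")
```

The simulated factor can then be compared against exact or published approximate factors (for example, Howe's method) to judge the quality of the approximation, which is the kind of comparison the article describes.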
A Comparative Analysis of the Value of Technology Readiness Assessments
The U.S. Department of Defense endorsed and later mandated the use of Technology Readiness Assessments (TRAs) and knowledge-based practices in the early 2000s as a tool for managing program acquisition risk. Unfortunately, implementing TRAs can be costly, especially when programs include knowledge-based practices such as prototyping, performance specifications, test plans, and technology maturity plans. What is the economic impact of these TRA practices on the past and present acquisition performance of the U.S. Army, Navy, and Air Force? The conundrum is that no commonly accepted approach exists for determining the economic value of TRAs. This article provides a model for the valuation of TRAs in assessing the risk of technical maturity.
Where Are the People? The Human Viewpoint Approach for Architecting and Acquisition
The U.S. Department of Defense Architecture Framework (DoDAF) provides a standard framework for transforming systems concepts into a consistent set of products containing the elements and relationships required to represent a complex operational system. However, without a human perspective, the current DoDAF does not account for the human performance aspects needed to calculate the human contribution to system effectiveness and cost. The Human Viewpoint gives systems engineers additional tools to integrate human considerations into systems development by facilitating identification and collection of human-focused data. It provides a way to include Human Systems Integration (HSI) constructs into mainstream acquisition and systems engineering processes by promoting early, frequent coordination of analysis efforts by both the systems engineering and HSI communities.