Clarifying the Language of Acquisition Innovation (2021-06-21)<div class="ExternalClassD2A7E60EDC5A47239F8E9DDF5835D703">As the United States shifts its focus back to great power competition, ambitious and revanchist peer adversaries position themselves to challenge U.S. dominance in nearly every domain. Maintaining our technological edge requires a faster, more agile, and more innovative force. In response, new organizations such as the Defense Innovation Unit (DIU), AFWERX, SOFWERX, and many others have stood up. However, these small organizations are not designed to replace the roles of every traditional product center and program office. We need to capitalize on benefits that come with certain economies of scale by educating, training, and equipping the Department of Defense (DoD) acquisition workforce of more than 170,000 professionals. Yet driving a cultural change in such a large and well-established organization is a monumental task.<br> <br> Caught in the undertow of the consistent leadership push to “accelerate change” and innovate, acquisition professionals are re-evaluating how they fit into this new way of doing business. Most perplexed by leadership messaging are the middle managers—those tasked with keeping an organization running, guiding inexperienced junior acquirers, and implementing the myriad federal, departmental, and command-level instructions. A clearly articulated strategy is necessary for a manager to adjust the modus operandi. However, the signal often gets lost in the noise of buzzwords and jargon.<br> <br> Innovation is not a magic dust that is sprinkled over a program to sprout better ideas. Repeated admonitions to “innovate!” leave troublesome ambiguity concerning both what to innovate and how to go about it. 
Recent changes to acquisition rules, including Section 804 authorities and the Middle Tier of Acquisition (MTA), benefit managers by providing more room for tailoring—but how should managers use this new breathing space? Some have embraced the new opportunities, but many find themselves perplexed or, worse, threatened.<br> <br> Furthermore, evocative terms like “innovation” often serve as the catch-all for a group of related concepts such as creativity, speed, agility, and other desiderata. These attributes, while related and sometimes mutually reinforcing, are different concepts. As such, they may require different operational approaches and management cultures. The first step in instituting foundational changes in our organizations is to understand the desired end-state and, therefore, to define our terms. This article takes a foundational step toward broader cultural change in the acquisition workforce by examining the distinction and interactions between innovation, speed, and agility—setting the stage with precise language for proposing organizational changes.<br> <br> <strong>Essential Attributes: Innovation, Speed, and Agility</strong><br> <br> <strong>(1) </strong>Innovation is deceptively difficult to define. The term appears neither within the Defense Acquisition Glossary nor the June 2020 edition of the DoD Glossary. The Department of the Army’s “Innovation Strategy 2017-2021” uses the term 186 times without defining it, though it references the “U.S. Army Operating Concept,” which defines innovation as “The act or process of introducing something new, or creating new uses for existing designs.” We prefer a more succinct definition that captures these ideas: innovation is “useful novelty.” Writing in 1992, U.S. Air Force strategist Col. John Boyd echoed this idea, concluding that novelty is vital for organizations to meet changing conditions. 
Jeff DeGraff, noted author and thinker on innovation, expands on this broad definition by explaining that innovation “enhances something, eliminates something, returns something from our past, and eventually reverses into its opposite.”<br> <br> The value of an innovation has a time component since any novel idea has a lifespan. However, innovation is measured not only by the speed of its introduction; it also has a magnitude component. This combination of time scale and magnitude yields different categories of innovation. For example, an innovation might be a small, predictable improvement to an existing product. Such capability increments are termed “evolutionary change”—for example, a software update for a laptop that causes programs to use less memory.<br> <br> Improvements might also take the form of a significant increase in capability that changes the way users interact with a system. These changes are termed “revolutionary change” and are often transformational and irreversible. For example, consider the transition from spinning disk drives to solid-state memory and the associated impact on the laptop’s speed and size.<br> <br> Historian and U.S. Military Academy professor Clifford J. Rogers adapted the biological concept of “punctuated equilibrium” to demonstrate how evolutionary and revolutionary changes coexist and interact similarly in natural and technological ecosystems. He explains that military history often exhibits long phases of incremental evolutionary development, interrupted by brief periods of numerous, rapid changes that have revolutionary effect.<br> <br> Some readers will mistakenly identify revolutionary change with the ubiquitous term “disruptive innovation,” coined by innovation expert and Harvard professor Clayton M. Christensen in the mid-1990s. 
However, in his book The Innovator’s Dilemma, he explains that both evolutionary and revolutionary change are part of the same spectrum of “sustaining innovation.” Such changes continue to add value to an existing product with an existing customer base.<br> <br> <img alt="" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article05_figure01.jpg" style="width:800px;height:401px;" /><br> In contrast, disruptive innovations are new technologies (or unforeseen combinations of technology), originating with a fringe user base, that eventually displace existing products in an unexpected way. For example, consider that the smartphone has largely replaced the role of the laptop computer for many users, with more than half of all Web traffic now originating from phones. By the time laptop manufacturers realized their loss in market share, it was too late for them to break into the cellphone market. (Ironically, the smartphone only represents a sustaining innovation in the cellphone market, albeit a revolutionary one.)<br> <br> In established acquisition offices staffed predominantly by engineers, managers, and other optimizers, changes are typically small and incremental—that is, evolutionary. Economist Theodore Levitt characterized this mature stage in organizations: They possess large investments in the existing way of doing business and resolve to make only predictable, easily controlled improvements. The downside is that they often fail to fend off more innovative new competitors. This stagnating behavior results in the typical life cycle, shown in Figure 1, which has been found to apply on the scale of products, companies, and even entire economies. Revolutionary change is needed to start the life cycle anew and continue to progress.<br> <br> However, disruptive innovations require an entirely different way of thinking. 
Nuclear weapons and stealth technology were not merely advancements in existing systems; they fundamentally and irreversibly changed the calculus for how military forces are employed. Their adoption required accepting a gargantuan amount of risk. They only succeeded because the DoD recognized an emerging technology’s potential and capitalized on opportunities before our enemies did.<br> <br> Though it is compelling to be a part of the next Manhattan Project or something akin to Lockheed’s Have Blue stealth program, can we simply command these types of revolutions into existence? Notably, since innovation is defined by novelty, it is not possible to tell someone what to innovate, or to predict the emergence of the next big idea. According to Nassim Nicholas Taleb, author of the influential book The Black Swan, this difficulty is due to the “law of iterated expectations.” If one knows what one wants someone to invent, then at some level, one must have already invented it. Commanding a team to innovate is merely equivalent to saying, “I have a problem, and I need a solution.” Empowering? Perhaps—but not instructive.<br> <br> What is most necessary for innovation to occur is a culture that rewards novelty while carefully accepting and managing risk. Middle managers play an exceptionally large role in establishing this culture. Of course, predictability and accountability are desirable to some stakeholders (e.g., Congress). However, the innumerable processes designed to eliminate program risk, though individually prudent, have the cumulative effect of weighing down programs and stifling their ability to innovate. Comfort with novelty will allow disruptive innovations to emerge more frequently; comfort with well-managed risk will allow us to capitalize on them. Managers must strike a careful balance between innovation and accountability. 
The current trajectory of risk-averse evolutionary innovation traps much of the acquisition enterprise in the “mature stage” with only incremental changes to postpone inevitable technological obsolescence.<br> <br> <strong>(2)</strong> Speed is a measure of distance traveled over time. In the product cycle context, it is the rate at which some process (procedural distance) marches along. Speed is increased by reducing the time to perform individual steps, by parallelizing steps, or by eliminating steps, thereby increasing efficiency. These optimizations, often made under the umbrella of Continuous Process Improvement (CPI), are a form of sustaining innovation and can have dramatic effects on speed. However, since novelty (especially surprise) is generally the enemy of efficiency, a balance is required here, too. This notion is best illustrated by the idea of a production labor “learning curve,” as illustrated in Figure 2.<br> <br> New processes take longer to perform until workers gain familiarity, and any procedural changes will cause a regression to some previously overcome level of inefficiency. While speed may increase due to innovation (production capital investment, a new contract type, CPI), speed also increases due to a lack of changes (experience and practice). Both forces contribute to the time required for development and production. Note that in real-world learning data, touch labor is commonly reduced by between 20 and 90 percent as manufacturers produce additional units. Of course, this concept does not apply only to manufacturing. A contracting officer with significant Federal Acquisition Regulation-based expertise may take significantly longer to award their first Other Transaction Authority contract, thus eliminating any supposed speed advantage. 
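The learning-curve effect just described is often modeled with Wright’s classic power law, in which each doubling of cumulative output multiplies touch-labor hours by a fixed learning rate. A minimal sketch (the 80 percent curve and the first-unit hours below are illustrative assumptions, not figures from this article or its Figure 2):

```python
import math

def unit_hours(first_unit_hours, n, learning_rate=0.8):
    """Wright's learning-curve model: labor hours for the n-th unit.

    Each doubling of cumulative output multiplies hours by
    learning_rate (e.g., 0.8 for an "80 percent curve").
    """
    b = math.log2(learning_rate)  # negative exponent for improvement
    return first_unit_hours * n ** b

# Illustrative 80% curve starting from a hypothetical 1,000-hour first unit:
print(round(unit_hours(1000, 1)))  # 1000
print(round(unit_hours(1000, 2)))  # 800  (one doubling)
print(round(unit_hours(1000, 4)))  # 640  (two doublings)
```

A process change mid-production effectively resets `n` toward 1 for the affected tasks, which is why the article notes that innovation must buy enough long-term efficiency to pay back that temporary regression.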
Process innovation must unlock significant long-term efficiencies to justify a temporary (but substantial) increase in production time.<br> <br> For these reasons, efficient processes increase speed when applied to well-defined (and well-planned) tasks. The fundamental objective of any process is to increase consistency and repeatability by turning large tasks into a series of smaller ones and fine-tuning performance.<br> Given this definition, it is interesting that acquirers sometimes blame the acquisition processes for program sluggishness when these processes should increase efficiency. Two reasons account for the apparent disconnect. First, many regulations attempt to optimize for cost or risk, not speed. Shifting the focus to produce systems in less time necessarily means taking greater risks elsewhere. Second, the development of a complex weapon system is not perfectly predictable and, therefore, not entirely subject to optimization. The typical management response to unpredictability and surprise is to “patch” the problem with more process. This increase in process adds novel tasks, creates increased complexity, and hinders speed (increasing exposure to additional surprises).<br> <br> Innovation is necessary for an organization to remain relevant, but managers retain the essential tasks of planning thoroughly and executing quickly. Even the new DoD Instruction 5000.80, governing MTA efforts, requires managers to produce a cost estimate, a budget, and an acquisition strategy. Innovation at an inappropriate place in the program life cycle may negatively impact planning and speed. Simply asking an organization to perform its traditional function faster is not necessarily a call for revolutionary change and certainly not a call for disruptive innovation.<br> <br> <strong>(3) </strong>Agility allows organizations to maintain speed while dealing with novelty such as new requirements, unplanned rework, or process change. 
The Oxford English Dictionary defines agility as the “ability to think and understand quickly,” and also the “ability to move quickly.” Both definitions are appropriate and are unified in Col. Boyd’s famous OODA (Observe, Orient, Decide, Act) loop. Often referred to in the special operations community as “pivot speed,” agility is a measure of the time it takes a team to adapt to an unplanned situation. Agile organizations suffer minimal penalties to their learning curve when an innovation or surprise is introduced. Agility enables innovation (high novelty) and process speed (high efficiency) simultaneously by overcoming the paralysis that occurs when a situation does not correspond with any predetermined process.<br> <br> Unfortunately, agility has (like innovation) become a buzzword. In fact, it has become so fashionable to affix the word “agile” to software development efforts that the Defense Innovation Board (DIB) published a guide with the provocative title Detecting Agile BS. Nominally agile software development may follow one of the myriad prescribed frameworks (Lean, Scrum, Kanban), all attempting to add clarity to requirements and accelerate user feedback. However, agility is not a process or framework; it is a mindset.<br> <br> <img alt="" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article05_figure02.jpg" style="width:800px;height:401px;" /><br> <br> Perhaps the best description of this mindset emerged as the Agile Manifesto—the result of a team of software development gurus attempting to identify the concepts at the core of the disparate agility frameworks. The four “values” of the manifesto are referenced and expanded in the DIB’s Detecting Agile BS guide. 
We believe they are a good starting point and apply to any program seeking to increase agility: <ul> <li>Individuals and interactions over processes and tools</li> <li>Working software [alternatively, products] over comprehensive documentation</li> <li>Customer collaboration over contract negotiation</li> <li>Responding to change over following a plan</li> </ul> The common thread through all four values is fluid, real-time interaction with customers and contractors, coupled with a laser focus on the end product. These are crucial guideposts to keep programs on track as planning departs from reality. Planning and its products are valuable to a point but may become overly constraining.<br> <br> Separating this agile mindset from agile strategies is an important distinction because, while those strategies may not scale to large monolithic programs, the mindset can. Even a 10-year shipbuilding effort with a waterfall program structure can be robust in the face of innovation and surprise. Notably, this mindset involves an increased tolerance for risk. However, risks taken to create an agile organization represent a far better bet than wagering that a well-engineered process will never break down. To have any hope of capitalizing on innovation or carrying out large programs in an increasingly complex world, agility must emerge as a core competency of acquisition teams.<br> <br> <strong>Conclusion</strong><br> The phrasing of USAF Gen. Brown’s “Accelerate Change or Lose” directive is clever. For the reasons discussed above, achieving useful change quickly—especially if the change is highly novel—is perhaps the most challenging organizational feat to achieve. Fortunately, experience in business suggests it is attainable.<br> <br> Even so, it is important to remember that none of these attributes is a replacement for an overall strategy. 
A more solid understanding of the relationships between innovation, speed, and agility is the key that enables managers to achieve tactical and operational objectives within the time required to make an overall strategy effective. Even a cursory consideration of these attributes reveals that accepting some risk is necessary, however uncomfortable, though such risk needs to be balanced appropriately with accountability and efficiency.<br> As the threats against the United States proliferate, it becomes ever clearer that taking bold risks is justified. Speed, innovation, and agility are not silver bullets that will lead to continued U.S. dominance. But as the complexity and pace of the world have increased, all three are now essential. In our next article, we will use these definitions to provide actionable tips for infusing a culture of innovation, speed, and agility into real-world programs.<br> <br> Opinions, conclusions, and recommendations expressed or implied in this article are solely those of the authors and do not necessarily represent the views of the Air University, the U.S. Air Force, the DoD, or any other U.S. Government agency. <hr />DeNeve is an Air University Fellow, teaching in the Department of Joint Warfighting at the U.S. Air Force (USAF) Air Command and Staff College (ACSC). He has master’s degrees in Systems Engineering and Military Operational Art and Science, as well as experience with various USAF Special Operations Forces, Intelligence, Surveillance and Reconnaissance, and Nuclear programs.<br> <br> Price is a former Silicon Valley executive, holding a B.A. in political science from the University of California, Los Angeles, and a Ph.D. in military history from the University of North Texas. At ACSC, he teaches courses on War Theory, Joint Warfighting (JPME I), Airpower, and courses within the Joint All-Domain Strategist (JADS) concentration. 
Under a contract with Naval Institute Press, he is completing a book on USAF modernization in the post-Vietnam era.<br> <br> The authors can be contacted at <a class="ak-cke-href" href=""></a> and <a class="ak-cke-href" href=""></a>.</div>
Contract Award Protest Rulings - Highlights From the GAO Report for FY 2020 (2021-06-14)<div class="ExternalClass7E4EA58FBF6C411E90B50F1508A3B2D2">Every year, the Government Accountability Office (GAO) reports to Congress on its most prevalent grounds for sustaining protests along with the identity of any agency failing to follow its recommendations. The GAO reported this year that all of its recommendations were followed and that it issued all protest decisions within 100 days from protest submittal.<br> <br> The GAO “sustains” a protest when it recommends that the protestor receive some form of relief. The GAO’s report for Fiscal Year (FY) 2020 revealed that the rate at which the government sustained protests decreased by 2 percent from 2019.<br> <br> <em>Note: The GAO rulings for FY 2019 were reviewed in the May-June 2020 issue of Defense Acquisition magazine. The FY 2018 rulings were reviewed in the July-August 2019 issue. Those for FY 2017 were examined in the May-June 2018 issue of Defense Acquisition’s predecessor, the Defense AT&L magazine; and rulings for FY 2016 were reviewed in the January-February 2018 issue of Defense AT&L.</em> <hr />GAO’s most prevalent grounds for sustaining protests in FY 2020 were: <ol> <li>Unreasonable technical evaluation</li> <li>Flawed solicitation</li> <li>Unreasonable cost or price evaluation</li> <li>Unreasonable past performance<br> (Note: Items 1 and 3 above are the same prevalent grounds as those in the FY 2019 report.)</li> </ol> GAO provided examples for each of its most prevalent grounds in the following cases. 
<h3><strong>Unreasonable Technical Evaluation</strong></h3> GAO cited Leidos Innovations Corp., B-417568.3 and B-417568.4, as its example case of an agency performing an unreasonable technical evaluation.<br> <br> In Leidos, the agency issued a Fair Opportunity Proposal Request (FOPR) seeking task order proposals to provide logistics support for a rotary wing aircraft. The agency’s solicitation provided the period of performance, the primary performance location, and directed offerors to submit separate volumes for their technical and cost/price proposals. The technical proposal was to be an oral presentation with PowerPoint slides in which each offeror was given 2½ hours to present their technical approach, and then was given a break followed by a question-and-answer (Q&A) session. Additionally, the solicitation allowed interchanges between the agency and the offeror through interchange notices (INs). The solicitation clearly provided that an offeror’s response to an IN would be considered in making the source selection decision.<br> <br> The agency videotaped all of the presentations and Q&A sessions from the six offerors who submitted proposals. In evaluating the proposals, the technical evaluation team used the video recording, the in-person oral presentation, responses to government questions during oral presentations, clarifications submitted within 24 hours of the oral presentations, and the PowerPoint provided with the proposal submissions. The non-cost/price factors were considered of equal importance and, together, more important than cost/price.<br> The GAO recommendation focused on two offerors: the lowest-priced offeror and the protestor, which was the second lowest-priced offeror. The protestor initially was selected for award after the first technical evaluation because the technical evaluation team assessed weaknesses against the lowest-priced offeror. 
These weaknesses were said to exist in its Original Equipment Manufacturer (OEM) reachback capability to bring broad external assets into play and in meeting the Performance Work Statement’s requirements for accessing and transferring its maintenance management information system. It was later discovered that the lowest-priced offeror’s OEM reachback capability was impacted by pending litigation between the offeror and the OEM. In contrast, the technical evaluation team assigned multiple strengths to the protestor, primarily because its repair facility provided cost and schedule savings to the government.<br> The agency discussed the strengths and weaknesses with the offerors in the Q&A sessions. The agency then amended the FOPR to clarify the performance period and the date to submit pricing proposals. The initial contracting officer determined that the protestor’s proposal justified the $181 million price premium and that awarding to the lowest-priced proposal would increase the risk of unsuccessful contract performance. The lowest-priced offeror responded by protesting to GAO (which had jurisdiction since the matter concerned a task order exceeding $25 million). However, GAO dismissed the case when the agency took corrective action, which included the agency re-evaluating technical proposals and leaving open the possibility of additional interchanges. The contracting officer amended the FOPR to require verifiable assurance of OEM support (amendment 5) and to disallow revisions to the previously submitted price volume (amendment 6). The lowest-priced offeror responded to the amendments by filing another protest to the GAO asserting in part that its settlement with the OEM should have disposed of any concerns with its OEM reachback capability. 
The GAO dismissed this second protest when the agency elected to take corrective action.<br> <br> The agency appointed a new contracting officer who in turn appointed a new technical evaluation team (re-evaluation team) and amended the FOPR to discard amendments 5 and 6. The proposals were re-evaluated along with the videotapes of the oral presentations and copies of the PowerPoint slides. The re-evaluation team assigned new ratings, which resulted in the protestor remaining the highest technically ranked. The contracting officer then determined that the technical superiority did not justify paying a price premium and that the award should instead go to the lowest-priced offeror (later the awardee). The decision to award to the lowest-priced offeror, which was a switch from the initial determination to award to the second lowest-priced offeror, was documented in the contracting officer’s fair opportunity decision document (FODD). The FODD also noted that the newly assigned technical evaluators reviewed the Q&A and determined that it lacked significance regarding the re-evaluation.<br> <br> The protestor, who was the second lowest-priced offeror and initially selected for award by the first contracting officer, subsequently filed this protest to challenge the agency’s failure to consider information that the protestor provided during its Q&A session. The agency confirmed that it did not consider the information from any of the Q&A sessions, that it did not review a few of the protestor’s revised slides that were submitted after the oral presentation, and that it did not provide the re-evaluation team with information obtained from INs with the awardee. The agency believed that the interchange information obtained after the protestor initially won the contract award would “taint” the re-evaluation. 
The agency boldly asserted that any errors in the re-evaluation did not prejudice the protestor.<br> <br> The agency’s claim of no prejudice to the protestor even in light of possible errors in the re-evaluation prompted GAO to request the record—specifically, the recorded Q&A sessions for the awardee and protestor. The agency was unable to provide the Q&A recordings because they were deleted by what the agency described as an “unintentional technical oversight.” The agency also submitted a correction of the record to GAO to reflect the fact that the video of the Q&A sessions was not reviewed; rather, the Q&A summary within the initial contracting officer’s FODD was reviewed, with the current contracting officer then concluding that the Q&As were irrelevant. GAO elected to conduct a hearing in this case, held as a transcribed telephone conference as a COVID-19 precaution.<br> <br> The protestor supplemented its filing, stating that the agency’s failures to consider the Q&A and subsequent INs with the awardee were unreasonable and contrary to the terms of the solicitation, specifically to the provision that “offeror responses to INs would be considered in making the selection decision.” The GAO agreed with the protestor, concluding that the agency failed to comply with the solicitation provisions by not considering the information from the Q&A sessions and the INs.<br> <br> The GAO stated that it will generally leave evaluation of proposals in a fair opportunity process to the agency’s discretion but will question an evaluation that is not reasonable or consistent with the solicitation. In this case, the GAO found that it was unreasonable for the agency to exclude certain portions of some of the proposals and to exclude responses to agency questions when the solicitation stated that the responses would be considered. It was also found unreasonable to conclude that a prior issue was resolved on the basis of excluding known information from consideration. 
Despite the brief summary provided by the initial contracting officer, the erasure of the videotapes of the Q&A sessions prevented establishing the scope or substance of those discussions. GAO pointed out that competitive prejudice is an essential element of a viable protest and that it was present here because the flawed evaluation process prejudicially resulted in the protestor losing its initial award.<br> <br> <img alt="Agencies are required to evaluate competitive proposals and assess their relative qualities solely on the factors and subfactors specified in the solicitation." src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article06_image01.jpg" style="width:800px;height:376px;" /> <h3><strong>Flawed Solicitation </strong></h3> GAO cited Blue Origin Florida, LLC, B-417839 as its example case of a flawed solicitation because the solicitation’s proposed evaluation methodology was not compliant with the Federal Acquisition Regulation (FAR). In this case, the Air Force (agency) solicited for commercial item launch services for Phase 2 of its National Security Space Launch Service Procurement. The Phase 2 solicitation was a Request for Proposal (RFP) for a negotiated commercial item procurement (FAR parts 12 and 15). The RFP contemplated award of two competitive, fixed-price requirements contracts for launch services. The evaluation factors were provided in the RFP, but the agency decided to conduct a trade-off based on pairing of proposals in lieu of ranking them and conducting individual trade-off determinations. As a result, a trade-off was not conducted based on the individual merits of each proposal. Instead, the agency planned to compare pairings of the four separately developed and submitted proposals it anticipated receiving. 
The GAO ultimately sustained the protest because the basis for award failed to provide an intelligible basis upon which offerors were expected to compete.<br> <br> In response to the protest, the contracting officer explained that the approach in the RFP was designed to allow some leeway for the Source Selection Authority to determine which pair of proposals, when combined, offered the best value. In the example provided by the contracting officer to GAO, it was clear that two individually ranked proposals may not both be selected if they did not have what the agency termed “complementary attributes,” even though they were the highest ranked when individually assessed against the RFP evaluation criteria. The “complementary attribute” consideration might result in an overall technically weaker offeror winning an award in the event that the weaker offeror did not share in a common weakness with the highest technically rated offeror. For example, if both of the two highest-rated offerors shared a common weakness, only the individually highest ranked would be selected, along with a third, less highly rated offeror that did not share the common weakness. After determining the pair, the Source Selection Authority would determine which of the two offered the best individual value. The offeror that was found to provide the overall best value would be awarded the first requirements contract, which equated to 60 percent of the requirement, and the other would be awarded the second requirements contract, which equated to the remaining 40 percent of the requirement.<br> The GAO stated that the problem with the approach evident in this scenario is that the second-ranked offeror’s proposal would not actually be assessed for award based on the merits of its own proposal against the stated criteria in the RFP or the relative merit of its proposal against the other individual proposals. 
Instead, the agency would evaluate the second- and third-ranked offerors by using a new undefined evaluation criterion as to whether they complement the highest individually ranked offeror. GAO pointed out that offerors have no control over the strategy or contents of another offeror’s proposal and would be unable to intelligently compete short of colluding with another offeror to coordinate their proposals.<br> <br> The GAO concluded that this “when combined” best-value approach failed to provide an intelligible and common basis for award since the source selection decision was not based on the evaluation of individual proposals against the RFP’s specified evaluation criteria but on an evaluation of pairings of proposals using undefined criteria. The GAO recommendation reiterated that the objective of source selection is to select the proposal representing the best value and that the FAR requires the solicitation to clearly provide all evaluation factors and significant subfactors affecting the contract award and their relative importance when using the trade-off process. Only then can an offeror know what factors are the most important. Agencies are required to evaluate competitive proposals and assess their relative qualities solely on the factors and subfactors specified in the solicitation. <h3><strong>Unreasonable Cost or Price Evaluation</strong></h3> GAO cited Sayres & Associates Corp., B-418374, as its example of an agency’s unreasonable evaluation of a cost or price. Under an indefinite-delivery, indefinite-quantity contract, the Department of the Navy issued an RFP. The Navy anticipated issuing a single cost-plus-fixed-fee task order with a one-year base period and four one-year options. 
The awardee would provide engineering services to support technical integration, management, testing, and evaluation.<br> <br> The task order was to be issued on a best-value trade-off basis, considering the factors of technical and management, past performance, and total evaluated cost (TEC). The technical factors, when combined, were significantly more important than the TEC. The agency was to evaluate the reasonableness, realism, and completeness of the cost data provided. The Navy received proposals from two offerors. The losing bidder protested after a debriefing and notification that it was not the awardee.<br> <br> The RFP provided that there would be a realism analysis of each offeror’s proposed costs. Based on this realism analysis, proposed costs would be adjusted to derive the most probable cost to the government of performing this task order. Offerors were encouraged to propose a reasonable and realistic escalation factor for wages, consistent with company practices and estimated future increases in wages. Offerors were required to explain their rationale or provide historical information to substantiate the proposed escalation rates. If historical rates were not provided, the Navy would use current market data to evaluate the proposed escalation rates.<br> <br> The Cost Evaluation Team reviewed the protester’s cost proposal and made several upward adjustments to the proposed escalation rates for the base and option years. The cost realism analysis raised the protester’s costs by 5.34 percent (more than $2.1 million), yielding a TEC of $42,785,798. This moved its proposal from the least expensive to the most expensive of the two.<br> <br> The protester argued that the Navy unreasonably rejected detailed historical data, including the salaries of the protester’s staff and their respective salary increases for the preceding five years.
In response, the Navy stated that the “source of the numbers” that the protester provided was “unverified.” The Navy claimed that “[protester] could have provided screenshots of the salaries from year to year to verify” the salary data.<br> <br> In reaching its decision, the GAO stated that a cost realism evaluation need not achieve scientific certainty. However, the methodology must be reasonably adequate. It must provide confidence that the rates proposed are reasonable and realistic in view of other cost information available to the agency at the time of its evaluation. The GAO concluded that five years of detailed historical data, combined with accompanying supporting explanations, were sufficient to meet that standard.<br> The GAO sustained the protest, stating that it was troubled by the ease with which the Cost Evaluation Team rejected the protester’s data and supporting information. The Cost Evaluation Team records disclosed that the team’s “analysis” consisted of only one sentence that summarily dismissed the protester’s submitted data and information. The contemporaneous record did not provide the GAO with a basis to find that the agency’s rejection of the proposed escalation was reasonable. <h3><strong>Unreasonable Past Performance</strong></h3> GAO cited Addx Corp., B-417804 et al., as its example of an unreasonable agency evaluation of a protester’s past performance. The Department of the Army issued a Task Order Request for Proposal (TORFP) for professional and technical system support services for the Army’s Special Operations Mission Planning and Execution (SOMPE) program. The protester filed a protest with the Army (agency) and then the GAO after being notified that it was eliminated from the competition.
The protester challenged almost every aspect of the procurement’s conduct, and the GAO dismissed all but two of the challenges.<br> <br> The first challenge sustained by the GAO concerned the Army’s failure to perform and document any analysis that considered the protester’s lower proposed cost in its decision to eliminate the protester’s proposal from the competition. The GAO reviewed the Army’s documentation and concluded that the Army did not meaningfully consider cost prior to eliminating the protester’s proposal.<br> The second challenge involved the evaluation of the protester’s past performance. The protester argued that the agency unreasonably failed to consider the content of its proposal and instead applied unstated evaluation criteria when evaluating past performance. The agency argued that its evaluation of the protester’s past performance was reasonable because the protester “did not have any explicit past performance information related to the position of a Mission Planning Software Engineer or any detailed description as to how it would address and overcome this lack of explicit past performance either directly or through one of its subcontractors.”<br> <br> The TORFP required that offerors provide a past performance questionnaire (PPQ) to both the contracting officer and the contracting officer’s representative for each contract identified in the technical proposal volume, and that the respondents provide the completed PPQs directly to the agency. In its proposal, the protester identified seven contracts performed in the last five years by it and its proposed team members to “demonstrate highly relevant experience of similar scope and type” to the TORFP requirements.
For each contract identified, the protester also provided a narrative explanation of the contract’s relevance to specific sections of the performance work statement related to the TORFP.<br> <br> The GAO reviewed the facts of the case and stated that, in evaluating an offeror’s past performance, an agency must be reasonable and consistent with the stated evaluation criteria. Determining the relative merit of an offeror’s past performance is primarily a matter within the agency’s discretion, and GAO said that it will not substitute its judgment for that of the agency.<br> <br> The GAO concluded that the Army’s evaluation of the protester’s past performance was unreasonable. Specifically, the GAO found that the Army applied an unstated evaluation criterion to the protester’s proposal when it limited past performance to only SOMPE efforts. This limitation was inconsistent with the more expansive performance work statement in the TORFP because the protester’s past experience was similar in scope and type to that contemplated by the TORFP. As a result, since the Army assigned an adjectival rating of marginal and a moderate risk rating to the protester’s past performance factor, the GAO found that the ratings lacked a reasonable basis. The protest was sustained. <h3><strong>Conclusion</strong></h3> Year to year, we are faced with consistent themes on how proposal evaluations should be conducted. It is sometimes difficult to see where our evaluations may not comply with GAO expectations. The cases reviewed above are just a few scenarios that may be useful to compare with our current procurements. By avoiding the pitfalls presented by these examples, we can support a procurement environment devoid of unfair practices and reduce the number of protests sustained by GAO. <hr />Wallace is a professor of Contract Management at DAU at Fort Belvoir, Virginia. She is a U.S. Air Force veteran and a former litigation attorney and contract specialist.
She holds a law degree from the University of North Dakota.<br> <br> Rodgers is a professor of Contract Management at DAU. He is a retired U.S. Air Force attorney and holds law degrees from the University of Cincinnati and George Washington University.<br> <br> The authors can be contacted at <a class="ak-cke-href" href=""></a> and <a class="ak-cke-href" href=""></a>.<br> <img alt="Two hands shaking" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article06_image02.jpg" style="width:800px;height:360px;" /></div>
Compliance, Continuity, and COVID-19 (published June 7, 2021)<div class="ExternalClassC33A7FF277384480829E8D90D5568104">COVID-19 is an ongoing pandemic of the disease caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). First identified in December 2019 in Wuhan, China, the outbreak was declared a Public Health Emergency of International Concern in January 2020 and a pandemic in March 2020. As of March 1, 2021, more than 114 million cases worldwide had been confirmed, with more than 2.5 million deaths attributed to COVID-19.<br> <br> Whether an organization was well prepared for a pandemic or had no contingency plan in place, business disruption and disaster followed. Organizational leaders saw firsthand the fragility of business systems, operations, and revenue streams. They witnessed the critical importance of risk awareness and preparedness, as well as the need for robust “Continuity Management” programs.<br> <br> Today, Department of Defense (DoD) program managers (PMs) and contractors face a dual challenge: (1) How do they ensure continuity in their programs before, during, and after a major disruption such as a pandemic, and (2) how do they do it remotely? If it helps to create a sense of urgency for PMs, think of COVID-19 as a bioweapon.<br> <br> This article discusses the importance of Continuity Management in both the public and the private sectors. Like the cyberattack threat, the pandemic threat will forever be real; and DoD’s response to it must be just as real.<br> <br> Compliance with the applicable DoD contract is (or should be) mandatory for contract award and execution.
However, compliance alone doesn’t guarantee the ability of an organization to successfully respond to and survive a major disruptive incident—especially one with the scope, duration, and severity of the COVID-19 pandemic.<br> <br> Compliance with contract specifications can (to some extent) provide the PM with a good feeling about the contractor’s past, and a good snapshot of the contractor today—but not nearly enough about tomorrow. That kind of good feeling comes only with a favorable assessment of the contractor’s ability to handle “tomorrow.” In short, compliance requires continuity. Otherwise, there may not be a “tomorrow” for the program—no “new normal,” or any other kind for that matter.<br> <br> Figure 1 describes the sequence from “normal” to “new normal” and the importance of continuity planning in recovery and restoration.<br> <br> I have written articles for DAU on the following distinct but related subjects over the last several years: <ul> <li>Contingency Planning</li> <li>Adding COVID-19 to a Risk Management Model</li> <li>Collecting and analyzing “Lessons Learned”</li> <li>The importance of second-party auditing</li> <li>Cybersecurity and system integration in DoD</li> <li>Due diligence in DoD contracts and the courage to cancel a failed DoD program</li> <li>Lessons learned from Afghanistan</li> <li>Second-party auditing of DoD contracts</li> <li>The value of tabletop exercises</li> </ul> When published, those articles were meant to help focus PMs on maximizing the effectiveness of both their programs’ missions and their administration of them. Now, in the midst of this pandemic, my hope is that revisiting those articles will help to focus PMs on rebuilding programs damaged by the pandemic and operating them remotely, as the situations require.<br> <br> The following terms will also help to assess continuity for future recoveries and structure the rebuilding required for recovery operations already in progress.<br> <br> <img alt="Figure 1. 
Continuity Planning" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article07_figure01.jpg" style="width:800px;height:257px;" /><br> Continuity management means a management process that covers the identification of situations that may have a highly negative impact on DoD operations—and the implementation of capabilities and competencies to respond properly and protect the interests of DoD and other relevant interested parties.<br> <br> Continual improvement is the basis of modern management. It must be thought of as an ongoing process and not an end state. It requires PMs and contractors to develop the mindset that we can always make something better.<br> <br> Impact analysis (IA) helps identify the possible threats and effects of a disruption or serious situation on operations or activities. Impact analysis helps organizations build resiliency and responsiveness into their operations.<br> <br> A risk management plan can be thought of as the end-user of the impact analysis. An organization’s continuity management plan must have in place a documented risk assessment process in order to identify, analyze, evaluate, and treat risks that may lead to disruptive situations. Risk management (assessment and treatment criteria development) must consider the continuity plan’s objectives and the organization’s definition of acceptable risk.<br> <br> Personnel awareness is an essential part of personnel competence. People who work under an organization’s control must be made aware of the continuity policy, its contents and objectives, what their personal performance contributes, the possible implications of nonconformity, and their roles during disruptive incidents. It’s analogous to knowing the location and use of the closest fire extinguisher, fire alarm box, first aid kit, or eyewash station, only on a grander scale—and remotely.<br> <br> Resources support continuity strategies.
Organizations must define needed continuity resources, such as people, information and data, buildings and facilities, equipment and consumable resources, transportation, suppliers, and partners.<br> <br> <img alt="" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article07_image01.jpg" style="width:800px;height:257px;" /><br> <br> <strong>Continuity Management Plan</strong><br> A continuity management plan is a set of procedures and instructions to guide an organization during and after a disruptive event in order to speed immediate response, recovery, and resumption of minimum operational conditions, and eventual restoration of normal operations. We must now consider the pandemic-driven requirement to assess and manage those functions from a distance.<br> <br> Table 1 compares normal DoD program management compliance functions with continuity management, and then suggests that these functions may be monitored remotely. The requirements for compliance and continuity are essentially the same. It follows, therefore, that continuity in DoD programs must start at the very inception of the contract and remain an integral part of it throughout. Because normal and continuity management functions are identical, there should be no real difficulty shifting to a “restoration” scenario when continuity has been built into compliance from the beginning.<br> <br> <strong><img alt="Figure 2.
Cost-Benefit Analysis" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article07_figure02.jpg" style="width:400px;height:267px;float:left;margin-left:5px;margin-right:5px;" /></strong>The sections that follow expand on some of the vital continuity management practices that already lend themselves to remote monitoring, and have done so for a long time.<br> <br> <strong>Cost-Benefit Analysis</strong><br> In a cost-benefit analysis, the manager or analyst adds up the benefits of a situation or action and then subtracts the costs associated with taking that action. Figure 2 describes the basic cost-benefit analysis process in action.<br> <br> Thorough cost-benefit analyses reflect both objective (direct/easily quantified) and subjective (indirect and not easily quantified) costs and benefits. These can be done remotely and revised as the situation changes. Forward-thinking PMs are already doing cost-benefit analyses remotely. <table border="1" cellpadding="1" cellspacing="1" style="width:800px;"> <caption>Table 1. 
Remotely Monitoring Compliance and Continuity</caption> <thead> <tr> <th scope="col">Management Function</th> <th scope="col">Compliance</th> <th scope="col">Continuity</th> <th scope="col">Monitor Remotely</th> </tr> </thead> <tbody> <tr> <td style="text-align:center;">Top Management Involvement</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Provision of Resources</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Cost-Benefit and Risk Analyses</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Preventive/Corrective Actions Identified</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Internal Audit</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Critical Personnel/Functions Identified</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Recovery Time Objectives Established</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Exercise (Gaming Potential Disruptions)</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td style="text-align:center;">Incident Response Structure</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> <tr> <td 
style="text-align:center;">Continual Improvement</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> <td style="text-align:center;">✔</td> </tr> </tbody> </table> <br> <strong>Process Approach</strong><br> A process approach is a matter of managing a group of processes as a system, where the interrelations between processes are identified and the outputs of a previous process are treated as the inputs of the next one. The process approach helps to ensure that the results of each individual process will add value and contribute to achieving the final desired results. Theoretically, there should be no wasted or unnecessary operations. Process approaches also identify opportunities for potential synergy, innovation, risk identification, and resource reallocation. With a little planning, even the most complicated processes may be analyzed and monitored remotely.<br> <br> <img alt="Figure 3. The Basic Process Approach in Program Management" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article07_figure03.jpg" style="margin-left:5px;margin-right:5px;float:left;width:300px;height:393px;" />Figure 3 describes the basic process approach challenge for DoD PMs.<br> <br> It’s all the same, whether you call it “Plan—Organize—Actuate—Control” as we business majors called it in 1965; “Define—Measure—Analyze—Improve—Control (DMAIC)” if you are into Six Sigma; or “Plan—Do—Check—Act” as everybody else does. Your challenge now is that you may need to manage or audit much (if not all) of the processes from a distance.<br> <br> <strong>Recovery Time Objective </strong><br> The Recovery Time Objective (RTO) is the duration of time and level of service that must be restored in a program after a disaster to avoid unacceptable breakdowns in continuity. Often used with information technology (IT), RTOs can be used to measure the time required to recover data after the disruption. 
RTOs also help to determine how long a business can survive with reduced infrastructure and services.<br> RTOs are often complicated. IT departments can streamline some of the recovery processes by automating them as much as possible, with tripwires and planned responses built into the software. A meaningful RTO involves an organization’s entire infrastructure.<br> <br> Many DoD contractors routinely “certify” to one or more of the International Organization for Standardization (ISO) Management Standards (e.g., ISO 9001:2015, Quality Management Systems). Not only are those organizations subject to periodic audits by accredited certification bodies, but they also have an obligation to audit themselves internally to ensure compliance with the International Standard and maintain their certifications. Adherence to the International Standards also serves as an indirect audit of the contractor against the requirements of the DoD contract.<br> <br> Accordingly, PMs for these “ISO-Certified” DoD contractors have at their disposal the ability to (1) directly audit the contractor, (2) directly assess the contractor’s ability to audit itself, and (3) monitor the status of the contractor’s certifications.<br> <br> More than any other meaningful assessment, audits can be conducted and monitored remotely. Remote audits take less time to schedule and conduct, and they eliminate budget-busting “other direct costs” such as airline tickets, hotel rooms, meals, and rental cars.
Findings, feedback, and corrective actions also may be developed faster, especially in working with decentralized or overseas operations or organizations.<br> <br> Forward-looking certification registrars and management consultants of my acquaintance add value by offering, to new and existing clients, remote audits in which they conduct document reviews via e-mail; hold conferences, interviews, and critiques via Zoom and Skype; and (last but not least) use telephone cameras to remotely observe factory floors, warehouses, and loading docks.<br> <br> <strong><img alt="Figure 4. Developing a Pandemic TACSIT" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article07_figure04.jpg" style="margin-left:5px;margin-right:5px;float:left;width:300px;height:331px;" />Tabletop Exercises and TACSITs</strong><br> In a tabletop exercise, key personnel assigned high-level operational and administrative roles and responsibilities gather to deliberate various simulated emergency or rapid-response situations. Tabletops are used frequently to improve team responses, disaster preparedness, and emergency planning. They also contribute lessons learned to less time-critical challenges, such as stateside program administration. Tabletop exercises can serve as “disruption rehearsals” by simulating actual events for preparation before, progress during, and recovery after the simulated disruption.<br> <br> Tactical situations (TACSITs), scenarios based on real-world conditions, are used to shape and forecast future operations. They give structure, substance, and direction to tabletop exercises. When data or knowledge are insufficient, computer modeling and simulation contribute to the analysis. Figure 4 describes the creation and continuing improvement of TACSITs.<br> <br> An enduring feedback loop increases productivity and potential contribution.
The more remote the operation, the greater the need for real-time feedback.<br> <br> <img alt="Tabletop exercises as pandemic “rehearsals.”" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article07_image02.jpg" style="width:800px;height:220px;" /><br> A pandemic TACSIT can be as useful as a TACSIT developed for more traditional warfighting missions, such as Navy SEAL incursions, cargo routing, or noncombatant evacuation.<br> <br> <strong>Summary</strong><br> The COVID-19 pandemic underscored the need for continuity plans and the decisions that they include—decisions that must be the product of formal, structured, and defensible processes. Otherwise, they will be as meaningless and potentially dangerous as an empty fire extinguisher.<br> <br> Compliance without continuity is meaningless, in even the most benign scenarios. Disruptions and disasters like the COVID-19 pandemic can destroy a DoD program on all sides simultaneously.<br> <br> Continuity means the ability to deliver previously agreed products and services even under extremely negative situations, such as a pandemic. Continuity plans, made up of tripwires and planned responses, guide organizations during and after disruptive events to speed immediate response, recovery, resumption of minimum operating conditions, and restoration of normal operations.<br> <br> Continuity requires the same best management practices that PMs already use in every contract, program, or process—but with a greatly enhanced wariness for the unexpected.
Look again at Table 1.<br> <br> Audits, internal or external, remote or on-site, are the essence of compliance monitoring, feedback, corrective action, and continual improvement.<br> <br> Tabletop exercises and TACSITs can be excellent “disruption rehearsals.” They can simulate actual events for preparation before, progress during, and recovery after a disruptive event.<br> <br> The PM’s challenge today is to manage many (if not all) of the programs from a distance. Most good management practices can be implemented and monitored remotely, which is already being done for decentralized and overseas organizations.<br> <br> If it helps to develop a sense of urgency for continuity planning, think of COVID-19 as a “bioweapon” employed by a formidable adversary intent on world domination. In fact, I very much recommend doing so. <hr />Razzetti, a retired U.S. Navy captain, is a management consultant, auditor, military analyst, and frequent contributor to Defense Acquisition magazine and the former Defense AT&L magazine. He is the author of five management books, including <em><a href="">Fixes that Last—The Executive’s Guide to Fix It or Lose It Management</a></em>.<br> <br> The author can be reached at <a class="ak-cke-href" href=""></a>.</div>
Other Transactions: Do All Industry Partners Benefit Fairly? (published May 31, 2021)<div class="ExternalClassB7C205EA27FC4F758329BDE97D04ED51">Other Transactions (OTs) are hugely popular across the Department of Defense (DoD). They are the bright and shiny contracting vehicle that DoD entities have become more comfortable with using for research and development (R&D) and science and technology efforts. <hr /><img alt="The future competitive advantage of U.S. national defense depends largely on its ability to modernize business practices" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article01_image01.jpg" style="width:800px;height:198px;" /><br> <br> Although OTs are one of many contracting vehicles available to most DoD entities, they are unique in offering tremendous flexibility and commercial-like business terms that attract industry partners of all sizes. OT success depends directly on the Defense Industrial Base (DIB) and the innovative technology solutions that industry partners can bring to the table. Understandably, the DIB must have complete confidence and trust in DoD’s overall procurement system, particularly for OTs. The overarching question is, “Which industry partners are benefiting from OTs?” Specifically, are nontraditional defense contractors (NDCs) and small businesses (SBs) getting opportunities to participate as plentiful as those of traditional defense contractors? This article is the first of a two-part series and analyzes DoD’s prototype OT usage in recent years. A later article will identify steps DoD entities can take to ensure a level playing field.<br> <br> The future competitive advantage of U.S. national defense depends largely on its ability to modernize business practices and maintain a strong DIB.
In response, DoD leadership in 2020 completed multiple relevant and much-needed initiatives to update DoD acquisition policy. The DoD specifically updated policies to help transform and modernize the options by which DoD entities can conduct business to keep pace with technology changes and Warfighter needs. The Adaptive Acquisition Framework (AAF) outlines six acquisition pathways, including Middle Tier of Acquisition and Major Capability Acquisition. OTs are flexible and innovative contracting options that DoD entities can use for most of the pathways to speed a contract solution. Federal law permits DoD entities to use OTs for research, prototyping, and production purposes. Moreover, Congress permanently codified prototype OTs in 10 U.S. Code (USC) 2371b in Fiscal Year (FY) 2016, thereby expanding OT usage for DoD entities in addition to the existing authority for research OTs.<br> <br> The flexibility and innovation necessary to support the AAF exist because OTs are different from traditional procurement contracts and not subject to all laws and regulations. Examples of laws and regulations not applicable include the Competition in Contracting Act, Cost Accounting Standards (CAS), the Federal Acquisition Regulation, and the Defense Federal Acquisition Regulation Supplement. OTs are also unique since agreements can be structured with custom terms and conditions necessary for individual project success between the parties involved. DoD entities have the freedom to strategically negotiate project areas like intellectual property, reporting requirements, payment structures, and property deliverables or government-furnished property since there is not a one-size-fits-all methodology or universal pathway. 
Besides being similar to contract arrangements in the commercial marketplace, OTs are intended to broaden the DIB (both with traditional contractors and NDCs), support dual-use projects, and enable quicker and cost-effective efforts compared to other contracting vehicles.<br> <br> <img alt="Icons rising from a tablet" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article01_image02.jpg" style="width:800px;height:375px;margin-left:5px;margin-right:5px;" /><br> <br> For prototype OTs, DoD entities must meet at least one of four possible conditions for each award. Two of the four conditions encompass participation by NDCs and small businesses. One condition requires significant participation by at least one NDC or nonprofit research institution, and another condition requires that all significant participants be NDCs or SBs. The remaining two conditions do not pertain to NDCs or SBs and require either a cost-sharing arrangement or a senior procurement executive determination for exceptional circumstances.<br> <br> An NDC, as defined in 10 USC 2302(9), is an industry partner that has not performed, for at least one year preceding the DoD entity’s solicitation of sources for the procurement or transaction, any DoD contract or subcontract subject to full CAS coverage. Based on the definition specifics, it is very likely that a large business, even if recognizable from work on prior DoD projects, could be an NDC when the DoD solicits future OT opportunities. Regardless of past performance or prior support on DoD-specific efforts, industry partners could qualify as NDCs so long as no contract supported by them over the prior 12 calendar months was subject to full CAS. Since CAS are not required for OTs, it is possible and even likely for businesses to remain NDCs if they primarily conduct business with the DoD via OTs. 
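The 12-month lookback in the NDC definition above can be sketched as a simple predicate. This is an illustrative sketch only, not a legal test: the record format (a list of dates on which full-CAS-covered work was performed) is a hypothetical assumption, and the statute's one-year window is approximated as 365 days.

```python
from datetime import date, timedelta

def is_nontraditional(cas_covered_work_dates, solicitation_date):
    """Return True if no DoD contract or subcontract subject to full CAS
    coverage was performed in the year preceding the solicitation date.

    cas_covered_work_dates is a hypothetical record format assumed for
    illustration: the dates on which full-CAS-covered work was performed.
    """
    window_start = solicitation_date - timedelta(days=365)  # approximates "one year"
    return not any(window_start <= d < solicitation_date
                   for d in cas_covered_work_dates)
```

Under this sketch, a contractor whose only recent DoD work was performed under OTs (which carry no CAS requirement) passes the test regardless of its size or prior DoD history, which is how a large, well-known firm can still qualify as an NDC.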
SpaceX, for example, was considered an NDC at the time of this article even though it had revenues exceeding $1 billion in 2020 and previously worked with the DoD on various efforts.<br> <br> DoD entities have several pathways to award prototype OTs. For example, entities could directly award OTs to industry partners after independently performing solicitation and proposal evaluation efforts. Besides direct awards, entities could leverage a consortium by awarding an OT to a consortium management firm (CMF).<br> <br> A consortium is a pool of industry partners working together to achieve similar goals focused on a specific technology area, such as space, artificial intelligence, cyber, aviation, or missiles. Most consortia are made up of hundreds of industry partners such as traditional defense contractors, non-profit organizations, academic institutions, research institutions, and SBs (many of which could be categorized as NDCs). Examples of current consortia are the Aviation and Missile Technology Consortium, the Space Enterprise Consortium, and the System of Systems Consortium. Examples of current CMFs are Advanced Technology International, the Consortium Management Group, and the National Security Technology Accelerator.<br> <br> Each consortium has a public website providing varying levels of information on its members and business opportunities. The Space Enterprise Consortium, as of January 2021, identified that 354 of its 459 members were NDCs. While many of these members have historically done business with the DoD, their support on DoD contracts may not have been subject to CAS for the past 12 months. A close evaluation of various consortia reveals that while some consortia identify members as traditional contractors, NDCs, or SBs, many do not provide such individual member details.<br> <br> Each OT pathway has individual pros and cons that DoD project teams should assess before pathway selection.
Regardless of the pathway used for each OT project, entities must ensure that they use competitive procedures to the greatest extent practicable and that their business practices remain fair and transparent. A closer analysis of recent prototype OT data reveals additional information on the industry partners receiving awards, including the opportunities gained by consortia or CMFs and other businesses. <h3><img alt="A hand typing on a tablet" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article01_image03.jpg" style="margin-left:3px;margin-right:3px;float:left;width:281px;height:400px;" />What do the data show about the DoD’s prototype OT usage since FY 2016?</h3> The DoD’s use of prototype OTs has grown significantly since FY 2016. The growth is directly linked to Congress enacting various laws that expanded OT authority, specifically requiring that the DoD use OTs as the preferred vehicles for prototyping efforts.<br> <br> Data from the government’s Federal Procurement Data System (FPDS), according to DoD data collection and analysis, show that DoD obligated only $1.4 billion for prototype OTs in FY 2016 (less than 1 percent of DoD’s FY 2016 contract obligations) compared to $7.4 billion in FY 2019 (about 2 percent of DoD’s FY 2019 contract obligations). Preliminary FY 2020 data indicate that DoD obligated more than $15 billion for prototype OTs in FY 2020, about 4 percent of DoD’s total FY 2020 contract obligations. While the data show a continuous upward trend, that trend is even steeper when prototype OT obligations are compared only to DoD’s overall R&D contract obligations. The trends support the projection that prototype OT growth will continue across the DoD. Furthermore, it is almost certain that DoD use of production OTs will increase as entities successfully complete their prototyping effort(s) and leverage the noncompetitive follow-on option. 
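The growth figures above can be sanity-checked with simple arithmetic. The prototype OT dollars are the article's figures; the total-obligation amounts below are illustrative assumptions back-solved from the stated percentages, not actual FPDS data:

```python
# Illustrative check of the prototype OT shares cited above. The prototype OT
# dollars come from the article; the total DoD contract obligations are
# assumed values chosen to match the stated percentages, not FPDS data.
obligations = {
    # fiscal year: (prototype OT $B, assumed total DoD contract obligations $B)
    2016: (1.4, 300),   # "less than 1 percent"
    2019: (7.4, 370),   # "about 2 percent"
    2020: (15.0, 385),  # "about 4 percent" (preliminary)
}

for fy, (ot, total) in sorted(obligations.items()):
    share = 100 * ot / total  # prototype OT share of total obligations
    print(f"FY{fy}: ${ot:.1f}B of ~${total:.0f}B total = {share:.1f}% of obligations")
```

Even with rough totals, the share roughly quadruples from FY 2016 to FY 2020, which is the trend the article describes.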
<h3>Are NDCs and SBs obtaining widespread opportunities to participate on DoD OTs?</h3> Analysis of prototype OT awards shows that NDCs and SBs received direct OT awards. However, since many details involving awards to consortia or CMFs are not publicly available, it is difficult to ascertain the exact extent to which NDCs and SBs are participating. Figure 1 analyzes prototype OT data from FY 2013 to FY 2020 as percentages of total OT award actions and dollars obligated. For analysis purposes, the data highlight OTs awarded to consortia and the large, recognizable businesses that have traditionally supported the DoD, such as Lockheed Martin, Raytheon Technologies, Northrop Grumman, Boeing, L3Harris, and Aerojet Rocketdyne.<br> <br> The data show that about 36 percent of prototype OTs between FY 2013 and FY 2020 were awarded to consortia and the DoD’s largest recognizable contractors. There is an overall downward trend in award actions, from 81 percent of prototype OT awards going to these groups in FY 2016 to 33 percent in FY 2020. Similarly, there is an overall downward trend in dollars, from 96 percent of the total amount obligated going to these groups in FY 2017 to 50 percent in FY 2020. Even so, about 88 percent of the total amount obligated for prototype OTs between FY 2013 and FY 2020 went to consortia and the DoD’s largest recognizable contractors.<br> <br> <img alt="Figure 1. Prototype OT Analysis Since FY 2013" src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article01_figure01.jpg" style="margin-left:3px;margin-right:3px;width:600px;height:332px;" /><br> <br> Although NDCs and SBs did receive OT awards, their dollar amounts were not nearly as large as those of the OTs awarded to consortia and large businesses. NDCs and SBs, while not directly receiving the bulk of obligations, may have participated to a significant extent through teaming or subcontractor arrangements. 
However, it is not possible to determine this through publicly available data, for two reasons. First, the determination of whether an industry partner meets the NDC definition rests with the Agreements Officer. Limited information on whether a specific industry partner was subject to full CAS within the last 12 months makes it difficult to determine if any industry partner receiving an OT award met the NDC definition when the DoD solicited the opportunity. FPDS data for OT awards do not identify which of the four conditions were met for each prototype OT award, nor do they provide details supporting the DoD entity’s interpretation of industry partners participating to a significant extent (when applicable). As a result, the public cannot see whether a condition involving NDCs or SBs was met for an OT award.<br> <br> Second, as previously noted, most, if not all, of the details on OTs awarded to consortia or CMFs are not publicly available. While information on FPDS identifies OTs awarded to consortia or CMFs, inadequate information is available to the public on the consortium members that participated and actually worked on each OT. CMFs usually complete sub-agreements with the various performers selected to support each OT effort. Unfortunately, none of these agreements are required to be reported in FPDS, and none provide details identifying the extent of NDC or SB participation. It is possible that many of the OTs awarded to consortia or CMFs were directed to NDCs and SBs; however, the data, as reported, do not provide that level of fidelity. Again, the public cannot distinguish which condition was met for OTs awarded to consortia or CMFs, since the limited data available do not provide full transparency or assurance that DoD entities are utilizing OTs to attract NDCs and/or SBs. No assessment of whether industry partners are benefiting equally can be made until more specific and consistent OT data are available. 
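The 12-month test at the heart of the NDC definition can be sketched as a simple predicate. This is an illustrative sketch only: `is_nontraditional` is a hypothetical helper, and in practice the determination rests with the Agreements Officer, using CAS-coverage facts that are not publicly available.

```python
from datetime import date, timedelta

def is_nontraditional(full_cas_performance_dates, solicitation_date):
    """Rough sketch of the 10 USC 2302(9) test: True if the industry partner
    performed no DoD contract or subcontract subject to full CAS coverage
    during the year preceding the solicitation.

    full_cas_performance_dates: dates on which the partner last performed
    under each full-CAS-covered DoD contract (empty list if none).
    """
    window_start = solicitation_date - timedelta(days=365)
    return all(d < window_start for d in full_cas_performance_dates)

# A partner whose only full-CAS work ended two years before the solicitation
# qualifies; one with full-CAS performance inside the window does not.
print(is_nontraditional([date(2019, 3, 1)], date(2021, 6, 1)))   # True
print(is_nontraditional([date(2021, 1, 15)], date(2021, 6, 1)))  # False
```

Note that a partner with extensive OT-only DoD work passes this test (the list of full-CAS dates is empty), which is why even large, well-known businesses can remain NDCs.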
Although the DoD’s OT reporting since FY 2016 has significantly improved, the DoD should further enhance the collection of OT data.<br> <br> <img alt="OTs are crucial to help the United States solidify its competitive advantage both now and in the future." src="/library/defense-atl/DATLFiles/May-June2021/DEFACQ-DATL_MayJune2021_article01_image04.jpg" style="width:800px;height:161px;" /><br> The trend for substantial prototype OT usage growth across the DoD undoubtedly will continue in future fiscal years. OTs are crucial to help the United States solidify its competitive advantage both now and in the future. DoD entities must modernize business practices and leverage the flexible vehicles to maintain an exceptional DIB. DoD entities should also continue to closely evaluate OT award data and take appropriate action to ensure that industry partners, regardless of size or prior DoD business relations, have equal opportunities for OT participation. The DoD will assuredly encounter the greatest possible success with OTs in its pursuit of national defense superiority if it maintains fair and transparent business practices that truly maximize opportunities for all industry partners. <hr /><br> Speciale is a senior acquisition specialist supporting the DoD. He is a Certified Defense Financial Manager–Acquisition and a Certified Fraud Examiner.<br> <br> Downs is a senior acquisition analyst supporting the DoD. He is a retired Army lieutenant colonel and former contracting officer with the U.S. Army Contracting Command.<br> <br> The authors can be contacted at <a class="ak-cke-href" href=""></a> and <a class="ak-cke-href" href=""></a>.</div>
