Procurement Officials Are Leading Federal AI Adoption

Earth view from space at night with lights and connections from cities. (World Map Courtesy of NASA: https://visibleearth.nasa.gov/view.php?id=55167)

Contracting officials and agency leaders are key to the deployment of ethical AI processes.

More than 50 years ago, in the science fiction film 2001: A Space Odyssey, the spacecraft’s crew included the HAL 9000 computer, an artificial intelligence (AI) system that, motivated by its own survival, turned malevolent and set out to murder the human crew members who distrusted it.

This cautionary tale is an example of the potential for unintended, if extreme, AI behavior when deploying AI solutions to augment human abilities. How can the government prevent AI from acting contrary to human intentions and expectations? How can the procurement process and agreements be used to mitigate potentially undesirable behaviors?

Government procurement processes are beginning to mitigate this risk, using new, specialized principles and methods that encourage responsible use of AI. There is much to learn and do on this mission-critical journey.

Academia and the private sector have led the way in developing AI technologies. Pervasive adoption of AI by the government will enable agencies to pursue improved outcomes at scale.

Using AI to augment human performance can also enable efficiency improvements at levels once unattainable, orders of magnitude better than the status quo.

But putting the power of private sector AI technology to use for good government can create challenges for public procurement, as Cary Coglianese and Erik Lampman have noted in their work on contracting for AI governance. They point out that the risks of AI must be balanced against its potentially game-changing benefits.

Coglianese, in an article with Alicia Lai, goes on to make the point that there is no perfect, unbiased system against which to compare AI. Human designers, decision-makers, and government agents bring lifetimes of acquired, often underappreciated cumulative biases to their work. These biases will not be eliminated by AI if there are still people in the process—as there will and should be. In addition to residual human discretion, algorithmic bias, introduced through choices in training data and embedded in algorithms through repeated training, may allow human prejudices to endure out of sight.

For government to reap AI’s benefits without creating new problems, the application of the technology needs to improve. At stake are thorny problems in cybersecurity, equity, diversity, inclusion, and adapting to a life-changing climate crisis. But innovation in applying the technology to such important use cases will be discouraged if even more rules are imposed on overburdened government contracting officers and entrepreneurs.

The federal acquisition system is the expression of a massive library of rules and regulations interpreted by each agency and its warranted professional contracting officers. The Federal Acquisition Regulation (FAR) and its derivatives are the sheet music for a bureaucratic orchestra and choir in which the contracting officer is the conductor. The high degree of complexity in government procurement regulations is intended to satisfy the need for public confidence in the fairness of the system, and it has largely fulfilled that objective—with hidden and unobservable opportunity costs in terms of forgone performance.

Part 1 of the FAR states that contracting officials should employ good business judgment on behalf of taxpayers. In practice, however, Part 1 discretion is often overwhelmed by the cultural norms of complying with the voluminous regulations.

Many contracting officers recognize that libraries of regulations impose barriers and disincentives that discourage companies at the forefront of technology invention and adoption from engaging with government. This acknowledgment has led to an avalanche of procurement innovation among contracting officers responding to new market opportunities in the shifting landscape.

The Office of Federal Procurement Policy, through the Federal Acquisition Institute, partnered with my ACT-IAC team of volunteers to create an easy-to-use knowledge repository of such innovations—the Periodic Table of Acquisition Innovations—to promote adoption of successful acquisition techniques across government and industry. One such innovation is the Pilot IRS program, which smartly pairs the authorities of FAR Parts 12 and 13 to enable the Internal Revenue Service (IRS) to buy like a venture capitalist. The current limit on such contracts is $7.5 million, which the IRS is seeking to raise.

The U.S. Congress, sensing a similar need, expanded the 60-year-old Other Transaction Authority (OTA), which was designed to eliminate FAR rules and promote experimentation with new technologies such as AI. Use of OTAs has been surging in recent years, concentrated in the U.S. Department of Defense.

These authorities have been essential to advancing the art of procuring AI at the Defense Department’s Joint Artificial Intelligence Center (JAIC). Using these authorities requires greater experience on the part of contracting officers, who, in creating OTAs, should selectively apply the business acumen embedded in the FAR without applying its rules per se.

To its credit, the JAIC has intentionally created a “golf course” of AI contracting called Tradewind, where the tees, pins, traps, and fairways can incorporate the business acumen of the FAR with the relative freedom of OTAs. Tradewind is available for use across the federal government to enable better and faster AI acquisition.

Responsible AI (RAI) comprises a set of new, AI-specific principles guiding the JAIC’s enterprise-wide AI initiative. The Defense Department’s commitment to RAI starts with top departmental leadership.

The new Chief Digital and AI Office (CDAO) is the focal point for executing the Department of Defense AI strategy. RAI principles are guiding the development and speeding the adoption of AI through innovative approaches to acquisition: a new acquisition pathway based on OTA and related authorities, and an infrastructure of contract vehicles such as the test and evaluation support described by the Defense Innovation Unit. Contracts based on challenge statements can be executed in 30 to 60 days to quickly shape and capitalize on emerging techniques.

On the other hand, many essential civilian agency missions are fundamentally about allocating resources.

Missions at the U.S. Department of Health and Human Services, for example, must guard against unintended socioeconomic biases that are illegal. In guiding procurement teams to prevent such bias, the National Institute of Standards and Technology (NIST) is remarkably ambitious in addressing data, testing and evaluation, and human factors. NIST’s analysis of prospective standards includes the legal, business, and technical guardrails needed to prevent and discover socioeconomic bias in deploying AI solutions.

Echoing the JAIC, NIST argues that the traditional testing approaches incorporated in contracts are not sufficient to eliminate socioeconomic bias from AI solutions. It recognizes that the “explainability” challenge of powerful but opaque machine learning techniques should be factored into contracts.

Mitigating bias requires deep and transparent insights into the data used to train the solutions. NIST presents an approach to sociotechnical AI solution testing that procurement teams should consider. NIST’s work is the first systematic map of this uncharted territory. At stake is the public’s trust in AI.

Contracting officers are blazing new trails into the brave new world of acquiring AI for government use. They are simultaneously encouraging industry engagement, ensuring that the AI solutions are accountable and free of undesired bias, and augmenting human efforts in mission performance. Ethical use of AI technology begins with procurement. Contracting officers are conducting the symphony of evolving federal procurement of AI.

Timothy W. Cooke is the CEO of ASI Government, LLC.

This essay is part of a nine-part series entitled Artificial Intelligence and Procurement.