OFCCP Releases Guidance On Federal Contractors' Use Of AI And Automated Systems

Ogletree, Deakins, Nash, Smoak & Stewart


Ogletree Deakins is a labor and employment law firm representing management in all types of employment-related legal matters. Ogletree Deakins has more than 850 attorneys located in 53 offices across the United States and in Europe, Canada, and Mexico. The firm represents a range of clients, from small businesses to Fortune 50 companies.

The U.S. Department of Labor's (DOL) Office of Federal Contract Compliance Programs (OFCCP) recently released new guidance warning federal contractors they must routinely monitor their use of artificial intelligence (AI) and automated systems to ensure they do not adversely impact applicants and employees from protected groups.

Quick Hits

  • OFCCP released its first detailed guidance on federal contractors' use of AI and automated systems.
  • The guidance instructs federal contractors to routinely monitor whether AI and automated systems have a disparate or adverse impact on protected groups and take actions to reduce those impacts or use different tools.
  • The guidance further clarifies that federal contractors are ultimately responsible for meeting nondiscrimination and affirmative action obligations regardless of whether they use third-party vendors to implement such tools.

On April 29, 2024, OFCCP published new guidance titled "Artificial Intelligence and Equal Employment Opportunity for Federal Contractors," clarifying federal contractors' compliance obligations when using AI and automated decision-making technologies. Key to the guidance are clarifications that federal contractors must monitor their use of such technology and are responsible for its impact whether or not third-party vendors provide the systems.

While AI and automated systems technologies have the potential to increase efficiency and improve decision-making by federal contractors and other employers, the OFCCP guidance warns that the technologies carry risks of discrimination or bias that could potentially violate laws enforced by the OFCCP.

"AI has the potential to embed bias and discrimination into a range of employment decision-making processes," the OFCCP guidance states. "As a result, if not designed and implemented properly, automated systems and AI can replicate or deepen inequalities already present in the workplace and may violate workers' civil rights."

The publication of the OFCCP's guidance coincided with the release of similar guidance by the DOL's Wage and Hour Division (WHD) regarding the application of the Fair Labor Standards Act (FLSA) and other federal labor standards to the use of AI and automated systems in the workplace. Both actions come six months after President Biden issued Executive Order 14110 on October 30, 2023, calling for a "coordinated, Federal Government-wide approach" to the responsible development and implementation of AI.

Level Setting

OFCCP's AI guidance largely tracks other federal agencies' definitions of the technologies, including guidance issued by the U.S. Equal Employment Opportunity Commission (EEOC). The guidance defines AI similarly to the National Artificial Intelligence Initiative Act of 2020, defining it as "a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments." The guidance also uses the EEOC's definition of "algorithm": "a set of instructions that can be followed by a computer to accomplish some end."

Going beyond those definitions, the OFCCP guidance defines "automated systems" as "broadly describ[ing] software and algorithmic processes, including AI, that are used to automate workflows and help people complete tasks or make decisions." The guidance points to examples of automated systems that sift through resumes and identify qualified applicants, or AI that determines "which criteria to use when making employment decisions," such as "to define the parameters by which the resumes are filtered and reviewed."

Compliance Obligations

The guidance reminds federal contractors that existing compliance obligations apply to the use of AI and automated systems.

  • Production and Recordkeeping—The OFCCP guidance clarifies that federal contractors' production and recordkeeping compliance obligations extend to their use of AI and automated systems. The guidance states that federal contractors must "[m]aintain records and ensure confidentiality of records," such as keeping records of "resume searches, both from searches of external websites and internal resume databases, that include the substantive search criteria used." Federal contractors must further "[c]ooperate with OFCCP by providing the necessary, requested information on their AI systems."
  • Reasonable Accommodations—The guidance clarifies that federal contractors' obligations to provide reasonable accommodations extend to their use of AI and automated systems, including electronic job application systems.
  • Selection Procedures—The guidance clarifies that when federal contractors use AI or automated technology in selection procedures, they must "validate the system using a strategy that meets applicable OFCCP-enforced nondiscrimination laws and the Uniform Guidelines on Employee Selection Procedures (UGESP)." This includes obtaining "results of any assessment of system bias, debiasing efforts, and/or any study of system fairness." Federal contractors must further: (1) "[c]onduct routine independent assessments for bias and/or inequitable results," and (2) "[e]xplore potentially less discriminatory alternative selection procedures."

Third-Party AI Vendors

The guidance clarifies that the OFCCP will hold federal contractors accountable for complying with their nondiscrimination and affirmative action obligations whether or not they use AI or automated systems, and whether or not third-party vendors or contractors provide or operate those technologies. The guidance states that the laws OFCCP enforces "do not impose separate obligations on vendors" and that federal contractors cannot "delegate" their nondiscrimination and affirmative action obligations to third-party vendors. Instead, the compliance risks and obligations remain with the contractor utilizing the technology.

These obligations include the requirement to adequately provide "relevant, requested information and answer questions" about the use of AI during compliance reviews or investigations, including information about the design of the screening or selection system and whether alternative approaches were considered or tested. Such requirements may be burdensome for federal contractors as accessing or retrieving such information from their vendors may prove difficult. However, the guidance specifies that "a federal contractor cannot escape liability for the adverse impact of discriminatory screenings conducted by a third party, such as a staffing agency, HR software provider, or vendor."

The guidance states that when using AI vendors, federal contractors should, among other requirements, be able to verify:

  • provisions of vendor contracts requiring maintenance of necessary records consistent with OFCCP regulation requirements and access to applicable records;
  • the source and quality of information collected, used as background, or analyzed by AI systems;
  • "Whether the vendor documents and maintains the data used in collecting, cleaning, training, and building algorithms and the rationale for why the vendor used the data points";
  • "The vendor's protections and privacy policy on data provided by the contractor"; and
  • "Critical information about the vendor's algorithmic decision-making employment tool, e.g., captured data, scoring system, and the basis for selection or elimination of applicants/candidates."

Key Practices for Federal Contractors

The guidance sets forth some "promising practices" for avoiding compliance violations when using AI or automated systems, including maintaining human oversight, providing notice, routinely monitoring systems, safely storing data, and ensuring vendors' AI systems are accurate and effective.

  • Human Oversight—The guidance states that federal contractors should "[n]ot rely solely on AI and automated systems to make employment decisions," and should train staff on the appropriate use of these technologies. This aligns with the WHD's recommendations, which centered on "responsible human oversight" of AI.
  • Notification—The OFCCP guidance states that federal contractors should notify applicants, employees, and their representatives in advance that an AI or automated system is being used in the hiring process or to make employment decisions, and provide detailed information about what data will be collected and entered into an AI system. Federal contractors should further notify applicants, employees, or their representatives that they can correct such data or have it deleted. While OFCCP referenced the ability to request deletion of data, it did not indicate how contractors should reconcile this with competing recordkeeping obligations.
  • Routine Monitoring—Central to the guidance is the clarification that federal contractors should "[r]outinely monitor and analyze" where AI or automated systems may be causing a disparate or adverse impact on protected groups, including reproducing patterns of systemic discrimination, both before the tools are implemented and at regular intervals. If use of AI or automated systems causes disparate or adverse impacts, federal contractors should take action to reduce those impacts or use a different tool.
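The routine monitoring the guidance calls for can be quantified with standard selection-rate statistics. As a minimal, hypothetical sketch (not drawn from the OFCCP guidance, and not legal advice), the UGESP "four-fifths rule" referenced in employee selection analysis compares each group's selection rate to the highest group's rate and flags any group whose ratio falls below 0.8; the group names and applicant counts below are illustrative only:

```python
# Hypothetical adverse-impact check based on the UGESP "four-fifths rule"
# (29 CFR Part 1607). All group names and numbers are made up for illustration.

def adverse_impact_ratios(counts):
    """counts: {group: (selected, applicants)} -> {group: impact ratio}.

    The impact ratio is each group's selection rate divided by the
    highest group's selection rate (so the top group is always 1.0).
    """
    rates = {g: selected / applicants for g, (selected, applicants) in counts.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Example: an automated screen advances 48 of 80 applicants in group A
# and 24 of 60 in group B (illustrative numbers only).
ratios = adverse_impact_ratios({"A": (48, 80), "B": (24, 60)})

# Groups below four-fifths (0.8) of the top selection rate warrant review.
flagged = {g for g, r in ratios.items() if r < 0.8}
print(ratios, flagged)  # group B's ratio is about 0.67, so B is flagged
```

A real monitoring program would apply statistical significance tests alongside this ratio and repeat the analysis at the regular intervals the guidance describes, not just at deployment.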

AI Legal Landscape

The OFCCP's AI guidance is part of a broader effort by the federal government to rein in the use of AI and automated decision-making technologies due to concerns about the risks of discrimination, bias, fraud, and abuse. In April 2024, the DOL signed an updated joint statement with several other federal agencies that was originally released in April 2023.

Further, the Biden administration's October 2023 executive order (EO) stated that federal agencies should balance the benefits of these emerging new technologies with risks and required the DOL, within 180 days, to "develop and publish principles and best practices for employers" on the responsible use of such technologies.

The EO built on the Biden administration's "Blueprint for an AI Bill of Rights," which outlined nonbinding recommendations for the design, use, and deployment of AI and automated decision-making systems. In addition to the DOL's guidance, the EEOC has issued guidance clarifying the potential for AI and automated employment decision-making tools to result in a disparate impact and to create issues for individuals covered by the Americans with Disabilities Act (ADA).

Next Steps

In light of the OFCCP's guidance and regulators' focus on AI, federal contractors may want to review their current use of AI or automated systems in making hiring or other employment decisions, as well as whether their third-party vendors use such technology. Federal contractors may further want to review their contracts with vendors to ensure they have access to necessary information about the data the technology uses or collects and how the systems are designed, so that they can provide such information to federal regulators.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
