Federal agencies are operating under a clear and increasingly enforced AI governance mandate. OMB Memorandum M-24-10, "Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence," establishes the foundational requirements for how federal agencies must govern their use of AI systems.
If you're a federal agency leader, program manager, or CISO trying to understand what this means for your organization, this is the plain-language explanation you need.
The Core Requirements
1. Designate a Chief AI Officer (CAIO)
Every covered federal agency must designate a Chief AI Officer. The CAIO is responsible for coordinating the agency's use of AI, promoting AI innovation, and managing AI risk. Importantly, the CAIO must have sufficient seniority and authority to influence agency-wide AI decisions.
Many agencies are meeting this requirement through a dual-hat arrangement — assigning CAIO responsibilities to an existing senior official — or through a Fractional AI Officer engagement while they build toward a permanent appointment.
2. Establish an AI Governance Board
Agencies must establish an AI Governance Board or equivalent body to coordinate AI governance activities across the agency. This board is responsible for reviewing and approving high-risk AI use cases, setting agency-wide AI policy, and ensuring accountability for AI outcomes.
3. Maintain an AI Use Case Inventory
Agencies must maintain a public inventory of AI use cases — documenting the AI systems they use, the purposes for which they're used, and the risk levels associated with each use case. This inventory must be updated annually and made publicly available.
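To make "documenting" concrete, here is a minimal sketch of what an inventory entry might look like in code. The field names are illustrative assumptions for explanation, not the official OMB inventory schema:

```python
# Illustrative sketch only: field names are assumptions,
# not the official OMB inventory schema.
from dataclasses import dataclass, asdict
import json

@dataclass
class AIUseCase:
    name: str
    purpose: str
    rights_impacting: bool   # affects individual rights (e.g., benefits decisions)
    safety_impacting: bool   # affects physical safety
    risk_level: str          # e.g., "low", "moderate", "high"

def to_public_inventory(use_cases):
    """Serialize use cases to a JSON document suitable for public posting."""
    return json.dumps([asdict(u) for u in use_cases], indent=2)

inventory = [
    AIUseCase(
        name="Benefits eligibility triage",
        purpose="Prioritize claims for human review",
        rights_impacting=True,
        safety_impacting=False,
        risk_level="high",
    ),
]
print(to_public_inventory(inventory))
```

The point of the structure is the annual update: if each program records its use cases in a consistent format, the agency-wide public inventory becomes an aggregation task rather than an annual scramble.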
4. Conduct AI Impact Assessments for Rights-Impacting and Safety-Impacting AI
For AI systems that impact the rights or safety of individuals — including benefits determinations, law enforcement applications, and healthcare decisions — agencies must conduct AI impact assessments before deployment. These assessments must evaluate the system's accuracy, potential for bias, and the adequacy of human oversight.
5. Ensure Minimum Practices for AI Governance
OMB M-24-10 establishes a set of minimum practices that agencies must implement for all AI use cases. These include testing for bias, ensuring transparency with affected individuals, establishing human oversight mechanisms, and providing recourse for individuals adversely affected by AI decisions.
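One way to operationalize the minimum practices is as a per-use-case checklist that surfaces gaps. The sketch below is illustrative; the practice names paraphrase the memo rather than quote it, and the authoritative list is in M-24-10 itself:

```python
# Illustrative sketch: practice names paraphrase M-24-10's minimum
# practices; the exact set and wording come from the memorandum itself.
MINIMUM_PRACTICES = [
    "bias_testing",
    "transparency_to_affected_individuals",
    "human_oversight",
    "recourse_for_adverse_decisions",
]

def compliance_gaps(implemented: set) -> list:
    """Return the minimum practices a use case has not yet implemented."""
    return [p for p in MINIMUM_PRACTICES if p not in implemented]

# A use case that has tested for bias and set up human oversight,
# but has not yet addressed transparency or recourse:
gaps = compliance_gaps({"bias_testing", "human_oversight"})
print(gaps)
```

An empty gap list for every use case is the target state; anything else is a documented to-do for the governance board.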
What's Changed Since the Original EO
Executive Order 14110, signed in October 2023, established the foundational federal AI governance framework. OMB M-24-10 operationalized that framework for federal agencies — translating the EO's broad directives into specific, enforceable requirements with defined timelines.
The key shift is accountability. Under M-24-10, agencies can no longer treat AI governance as an aspirational goal. The requirements are specific, the timelines are defined, and OMB has the authority to assess agency compliance.
The Compliance Timeline
- CAIO Designation — Required within 60 days of M-24-10 publication (already past for most agencies)
- AI Governance Board — Required within 60 days of M-24-10's issuance
- AI Use Case Inventory — Initial inventory required within 60 days; annual updates required thereafter
- Minimum Practices — Required for all new AI use cases immediately; phased implementation for existing use cases
- AI Impact Assessments — Required before deployment of rights-impacting or safety-impacting AI systems
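Because the deadlines above are fixed offsets from the memo's issuance date, the arithmetic is simple to check. This sketch assumes M-24-10's March 28, 2024 issuance date:

```python
# Sketch of the 60-day deadline arithmetic, assuming M-24-10's
# March 28, 2024 issuance date.
from datetime import date, timedelta

ISSUED = date(2024, 3, 28)

def deadline(days: int, start: date = ISSUED) -> date:
    """Compute a compliance deadline a fixed number of days after a start date."""
    return start + timedelta(days=days)

print(deadline(60))  # 60-day window from issuance: 2024-05-27
```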
What Agencies Are Getting Wrong
Based on our work with federal agencies, the most common compliance gaps are:
- Incomplete AI use case inventories — agencies are documenting the AI systems they know about, but missing shadow AI deployed at the program level
- CAIO appointments without authority — designating a CAIO in name only, without the seniority or resources to actually govern AI use
- Impact assessments without methodology — completing the assessment form without a rigorous, documented methodology for evaluating bias and risk
- Governance boards without teeth — establishing a board that meets quarterly but has no authority to pause or modify AI deployments
How DLSS Supports Federal AI Compliance
DLSS has supported federal agencies through FISMA, FedRAMP, and NIST RMF compliance for over two decades. Our AI governance practice applies that same rigor to OMB M-24-10 compliance — helping agencies build the governance structures, documentation, and oversight mechanisms that meet the letter and spirit of the requirements.
Not Sure Where Your Organization Stands on AI Governance?
Take the free AI Governance Readiness Assessment to understand your agency's current compliance posture against OMB M-24-10 requirements.
Take the Free Assessment