HHS Uses Palantir AI to Audit DEI Grant Compliance
The health department’s new tech rollout has drawn attention far beyond typical budget paperwork. Last March, HHS contracted with Palantir, the data‑analytics firm known for its government‑focused platforms, to embed machine‑learning models into its grant‑review workflow. Officials say the move aims to standardize compliance checks across thousands of research awards and staffing notices.
The timing aligns with a series of executive orders issued by President Donald Trump that forbid federal funding for projects deemed to promote “gender ideology” or broader diversity initiatives. Critics argue that automating such vetting could turn nuanced policy judgments into binary decisions, while supporters claim it helps enforce clear legal boundaries. As the agency scales the system, the question becomes whether the software will simply flag language or reshape how public‑health funding is allocated.
Since last March, the Department of Health and Human Services has been using AI tools from Palantir to screen and audit grants, grant applications, and job descriptions for noncompliance with President Donald Trump's executive orders targeting "gender ideology" and anything related to diversity, equity, and inclusion (DEI), according to a recently published inventory of all use cases HHS had for AI in 2025. Neither Palantir nor HHS has publicly announced that the company's software was being used for these purposes. During the first year of Trump's second term, Palantir earned more than $35 million in payments and obligations from HHS alone.
None of the descriptions for these transactions mention this work targeting DEI or "gender ideology." The audits have been taking place within HHS's Administration for Children and Families (ACF), which funds family and child welfare programs and oversees the foster and adoption systems. Palantir is the sole contractor charged with making a list of "position descriptions that may need to be adjusted for alignment with recent executive orders." In addition to Palantir, the startup Credal AI, which was founded by two Palantir alumni, helped ACF audit "existing grants and new grant applications." The "AI-based" grant review process, the inventory says, "reviews application submission files and generates initial flags and priorities for discussion." All relevant information is then routed to the ACF Program Office for final review.
The lack of public announcement raises questions about oversight and about the criteria used to define "noncompliance." Automated checks may flag language associated with diversity initiatives, but the inventory does not explain how false positives are handled beyond routing flagged items to the ACF Program Office for final review. The scope of the audits, which covers both funding decisions and personnel descriptions, suggests a broad reading of the executive orders, though the precise parameters remain undisclosed, and it is unclear whether the technology influences grant awards or merely records compliance. Without independent verification, the actual impact on research funding and staffing practices cannot be measured, leaving observers to wonder how effectively the AI serves its stated purpose.
Further Reading
- Palantir scores $20M HHS Integrated Care and Case Management (ICCM) and Database Support task on SHARE BPA - OrangeSlices AI
- HHS' new AI strategic plan promises increased guidance, funding - Fierce Healthcare
- Palantir wins $12M HHS CARES – ACF UCBC Expansion and Support task on SHARE BPA - OrangeSlices AI
Common Questions Answered
What specific AI tool is HHS using to audit grants and job descriptions?
HHS is using Palantir's AI tools to screen and audit grants, grant applications, and job descriptions. The implementation began last March and is focused on checking compliance with executive orders related to "gender ideology" and diversity, equity, and inclusion (DEI) initiatives.
How does the HHS AI use case inventory relate to the Palantir AI tool deployment?
The Palantir AI tool deployment appears in the 2025 HHS AI use case inventory, which lists every AI use case maintained by the agency. This inventory was published as part of the Office of the Assistant Secretary for Technology Policy's (ASTP) annual reporting requirement consistent with Executive Order 13960.
Why has the HHS AI tool implementation raised concerns about transparency?
Neither Palantir nor HHS has publicly announced the deployment of the AI screening tool, which has raised questions about oversight and the specific criteria used to define noncompliance. The lack of public communication about the tool's implementation and its precise operational parameters has drawn attention to potential transparency issues.