
Eightfold AI Hiring Platform Sued Over Unauthorized Reports

Eightfold sued for creating consumer reports on job applicants without consent


Why does a hiring platform’s data‑driven screening raise a legal alarm? Eightfold, a company that markets AI‑powered tools for matching talent to jobs, now faces a lawsuit alleging it compiles consumer reports on job seekers without their permission. The complaint says the firm’s software not only predicts individual skills, experience and personality traits, but also pits candidates against one another, producing rankings that influence hiring decisions.

If true, the practice could run afoul of federal consumer‑reporting statutes that require explicit consent before such assessments are shared with employers. Critics argue that turning a person’s career history into a scorecard, especially when the methodology is opaque, threatens privacy and fairness in the recruitment process. The lawsuit also points to Eightfold’s “Evaluation Tools” as the mechanism for generating these reports, raising questions about how AI is being used to quantify human potential.

The following excerpt from the filing lays out the core of the allegation.

According to the lawsuit, Eightfold generates consumer reports for potential employers using its Evaluation Tools. These tools evaluate job candidates not only as individuals, claiming to pinpoint their likely skills, experiences, and traits, but also in relation to each other, ranking applicants on a scale from 0 to 5 based on the findings, conclusions, and assumptions Eightfold's proprietary AI draws about their "likelihood of success." The complaint adds that Eightfold creates talent profiles of job seekers that include personality descriptions such as "team player" and "introvert," ranks their "quality of education," and predicts their future titles and companies.

Related Topics: #AI recruitment #consumer reports #talent matching #job screening #Eightfold AI #candidate ranking #algorithmic hiring #privacy concerns

Eightfold's practices have now entered the courtroom. A California lawsuit filed on Jan. 20 alleges the firm compiled applicant screening reports without consent, marking the first time the Fair Credit Reporting Act has been invoked against an AI recruiter.

According to the complaint, Eightfold’s Evaluation Tools produce consumer reports that assess candidates’ skills, experience, traits, and even rank them against one another. The plaintiffs argue that such reports qualify as consumer reports under the law, yet the company did not obtain the required permission from the individuals evaluated. Consumer advocates are pressing the case as a test of how existing privacy statutes apply to algorithmic hiring tools.

The filing lists major users of the platform, including Microsoft, PayPal, and other Fortune 500 firms, suggesting the reach of the alleged practice. Whether the court will find a violation of the Fair Credit Reporting Act remains uncertain, and the outcome could clarify obligations for AI‑driven recruitment services. Until a judgment is rendered, the legal status of Eightfold’s evaluation methodology stays in question.


Common Questions Answered

What are the key legal concerns raised in the lawsuit against Eightfold regarding their AI-powered hiring tools?

The lawsuit alleges that Eightfold generates consumer reports on job candidates without obtaining proper consent, which potentially violates the Fair Credit Reporting Act (FCRA). [ftc.gov](https://www.ftc.gov/business-guidance/resources/using-consumer-reports-what-employers-need-know) guidance requires employers to get written permission before obtaining and using consumer reports for employment decisions.

How does Eightfold's AI screening process potentially differ from traditional background checks?

Eightfold's Evaluation Tools not only assess individual candidates' skills and traits but also rank applicants against each other on a 0-to-5 scale using proprietary AI algorithms. [eeoc.gov](https://www.eeoc.gov/laws/guidance/select-issues-assessing-adverse-impact-software-algorithms-and-artificial) notes that such algorithmic decision-making tools raise significant concerns about potential bias and adverse impact in employment selection procedures.

What specific legal protections exist for job applicants when companies use AI-driven background screening?

The Fair Credit Reporting Act requires employers to obtain written consent before conducting background checks and provide applicants with a copy of the report if an adverse employment decision is made. [consumer.ftc.gov](https://consumer.ftc.gov/employer-background-checks-your-rights) emphasizes that job applicants have the right to be informed about and potentially dispute information used in employment screening reports.