The Case That Could Reshape AI Hiring#
Mobley v. Workday, Inc. is the most significant legal challenge to AI-powered hiring tools in American history. After a federal court granted class certification in May 2025, the case now represents potentially millions of job applicants over age 40 who were rejected by Workday’s algorithmic screening system.
The ruling establishes a precedent with far-reaching implications: AI vendors can be directly liable for employment discrimination caused by their tools, even when the vendors don’t make the final hiring decisions themselves.
- 1.1 billion applications rejected through Workday’s system (company filing)
- May 16, 2025: Class certification granted (ADEA claims)
- July 29, 2025: Scope expanded to include HiredScore AI features
- Class Period: September 24, 2020 to present
- Eligible Class: Job applicants age 40+ rejected by Workday AI nationwide
Case Background#
The Plaintiff#
Derek Mobley is a Black man over 40 years old who suffers from anxiety and depression. He applied to over 100 job positions through various employers using Workday’s AI-powered applicant recommendation system. Every application was rejected.
Mobley filed suit in February 2023 in the U.S. District Court for the Northern District of California, alleging that Workday’s screening algorithm discriminated against him based on:
- Race (Title VII of the Civil Rights Act)
- Age (Age Discrimination in Employment Act - ADEA)
- Disability (Americans with Disabilities Act - ADA)
Workday’s AI System#
Workday, Inc. provides cloud-based human resources management software used by thousands of employers worldwide. Its applicant recommendation system uses artificial intelligence to:
- Screen job applications
- Rank candidates based on algorithmic scoring
- Recommend which applicants employers should consider
- Filter out “unqualified” candidates automatically
The system functions as a gatekeeper: applicants rejected by the algorithm never reach human reviewers.
The Core Allegation#
Mobley alleges that Workday’s AI has a disparate impact on protected classes: it systematically rejects qualified applicants based on characteristics correlated with race, age, and disability, even if those characteristics aren’t directly used in the algorithm.
Case Timeline#
| Date | Development |
|---|---|
| February 2023 | Mobley files initial complaint in N.D. California |
| January 2024 | Court denies Workday’s motion to dismiss; holds AI vendors can be directly liable |
| Throughout 2024 | Discovery proceeds; additional plaintiffs join |
| May 16, 2025 | Class certification granted for ADEA claims (age discrimination) |
| July 29, 2025 | Scope expanded to include HiredScore AI features |
| August 20, 2025 | Workday deadline to identify HiredScore customers |
| TBD | Class action notice, opt-in period, trial |
The May 2025 Certification Ruling#
What the Court Decided#
Judge Rita Lin granted preliminary certification under the ADEA, allowing the lawsuit to proceed as a nationwide collective action. Key holdings:
Unified Policy: The court found that Workday’s AI screening tool operates as a “unified policy” across all employers using it. This means affected applicants don’t need to prove discrimination by individual employers; the algorithm itself is the alleged discriminatory policy.
Claims Rise and Fall Together: Judge Lin determined that the central legal question, whether Workday’s AI recommendation system has a disparate impact on applicants over forty, is common to all class members. Individual differences in applications or employers don’t defeat class treatment.
Collective Action Permitted: Unlike a traditional class action (where members are automatically included), ADEA claims proceed as “collective actions” requiring individuals to opt in. Class members must affirmatively join the lawsuit to participate.
Class Definition#
The certified class includes:
All job applicants age 40 and older who were denied employment recommendations through Workday’s platform since September 24, 2020.
The July 2025 expansion added applicants processed using HiredScore AI features, broadening potential class membership further.
Scale of Potential Class#
Workday disclosed in court filings that approximately 1.1 billion applications were rejected through its system during the relevant period. While not all rejected applicants will qualify or opt in, the potential class is enormous.
The Agent Liability Theory#
Why This Case Matters for AI Vendors#
Traditionally, employment discrimination claims target employers, not the software vendors they use. Workday argued it couldn’t be liable because it doesn’t actually employ anyone; it just provides tools.
The court rejected this argument, holding that AI vendors can be directly liable as agents of the employers using their tools.
The Legal Theory:
Under agency principles, when a vendor’s AI system makes screening decisions on behalf of employers, the vendor acts as the employer’s agent for those decisions. If the agent discriminates, the agent is liable, not just the principal (employer).
Key Factors:
- Workday’s system makes substantive screening decisions
- Those decisions determine which applicants employers see
- Employers delegate gatekeeping authority to the algorithm
- Workday controls how the algorithm functions
Implications for Other AI Vendors#
This theory applies to any AI tool that makes employment-related decisions:
| Vendor Type | Potential Exposure |
|---|---|
| Resume screeners (Workday, ADP, SAP) | Direct liability for discriminatory screening |
| Video interview AI (HireVue, Pymetrics) | Liability for biased assessment algorithms |
| Skills testing AI (Eightfold, Phenom) | Liability for discriminatory testing |
| Background check AI (Checkr, Sterling) | Liability for biased risk scoring |
Legal Claims and Theories#
Age Discrimination (ADEA) - Certified#
The ADEA prohibits employment discrimination against individuals 40 years or older. Mobley alleges Workday’s AI has a disparate impact on older applicants, disproportionately rejecting them even for positions they’re qualified for.
Disparate Impact Standard:
- Plaintiff must show the AI causes statistically significant harm to a protected group
- Defendant can justify the practice as a “business necessity”
- Plaintiff can rebut by showing less discriminatory alternatives exist
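The “statistically significant harm” prong is typically shown by comparing selection rates between groups. A common approach is a two-proportion z-test, where courts often treat a difference of roughly two standard deviations or more as significant. The sketch below is illustrative only; the function name and the applicant counts are hypothetical, not figures from the case.

```python
import math

def selection_rate_z(hired_a, total_a, hired_b, total_b):
    """Two-proportion z-test comparing selection rates of two groups.

    Group A: protected class (e.g., applicants 40+); Group B: comparator.
    A |z| of roughly 2 or more (about two standard deviations) is commonly
    treated as statistically significant evidence of disparate impact.
    """
    p_a = hired_a / total_a
    p_b = hired_b / total_b
    # Pooled selection rate under the null hypothesis of no group difference
    p = (hired_a + hired_b) / (total_a + total_b)
    se = math.sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical screening outcomes: 120 of 2,000 older applicants advanced
# past the AI screen, versus 400 of 4,000 younger applicants.
z = selection_rate_z(120, 2000, 400, 4000)
print(round(z, 2))  # about -5.19, well past the ~2-sigma threshold
```

In practice, litigants rely on expert statistical analysis over real applicant-flow data; this sketch only illustrates the basic arithmetic behind the standard.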
Race Discrimination (Title VII) - Pending#
Title VII prohibits employment discrimination based on race. Mobley alleges the algorithm discriminates against Black applicants. This claim hasn’t yet received class certification but remains in the case.
Disability Discrimination (ADA) - Pending#
The ADA prohibits discrimination against individuals with disabilities. Mobley alleges his anxiety and depression, disclosed in application materials, triggered algorithmic rejection. This claim is also pending.
What This Means for Job Applicants#
Are You a Class Member?#
You may be eligible to join this lawsuit if:
✓ You are 40 years old or older
✓ You applied for jobs through employers using Workday’s system
✓ Your application was rejected (no interview, no offer)
✓ The rejection occurred after September 24, 2020
How to Join#
Because this is a collective action, you must opt in to participate:
- Watch for official notice: the court will order notice to potential class members
- Complete the opt-in form: you’ll need to submit a consent form to join
- Meet the deadline: opt-in periods are time-limited
Potential Recovery#
Class members may be entitled to:
- Back pay: wages lost due to discriminatory rejection
- Front pay: compensation for future earnings impact
- Compensatory damages: emotional distress and other harms
- Liquidated damages: double back pay for willful violations
- Attorney’s fees: paid by the defendant if plaintiffs prevail
Note: Individual recovery depends on case outcome and class size. With potentially millions of class members, individual awards may be modest.
What This Means for Employers#
Immediate Risks#
Employers using Workday or similar AI screening tools face potential liability even if they didn’t build the discriminatory system:
Vicarious Liability: Employers remain liable for discrimination caused by vendors they hire.
Negligent Selection: Using an AI tool without verifying it doesn’t discriminate could constitute negligence.
Failure to Monitor: Not tracking outcomes for disparate impact may support liability.
Six Actions to Take Now#
Based on guidance from employment law experts:
- Audit AI vendors: request bias testing documentation and nondiscrimination warranties
- Maintain human oversight: don’t rely solely on algorithmic decisions
- Document selection criteria: ensure job requirements are clearly job-related
- Monitor disparate impact: track outcomes by protected class
- Establish AI governance: create oversight programs for hiring technology
- Track legal developments: this case law is rapidly evolving
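Monitoring for disparate impact usually starts with the EEOC’s four-fifths rule: compare each group’s selection rate to the most-selected group’s rate, and flag ratios below 0.80 for review. The same impact ratio appears in NYC Local Law 144 bias audits. The sketch below is a minimal illustration with hypothetical counts, not a substitute for a formal audit.

```python
def impact_ratios(selection_counts):
    """Compute each group's selection rate relative to the most-selected
    group (the "impact ratio" behind the EEOC four-fifths rule and
    NYC Local Law 144 bias audits). Ratios below 0.80 flag potential
    disparate impact warranting closer review.

    selection_counts: {group_name: (advanced, total_applicants)}
    """
    rates = {g: advanced / total for g, (advanced, total) in selection_counts.items()}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical quarterly screening outcomes by age band
ratios = impact_ratios({
    "under_40": (400, 4000),  # 10% advance rate
    "40_plus": (120, 2000),   # 6% advance rate
})
flagged = sorted(g for g, r in ratios.items() if r < 0.8)
print(ratios["40_plus"], flagged)  # 0.6 ['40_plus']
```

An impact ratio below four-fifths is a screening signal, not proof of discrimination; it tells an employer where to dig into job-relatedness and alternatives.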
What This Means for AI Vendors#
The New Liability Landscape#
The Mobley ruling signals that AI vendors can no longer avoid employment discrimination liability by claiming they’re “just providing tools.” If your AI makes screening decisions, you may be directly liable for discriminatory outcomes.
Risk Mitigation#
AI hiring tool vendors should:
Technical:
- Conduct regular bias audits
- Document validation studies
- Implement disparate impact testing
- Create explainability features
Contractual:
- Review indemnification provisions
- Consider liability caps
- Require customer cooperation in compliance
Insurance:
- Assess E&O policy coverage for discrimination claims
- Consider specialized AI liability coverage
- Review exclusions for algorithmic discrimination
Related Cases and Developments#
HireVue / Intuit Complaint (March 2025)#
The ACLU and Public Justice filed a complaint against Intuit and HireVue on behalf of an Indigenous and Deaf woman. The case alleges HireVue’s video interview AI discriminated based on:
- Speech patterns
- Lack of typical vocal cues
- Disability-related characteristics
This case reinforces that AI hiring tools face scrutiny across multiple discrimination theories.
EEOC Guidance#
The EEOC has issued guidance on AI in employment decisions:
- Employers responsible for vendor tool outcomes
- Disparate impact analysis applies to AI
- Reasonable accommodations required for AI assessments
- Record-keeping requirements apply
State Laws#
Several states have enacted AI hiring regulations:
| State | Law | Key Requirements |
|---|---|---|
| Illinois | AI Video Interview Act | Notice and consent for AI video analysis |
| Maryland | Facial Recognition Ban | Prohibits AI facial analysis without consent |
| New York City | Local Law 144 | Bias audits required for automated hiring tools |
| Colorado | AI Act | Risk assessments for high-risk AI decisions |
Frequently Asked Questions#
How do I know if I was screened by Workday's AI?
I'm under 40. Can I still join the lawsuit?
Will I have to pay to join the class action?
I'm an employer using Workday. Am I liable?
Does this case affect other AI hiring tools like HireVue?
What happens next in the case?
Related Resources#
AI Employment Liability#
- AI Workers’ Comp Claim Denials: algorithm bias in benefits processing
- AI Product Liability: strict liability for AI systems
- Agentic AI Liability: autonomous system accountability
Industry Guidance#
- AI Insurance Coverage: E&O and AI-specific coverage
- International AI Frameworks: EU AI Act and global standards
Related Litigation#
- AI Misdiagnosis Case Tracker: healthcare AI liability
- Section 230 and AI: platform immunity questions
Developing or Deploying AI Hiring Tools?
Mobley v. Workday establishes that AI vendors face direct employment discrimination liability. Ensure your hiring AI meets emerging legal standards for bias testing and human oversight.
Contact Us