Mobley v. Workday: AI Hiring Discrimination Class Action Tracker


The Case That Could Reshape AI Hiring

Mobley v. Workday, Inc. is the most significant legal challenge to AI-powered hiring tools in American history. After a federal court granted class certification in May 2025, the case now represents potentially millions of job applicants over age 40 who were rejected by Workday’s algorithmic screening system.

The ruling establishes a precedent with far-reaching implications: AI vendors can be directly liable for employment discrimination caused by their tools, even when the vendors don’t make the final hiring decisions themselves.

Key Case Statistics
  • 1.1 billion applications rejected through Workday’s system (company filing)
  • May 16, 2025: Class certification granted (ADEA claims)
  • July 29, 2025: Scope expanded to include HiredScore AI features
  • Class Period: September 24, 2020 to present
  • Eligible Class: Job applicants age 40+ rejected by Workday AI nationwide

Case Background

The Plaintiff

Derek Mobley is a Black man over 40 years old who suffers from anxiety and depression. He applied to more than 100 positions at employers that use Workday’s AI-powered applicant recommendation system. Every application was rejected.

Mobley filed suit in February 2023 in the U.S. District Court for the Northern District of California, alleging that Workday’s screening algorithm discriminated against him based on:

  • Race (Title VII of the Civil Rights Act)
  • Age (Age Discrimination in Employment Act - ADEA)
  • Disability (Americans with Disabilities Act - ADA)

Workday’s AI System

Workday, Inc. provides cloud-based human resources management software used by thousands of employers worldwide. Its applicant recommendation system uses artificial intelligence to:

  • Screen job applications
  • Rank candidates based on algorithmic scoring
  • Recommend which applicants employers should consider
  • Filter out “unqualified” candidates automatically

The system functions as a gatekeeper: applicants rejected by the algorithm never reach human reviewers.

The Core Allegation

Mobley alleges that Workday’s AI has a disparate impact on protected classes: it systematically rejects qualified applicants based on characteristics correlated with race, age, and disability, even if those characteristics aren’t directly used in the algorithm.


Case Timeline

| Date | Development |
| --- | --- |
| February 2023 | Mobley files initial complaint in N.D. California |
| January 2024 | Court denies Workday’s motion to dismiss; holds AI vendors can be directly liable |
| Throughout 2024 | Discovery proceeds; additional plaintiffs join |
| May 16, 2025 | Class certification granted for ADEA claims (age discrimination) |
| July 29, 2025 | Scope expanded to include HiredScore AI features |
| August 20, 2025 | Workday deadline to identify HiredScore customers |
| TBD | Class action notice, opt-in period, trial |

The May 2025 Certification Ruling

What the Court Decided

Judge Rita Lin granted preliminary certification under the ADEA, allowing the lawsuit to proceed as a nationwide collective action. Key holdings:

Unified Policy: The court found that Workday’s AI screening tool operates as a “unified policy” across all employers using it. This means affected applicants don’t need to prove discrimination by individual employers; the algorithm itself is the alleged discriminatory policy.

Claims Rise and Fall Together: Judge Lin determined that the central legal question (whether Workday’s AI recommendation system has a disparate impact on applicants over forty) is common to all class members. Individual differences in applications or employers don’t defeat class treatment.

Collective Action Permitted: Unlike a traditional class action (where members are automatically included), ADEA claims proceed as “collective actions” requiring individuals to opt in. Class members must affirmatively join the lawsuit to participate.

Class Definition

The certified class includes:

All job applicants age 40 and older who were denied employment recommendations through Workday’s platform since September 24, 2020.

The July 2025 expansion added applicants processed using HiredScore AI features, broadening potential class membership further.

Scale of Potential Class

Workday disclosed in court filings that approximately 1.1 billion applications were rejected through its system during the relevant period. While not all rejected applicants will qualify or opt in, the potential class is enormous.


The Agent Liability Theory

Why This Case Matters for AI Vendors

Traditionally, employment discrimination claims target employers, not the software vendors they use. Workday argued it couldn’t be liable because it doesn’t actually employ anyone; it just provides tools.

The court rejected this argument, holding that AI vendors can be directly liable as agents of the employers using their tools.

The Legal Theory:

Under agency principles, when a vendor’s AI system makes screening decisions on behalf of employers, the vendor acts as the employer’s agent for those decisions. If the agent discriminates, the agent is liable, not just the principal (employer).

Key Factors:

  • Workday’s system makes substantive screening decisions
  • Those decisions determine which applicants employers see
  • Employers delegate gatekeeping authority to the algorithm
  • Workday controls how the algorithm functions

Implications for Other AI Vendors

This theory applies to any AI tool that makes employment-related decisions:

| Vendor Type | Potential Exposure |
| --- | --- |
| Resume screeners (Workday, ADP, SAP) | Direct liability for discriminatory screening |
| Video interview AI (HireVue, Pymetrics) | Liability for biased assessment algorithms |
| Skills testing AI (Eightfold, Phenom) | Liability for discriminatory testing |
| Background check AI (Checkr, Sterling) | Liability for biased risk scoring |

Legal Claims and Theories

Age Discrimination (ADEA) - Certified

The ADEA prohibits employment discrimination against individuals 40 years or older. Mobley alleges Workday’s AI has a disparate impact on older applicants, disproportionately rejecting them even for positions they’re qualified for.

Disparate Impact Standard:

  • Plaintiff must show the AI causes a statistically significant adverse impact on a protected group
  • Defendant can justify the practice as a “business necessity”
  • Plaintiff can rebut by showing that less discriminatory alternatives exist
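The disparate-impact showing is at bottom statistical. As a rough illustration (the numbers and code below are hypothetical, not from the case record), an audit typically compares selection rates between groups using the EEOC’s four-fifths rule of thumb and a significance test such as a two-proportion z-test:

```python
import math

def selection_rates(selected_a, total_a, selected_b, total_b):
    """Selection rates for a protected group (a) and a comparison group (b)."""
    return selected_a / total_a, selected_b / total_b

def impact_ratio(rate_protected, rate_comparison):
    """Ratio of selection rates. Under the EEOC's four-fifths rule of thumb,
    a ratio below 0.80 is treated as evidence of adverse impact."""
    return rate_protected / rate_comparison

def two_proportion_z(selected_a, total_a, selected_b, total_b):
    """Two-proportion z-test: is the gap in selection rates statistically
    significant, or plausibly due to chance?"""
    p_a = selected_a / total_a
    p_b = selected_b / total_b
    pooled = (selected_a + selected_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical audit: 10,000 applicants age 40+, 600 advanced by the screener;
# 10,000 applicants under 40, 1,100 advanced.
rate_40plus, rate_under40 = selection_rates(600, 10_000, 1_100, 10_000)
print(f"impact ratio: {impact_ratio(rate_40plus, rate_under40):.2f}")  # 0.55, well below 0.80
print(f"z-statistic:  {two_proportion_z(600, 10_000, 1_100, 10_000):.1f}")  # large negative: gap unlikely due to chance
```

In litigation, plaintiffs’ experts run this kind of analysis on the defendant’s actual applicant-flow data obtained in discovery; the four-fifths ratio is a screening heuristic, while significance testing addresses whether the observed gap could be random.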

Race Discrimination (Title VII) - Pending

Title VII prohibits employment discrimination based on race. Mobley alleges the algorithm discriminates against Black applicants. This claim hasn’t yet received class certification but remains in the case.

Disability Discrimination (ADA) - Pending

The ADA prohibits discrimination against individuals with disabilities. Mobley alleges his anxiety and depression, disclosed in application materials, triggered algorithmic rejection. This claim is also pending.


What This Means for Job Applicants

Are You a Class Member?

You may be eligible to join this lawsuit if:

✓ You are 40 years old or older
✓ You applied for jobs through employers using Workday’s system
✓ Your application was rejected (no interview, no offer)
✓ The rejection occurred after September 24, 2020

How to Join

Because this is a collective action, you must opt in to participate:

  1. Watch for official notice: The court will order notice to potential class members
  2. Complete the opt-in form: You’ll need to submit a consent form to join
  3. Meet the deadline: Opt-in periods are time-limited

Potential Recovery

Class members may be entitled to:

  • Back pay: Wages lost due to discriminatory rejection
  • Front pay: Compensation for future earnings impact
  • Compensatory damages: Emotional distress and other harms
  • Liquidated damages: Double back pay for willful violations
  • Attorney’s fees: Paid by the defendant if plaintiffs prevail

Note: Individual recovery depends on case outcome and class size. With potentially millions of class members, individual awards may be modest.


What This Means for Employers

Immediate Risks

Employers using Workday or similar AI screening tools face potential liability even if they didn’t build the discriminatory system:

Vicarious Liability: Employers remain liable for discrimination caused by vendors they hire.

Negligent Selection: Using an AI tool without verifying it doesn’t discriminate could constitute negligence.

Failure to Monitor: Not tracking outcomes for disparate impact may support liability.

Six Actions to Take Now

Based on guidance from employment law experts:

  1. Audit AI vendors: Request bias testing documentation and nondiscrimination warranties
  2. Maintain human oversight: Don’t rely solely on algorithmic decisions
  3. Document selection criteria: Ensure job requirements are clearly job-related
  4. Monitor disparate impact: Track outcomes by protected class
  5. Establish AI governance: Create oversight programs for hiring technology
  6. Track legal developments: This case law is rapidly evolving

What This Means for AI Vendors

The New Liability Landscape

The Mobley ruling signals that AI vendors can no longer avoid employment discrimination liability by claiming they’re “just providing tools.” If your AI makes screening decisions, you may be directly liable for discriminatory outcomes.

Risk Mitigation

AI hiring tool vendors should:

Technical:

  • Conduct regular bias audits
  • Document validation studies
  • Implement disparate impact testing
  • Create explainability features

Contractual:

  • Review indemnification provisions
  • Consider liability caps
  • Require customer cooperation in compliance

Insurance:

  • Assess E&O policy coverage for discrimination claims
  • Consider specialized AI liability coverage
  • Review exclusions for algorithmic discrimination

Related Cases and Developments

HireVue / Intuit Complaint (March 2025)

The ACLU and Public Justice filed a complaint against Intuit and HireVue on behalf of an Indigenous and Deaf woman. The case alleges HireVue’s video interview AI discriminated based on:

  • Speech patterns
  • Lack of typical vocal cues
  • Disability-related characteristics

This case reinforces that AI hiring tools face scrutiny across multiple discrimination theories.

EEOC Guidance

The EEOC has issued guidance on AI in employment decisions:

  • Employers responsible for vendor tool outcomes
  • Disparate impact analysis applies to AI
  • Reasonable accommodations required for AI assessments
  • Record-keeping requirements apply

State Laws

Several states have enacted AI hiring regulations:

| State | Law | Key Requirements |
| --- | --- | --- |
| Illinois | AI Video Interview Act (AIVIA) | Notice and consent for AI video analysis |
| Maryland | Facial Recognition Ban | Prohibits AI facial analysis without consent |
| New York City | Local Law 144 | Bias audits required for automated hiring tools |
| Colorado | AI Act | Risk assessments for high-risk AI decisions |

Frequently Asked Questions

How do I know if I was screened by Workday's AI?

Many large employers use Workday for HR management. If you applied through an online portal that asked for detailed work history, skills assessments, or used a standardized application format, it may have been a Workday system. The official class notice (when issued) will provide guidance on identifying affected applications.

I'm under 40. Can I still join the lawsuit?

Currently, only the ADEA (age discrimination) claims have been certified as a class action, covering applicants 40+. The race and disability discrimination claims remain in the case but haven’t been certified for class treatment. You may still have individual claims even if you’re under 40.

Will I have to pay to join the class action?

No. Class members typically pay nothing upfront. Attorneys work on contingency, meaning they’re paid from any recovery. If the case is unsuccessful, class members owe nothing.

I'm an employer using Workday. Am I liable?

Potentially. Even though Mobley targets Workday directly, employers using discriminatory tools remain liable for discrimination in their hiring processes. You should audit your AI vendor relationships, maintain human oversight, and monitor outcomes for disparate impact.

Does this case affect other AI hiring tools like HireVue?

Yes. The agent liability theory applies to any AI vendor making employment screening decisions. HireVue, Pymetrics, Eightfold, and similar tools face similar exposure. The July 2025 expansion to include HiredScore (another AI tool) demonstrates the broadening scope.

What happens next in the case?

The court will order official notice to potential class members, followed by an opt-in period. Discovery will continue, and the case will proceed toward trial (or settlement). Major developments typically take months to years in class actions of this scale.
