Workday AI Hiring Bias: Class and Collective Action Certified
Workday’s cloud-based hiring platform is used by companies nationwide (fisherphillips.com). A federal judge has now allowed a discrimination lawsuit (Mobley v. Workday) to proceed on behalf of a broad group of applicants (fisherphillips.com). The suit alleges that Workday’s AI-driven screening tools automatically filtered out older, Black, and disabled candidates. The plaintiff – a man over 40 with anxiety and depression – says he applied to more than 100 jobs and was rejected every single time (reuters.com) (fisherphillips.com). In its May 16, 2025 order, the court said the case can include all people in those protected groups who were denied by Workday’s system (hrdive.com) (clearinghouse.net).
In plain terms, class certification means the court agrees this can be treated as a group lawsuit: other qualified applicants who were similarly screened out can join in.
What Is Class/Collective Certification?
Class certification (and the similar “collective” certification under age-discrimination law) is a legal step where a court says many people with the same kind of claim can sue together (fisherphillips.com) (clearinghouse.net). Instead of one person’s case, the lawsuit now represents whole classes of applicants. Here, the judge granted preliminary certification of Mobley’s age claim (ADEA) as a collective action and allowed Title VII/ADA class claims for race and disability to proceed. In practice, this means Mobley can send court-approved notice to other applicants in those groups so they can opt in or out, rather than being left out of the fight (dwt.com) (hrdive.com). Class actions can bring more visibility and resources to a case, and they signal the court sees a common pattern, not just an isolated incident.
Who Is Included?
The certified classes cover all job seekers in the protected groups who used Workday’s system and were turned down. In particular:
Applicants 40 and older: All individuals age 40+ who applied through Workday’s job portal and received a rejection instead of a hiring recommendation (hrdive.com).
Applicants with disabilities: Job-seekers with disabilities (physical or mental) who got the same automated rejections (clearinghouse.net) (hrdive.com).
Each of these groups must be similarly situated: for example, Mobley’s complaint says he took Workday’s online assessments, and his “depression and anxiety” likely affected the scores he received (hrdive.com). Other applicants in those protected groups can now come forward, because the court agreed they face a common question: did Workday’s AI system disproportionately screen them out?
How Workday’s AI Hiring System Worked (Allegedly)
Workday provides employers with cloud software that automates recruitment. Companies upload job postings and resumes into Workday’s system, which uses AI/ML to parse and rank candidates. As the court noted, Workday’s tools “reduce time to hire by automatically dispositioning or moving candidates forward” (caselaw.findlaw.com). In practice, this can mean:
Automated Assessments: After you submit your resume, you may be asked to complete online tests or personality assessments. Workday’s AI scores these results along with resume data and ranks you against an employer’s preferences (dwt.com) (fisherphillips.com). Many employers rely on this to quickly sift applicants.
Algorithmic Screening: The AI then “determines whether an employer should accept or reject an application” (caselaw.findlaw.com). According to Mobley’s complaint, this screening is built on historical hiring data, so the model can “reflect employer biases and rely on biased training data” (caselaw.findlaw.com) (aeriempowered.org). Advocacy analyses likewise contend that Workday’s model tends to favor younger, white male applicants (aeriempowered.org).
Quick, Automated Rejections: If a candidate’s score isn’t high enough, Workday automatically generates a rejection. Class plaintiffs reported receiving hundreds of rejection emails within hours of applying (often late at night), with no interview – a clear sign the rejections were machine-generated (hrdive.com). As the court observed, you can only advance in the process if you get past Workday’s screening algorithm (caselaw.findlaw.com).
The applications an employer actually sees are only those the system has ranked and scored highly enough to surface.
In short, instead of a human reviewing every resume, Workday’s AI made an initial cut. Plaintiffs claim this automated cut systematically screened out older, Black, and disabled candidates. The court’s order noted this may be a company-wide policy applied uniformly to all applicants (dwt.com) (caselaw.findlaw.com), so one applicant’s experience can be treated as “similarly situated” to another’s under the law.
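In rough terms, the screening step described above is a score-and-threshold pipeline: combine model scores, compare against a cutoff, and auto-reject anyone below it. The snippet below is a purely hypothetical illustration of that pattern – all field names, weights, and the 0.7 cutoff are invented for this sketch and are not Workday’s actual logic:

```python
# Hypothetical sketch of an automated screening pipeline of the kind
# alleged in the complaint. Every name, weight, and threshold here is
# invented for illustration; this is NOT Workday's actual code.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    resume_score: float      # model-assigned fit score from the resume
    assessment_score: float  # score from online tests/assessments

def disposition(app: Application, threshold: float = 0.7) -> str:
    """Combine scores and auto-reject anyone below the cutoff --
    the 'automatically dispositioning' step the court described."""
    combined = 0.6 * app.resume_score + 0.4 * app.assessment_score
    # Only candidates above the threshold are ever surfaced to a human.
    return "advance" if combined >= threshold else "auto-reject"

apps = [
    Application("A", 0.9, 0.8),   # combined 0.86 -> advances
    Application("B", 0.5, 0.6),   # combined 0.54 -> rejected, never seen
]
results = {a.name: disposition(a) for a in apps}
print(results)  # {'A': 'advance', 'B': 'auto-reject'}
```

The key point the sketch makes concrete: candidate B is rejected by arithmetic alone, before any human review, which is why plaintiffs argue the threshold step itself can carry bias from the underlying scores.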
Why This Ruling Matters
Acknowledge Systemic Harm: By certifying a class/collective, the court is recognizing that this isn’t just one unlucky candidate – it could affect thousands of people or more. The judge even noted that if the group numbers in the “hundreds of millions,” that is only because Workday is accused of broad, systemic bias (hrdive.com). In other words, the sheer size of the class underscores how widespread the problem might be.
Accountability for AI: This decision sends a signal to employers and tech vendors that they can be held liable for biased AI hiring tools – even if any discrimination was unintentional (fisherphillips.com). As one expert wrote, AI systems can’t be a “get out of jail free” card if they disproportionately hurt protected groups. The U.S. Equal Employment Opportunity Commission (EEOC) has already warned that companies must monitor algorithms to prevent unlawful disparate impact (reuters.com). Now a court is showing it will enforce that principle.
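The kind of disparate-impact monitoring the EEOC describes is commonly operationalized with the agency’s “four-fifths rule”: if one group’s selection rate is less than 80% of the most-favored group’s rate, the tool is flagged for possible adverse impact. A minimal sketch of that check, using invented pass-through counts:

```python
# Illustrative four-fifths-rule check for adverse impact. The applicant
# and selection counts below are invented example data, not case evidence.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants a screen allowed to advance."""
    return selected / applicants

def adverse_impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of a group's selection rate to the most-favored group's rate.
    Under the four-fifths rule, a ratio below 0.8 flags possible
    disparate impact."""
    return group_rate / reference_rate

# Hypothetical pass-through rates of an automated screen:
under_40_rate = selection_rate(300, 1000)   # 30% advance
over_40_rate = selection_rate(120, 1000)    # 12% advance

ratio = adverse_impact_ratio(over_40_rate, under_40_rate)
print(f"impact ratio: {ratio:.2f}")  # 0.40 -- well below the 0.8 flag
```

In this invented example, older applicants advance at only 40% of the younger group’s rate, so the screen would be flagged for review. Running exactly this kind of audit on real outcome data is what regulators mean by “monitoring algorithms.”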
Setting a Legal Precedent: Mobley v. Workday is one of the first of its kind. As Reuters put it, this is “the first proposed class action to challenge the use of AI screening software,” and it could set “important precedent” for the future (reuters.com). A decision here could influence many tech vendors and employers to audit their AI tools and adopt stronger bias protections.
Empowering Victims: Treating this as a group lawsuit means plaintiffs share resources and evidence (like expert audits of the algorithm) that one person alone might not have. It also makes a public statement: these experiences aren’t imagined or isolated — courts are beginning to see the pattern. For jobseekers, it means you are not alone. If you’ve felt unfairly screened out, this case is recognizing your experience as part of a broader issue.
Looking Ahead: Implications and Actions
The case will now move into discovery: the parties will gather evidence on how Workday’s tools actually work and what impact they have. The court has even approved notifying thousands of similar applicants so they can opt in to the lawsuit (dwt.com). If bias is proven, remedies could range from changes to the AI system (to make it fair) to monetary damages for those harmed.
Beyond this lawsuit, expect more scrutiny of AI in hiring. Regulators and lawmakers are watching: some states are considering strict new laws for algorithmic fairness, and the EEOC is revising its rules. The Workday case adds momentum to calls for transparency and oversight of automated hiring tools.
What You Can Do: Stay informed and speak up. AERI and other advocacy groups encourage job-seekers who’ve faced AI screening to share their stories. Your experiences can help build a record of why fair hiring practices matter. If you think you fit the class (for example, if you are over 40, or have a disability and got a quick automated rejection from a Workday-powered system), consider reaching out to legal aid or employment rights organizations for guidance. By connecting with advocacy networks, you can help push for better policies (such as algorithm audits or stronger anti-bias laws) that benefit everyone.
Remember: companies cannot hide behind code to escape civil rights laws. The recent court certification is a hopeful sign that victims of automated hiring bias will be heard and, with persistence and community support, can win broader justice. Stay alert for updates, share information with others in your situation, and lean on advocates (like AERI) who are fighting for fair AI in employment (aeriempowered.org). Together, we can help ensure technology works for all workers, not just a privileged few.
Sources: Court and news reports, including Judge Lin’s order and Mobley’s complaint (caselaw.findlaw.com) (hrdive.com), as well as expert analyses (dwt.com) (reuters.com), explain how Workday’s AI tools work and why this class action matters (fisherphillips.com) (aeriempowered.org) (hrdive.com).
All cited materials are linked above for reference.