Workday boasts of 625 billion data transactions and datasets. Is it possible to know what is in them?
I performed a simple back-of-the-envelope calculation:
Manually reviewing 625 billion datasets would take a staggering amount of time. Using a rough estimate of 1 minute per dataset, the calculation breaks down as follows (a short script checking the arithmetic appears after the list):
1. Total Minutes: 625 billion minutes.
2. Convert to Years:
- 625 billion minutes ÷ 60 (to get hours) = approximately 10.42 billion hours.
- 10.42 billion hours ÷ 24 (to get days) = about 434 million days.
- 434 million days ÷ 365 (to get years) = approximately 1.19 million years.
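The figures above are easy to verify. Here is a minimal Python sketch of the same arithmetic; the one-minute-per-dataset figure is my own assumption, not anything Workday has published:

```python
# Back-of-the-envelope check: how long would a manual review of
# 625 billion datasets take at 1 minute per dataset?
DATASETS = 625_000_000_000        # 625 billion (Workday's claimed figure)
MINUTES_PER_DATASET = 1           # assumed review time per dataset

total_minutes = DATASETS * MINUTES_PER_DATASET
hours = total_minutes / 60        # ~10.42 billion hours
days = hours / 24                 # ~434 million days
years = days / 365                # ~1.19 million years

print(f"{hours:,.0f} hours | {days:,.0f} days | {years:,.0f} years")
```

Running it confirms the numbers: roughly 10.42 billion hours, 434 million days, and 1.19 million years.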
This is, of course, a simplified calculation and doesn't consider factors such as:
- Team Size: A large team of reviewers could drastically cut the time. For example, with 1,000 reviewers working full time, continuously and without breaks, the job would still take about 1,190 years.
- Automation: Utilizing automated tools for initial filtering or processing could significantly decrease the manual review workload.
- Efficiency Improvements: Implementing efficient processes, training, and best practices can also reduce the time required.
Overall, manually reviewing 625 billion datasets is a monumental task that would likely require a combination of automation, a large team, and efficient processes to make it feasible within a reasonable timeframe.
Short answer: Workday has no clue what's in those datasets that are used to measure, rank, score, and profile candidates.
04/07/2025