Data Brokerage
AI screening systems are now used by over 75% of Fortune 500 companies, and an estimated 83% of résumés are filtered out before a human ever reviews them. Shadow profiling affects an estimated 85 million job seekers annually. Nor is AI screening limited to senior roles: entry-level positions increasingly rely on it as well, raising concerns about demographic bias and the perpetuation of historical hiring inequities. By 2025, more than 90% of large employers are expected to use some form of AI evaluation[1],[2]. In short, whatever efficiency AI brings to recruitment, it carries serious bias and fairness challenges.
Digital Dossiers
Shadow profiles are unauthorized collections of candidate data harvested from social media, public records, and browsing history. They exist parallel to official applications and include information candidates never provided. AI recruitment tools compile these profiles from 50+ digital touchpoints. These dossiers may contain everything from your shopping habits and financial history to personal relationship data and political leanings. Unlike traditional background checks that require consent, these shadow profiles are assembled covertly through data scraping and cross-platform tracking technologies.
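The aggregation described above — linking records from many sources into one dossier — can be sketched as a simple identity-resolution join. This is a hypothetical illustration keyed on a shared email address; real brokers match on many more signals (device IDs, cookies, name-plus-address fuzzing):

```python
from collections import defaultdict

def build_shadow_profiles(sources):
    """Merge records from multiple data sources into per-person dossiers.

    Each source is a list of dicts; records are linked on the 'email'
    key, a stand-in for the identifiers real brokers match on.
    """
    profiles = defaultdict(dict)
    for source_name, records in sources.items():
        for record in records:
            key = record.get("email")
            if key is None:
                continue  # unlinkable record: skipped in this sketch
            # Namespace each field by its source so provenance survives
            for field, value in record.items():
                if field != "email":
                    profiles[key][f"{source_name}.{field}"] = value
    return dict(profiles)

sources = {
    "job_portal": [{"email": "a@example.com", "resume": "10y sales"}],
    "social": [{"email": "a@example.com", "posts_per_week": 12}],
    "public_records": [{"email": "a@example.com", "county": "Kings"}],
}
dossier = build_shadow_profiles(sources)["a@example.com"]
```

Note that the candidate supplied only the résumé; the other two fields were linked to them without consent, which is exactly the covert assembly the paragraph above describes.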
What's Included in Shadow Profiles?
Most shadow profiles combine explicit (directly provided), implicit (inferred from behavior), and predictive (AI-generated) data.
Basic Candidate Information (Explicit)
Full Name (including previous names & nicknames)
Contact Details (email, phone, address, work location)
Résumé Details (education, degrees, past jobs)
Social Media Links (LinkedIn, Twitter, GitHub, Facebook)
Professional & Work-Related Data (Implicit)
Skills & Experience – Pulled from LinkedIn, job applications, certifications
Job-Hopping Patterns – How often you change jobs
Salary History & Expectations – Estimated based on industry trends, past applications
Endorsements & References – From LinkedIn, review platforms, past applications
Project Contributions – GitHub, ResearchGate, personal blogs
Public Speaking & Conference History – Talks, events, and panels
Behavioral Data (AI-Inferred)
Communication Style – How you write emails, job applications, or social media posts
Personality Analysis – Based on language, job preferences, and online activity
Work Ethic Predictions – AI-generated scores from activity patterns
Cultural Fit Assessment – Compared to company values and past hires
⚠️ Risk: AI predictions may lead to biased hiring decisions if they incorrectly label candidates.
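The three tiers listed above — explicit, implicit, and predictive — can be modeled as one layered record. This is a hypothetical schema for illustration, not any vendor's actual format:

```python
from dataclasses import dataclass, field

@dataclass
class ShadowProfile:
    # Explicit: data the candidate actually provided
    explicit: dict = field(default_factory=dict)
    # Implicit: facts inferred from behavior (job-hopping, endorsements)
    implicit: dict = field(default_factory=dict)
    # Predictive: AI-generated scores (work ethic, cultural fit)
    predictive: dict = field(default_factory=dict)

    def consented_fields(self):
        """Only the explicit tier was knowingly shared by the candidate."""
        return set(self.explicit)

profile = ShadowProfile(
    explicit={"name": "J. Doe", "degree": "BSc"},
    implicit={"job_hops_5y": 3},
    predictive={"cultural_fit": 0.41},
)
```

In this toy record, only two of the five stored fields were actually provided by the candidate; the rest were inferred or generated about them.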
AI Attempts and Failures to Detect Human Emotions
AI cannot detect human emotions, and it cannot predict your future success; yet for more than five years this technology has operated largely unregulated while its use has climbed steadily. The talent acquisition industry leans on AI, machine learning, natural language processing, and biometric data, claiming these tools can build a skills-based workforce and predict a candidate's probability of success. For candidates wrongly screened out, the consequences can be severe: once-successful people left in long-term unemployment and financial ruin.
Risk & Compliance Data
Background Checks – Criminal records, driving history, financial standing
Credit Score & Debt History – Especially for finance-related jobs
Legal Disputes – If publicly available (e.g., lawsuits, business disputes)
Past Employer Feedback – Some firms collect anonymous employer reviews
⚠️ Risk: Errors in credit or criminal records may disqualify candidates unfairly.
AI-Generated Hiring Predictions
Likelihood of Quitting Early – Based on job history & industry trends
Leadership Potential – Inferred from past roles & social media presence
Remote vs. Office Suitability – Based on past job settings & work style
Personality Fit for a Role – AI comparison with similar job applicants
⚠️ Risk: Candidates may be rejected before they even apply based on flawed AI predictions.
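The rejection risk above has a simple mechanism: a "fit" score learned from past hires reproduces whatever the past hires happened to look like. A toy frequency-based scorer (entirely hypothetical, but the same pattern underlies many similarity-to-workforce metrics) makes this visible:

```python
from collections import Counter

def fit_score(candidate_school, past_hires):
    """Score 'cultural fit' as the share of past hires from the same school.

    This is the naive pattern behind many fit metrics: similarity to
    the historical workforce, not ability to do the job.
    """
    counts = Counter(past_hires)
    return counts[candidate_school] / len(past_hires)

# Historical hiring skewed heavily toward one source...
past_hires = ["State U"] * 9 + ["Community College"] * 1
# ...so an equally qualified candidate from elsewhere scores near zero.
insider = fit_score("State U", past_hires)             # 0.9
outsider = fit_score("Community College", past_hires)  # 0.1
```

Nothing in the scorer measures competence; it only measures resemblance to the past, which is how historical inequities become automated rejections.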
Cited Research on Shadow Profiles & Data Brokers
Chen, M.S. (2019) – "China’s Data Collection on U.S. Citizens: Implications, Risks, and Solutions"
Source: Journal of Science Policy & Governance
📄 Full Paper (PDF)
Key Findings:
China uses extensive data collection on U.S. citizens through data brokers, job portals, and social media scraping.
Recruitment platforms and AI-driven hiring models are exploited to track employment trends and create unofficial shadow profiles.
Calls for tighter cybersecurity laws and better consumer data protection.
⚠️ Risks Identified:
🔹 Sensitive employment data can be used for corporate espionage or identity theft.
🔹 Job seekers unknowingly opt in to data tracking by using platforms like LinkedIn and Indeed.
Naous et al. (2019) – "Information Disclosure in Location-based Services"
Source: ICIS Conference Proceedings
📄 Full Paper (PDF)
Key Findings:
Recruiters and employers use location data (from apps, check-ins, and GPS) to track potential job candidates.
Candidates unknowingly expose location data when using career-focused apps or attending networking events.
AI models use past location patterns to predict job-hopping risks and cultural fit.
⚠️ Risks Identified:
🔹 Candidates can be screened out based on their commute distance or frequent travel.
🔹 AI hiring tools may misinterpret location history, leading to biased hiring decisions.
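The commute-distance screening described by Naous et al. requires nothing more than two coordinates and a cutoff. A minimal sketch using the great-circle (haversine) distance; the 50 km threshold and the example coordinates are arbitrary assumptions for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km: mean Earth radius

def passes_commute_filter(candidate_home, office, max_km=50):
    """How a leaked home location becomes a silent rejection."""
    return haversine_km(*candidate_home, *office) <= max_km

office = (40.7128, -74.0060)  # New York City
nearby = (40.6782, -73.9442)  # Brooklyn, a few km away
far = (42.3601, -71.0589)     # Boston, roughly 300 km away
```

A candidate never sees this filter run; their location history, exposed through apps or check-ins, decides the outcome before any human review.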
Kuempel, A. (2016) – "The Invisible Middlemen: A Critique of the Data Broker Industry"
Source: Northwestern Journal of International Law & Business, 36
📄 Full Paper (PDF)
Key Findings:
Data brokers collect employment history, social media interactions, and financial records.
Many job applicants have shadow profiles without knowing—aggregated from third-party sources like credit reports and online résumés.
Employers purchase candidate profiles from brokers like Acxiom, Experian, and Equifax.
⚠️ Risks Identified:
🔹 Incorrect or outdated data can harm job prospects.
🔹 No transparency laws exist to help job seekers correct or remove shadow profiles.
Price et al. (2019) – "Shadow Health Records Meet New Data Privacy Laws"
Source: Science, 366(6464), 46-49
📄 Full Paper (PDF)
Key Findings:
Healthcare data is being merged with employment history in recruitment AI systems.
AI-driven hiring tools may use health indicators to predict absenteeism and job longevity.
Companies are using shadow health profiles for pre-employment risk assessments.
⚠️ Risks Identified:
🔹 Candidates could face discrimination if an employer suspects a health issue (even if it's inaccurate).
🔹 HIPAA protections do not cover hiring decisions, leaving candidates vulnerable.
Solove, D.J. (2000) – "Privacy and Power: Computer Databases and Information Privacy"
Source: Stanford Law Review, 53
📄 Full Paper (PDF)
Key Findings:
Shadow profiling has existed for decades, but modern AI has made it more powerful.
Recruitment platforms create automated hiring scores based on data aggregation.
AI hiring systems may use metadata (job titles, résumés, and search history) to predict job fit.
⚠️ Risks Identified:
🔹 Mass surveillance in recruitment leads to unfair hiring advantages for large corporations.
🔹 Job candidates have no control over how shadow profiles are created or used.
Molitorisz et al. (2021) – "From Shadow Profiles to Contact Tracing"
Source: Law, Technology & Policy Journal
📄 Full Paper (PDF)
Key Findings:
Job applicants are automatically enrolled in data tracking when they accept online terms of service.
Contact tracing technologies from COVID-19 were later repurposed for employee surveillance.
Behavioral data (attendance, breaks, performance metrics) are added to shadow profiles for long-term tracking.
⚠️ Risks Identified:
🔹 Workers' off-duty activities are monitored without consent.
🔹 AI hiring tools can unfairly judge candidates based on behavioral tracking.
Lageson, S.E. (2020) – "Digital Punishment: Privacy, Stigma, and Data-driven Hiring"
Source: Cambridge University Press
📄 Full Paper (Google Books)
Key Findings:
AI hiring systems integrate legal history (e.g., minor infractions, dismissed cases) into employment background checks.
People with expunged records still appear in hiring databases due to shadow profiling.
Recruiters often buy third-party background reports instead of checking public records.
⚠️ Risks Identified:
🔹 Expunged legal records still show up in data broker hiring profiles.
🔹 AI may flag candidates unfairly based on old social media activity.
Legal Request Template to Access Your Shadow Profile
If you suspect that recruiters or data brokers have compiled a shadow profile on you, you can legally request access to your personal data under GDPR (Europe), CCPA (California), or general privacy laws.
This template is designed to:
· Request a copy of all data held about you
· Ask how your data was obtained and shared
· Request corrections or deletion
Where to Send This Request
🔹 For Job Portals & Recruitment Platforms:
LinkedIn: [email protected]
Indeed: [email protected]
Glassdoor: [email protected]
🔹 For Major Data Brokers:
Acxiom: [email protected]
Experian: [email protected]
LexisNexis: [email protected]
🔹 For AI Hiring & Video Interview Platforms:
HireVue: [email protected]
Pymetrics: [email protected]
Subject: Request for Access to Personal Data Under [GDPR/CCPA]
Dear [Company Name] Data Protection Officer,
I am formally requesting access to all personal data your company has collected, processed, or stored about me under the [General Data Protection Regulation (GDPR) Article 15 / California Consumer Privacy Act (CCPA)].
Full Name: [Your Name]
Email Address(es): [Your Email(s)]
Phone Number: [Your Contact Number]
Date of Birth (if applicable): [Your DOB]
LinkedIn/Profile Links (if applicable): [Your Profile Links]
I request that you provide the following information:
· A full copy of my personal data held in your system, including:
o Résumé, employment history, salary estimates
o Social media insights, behavioral analysis, recruitment scores
o Any predictive hiring or AI-based assessments about me
· How and when you obtained my data, including:
o Third-party data sources (data brokers, public records)
o AI-generated insights used in my candidate profile
· A list of any third parties that have received or purchased my data
· Options to correct, delete, or opt out of further data collection
Under GDPR Article 12 you are required to respond within one month, and under the CCPA within 45 days, and to provide this information free of charge.
If you fail to comply, I will escalate this matter to the [Data Protection Authority / Federal Trade Commission].
Thank you for your cooperation. I look forward to your response.
Best regards,
[Your Name]
[Your Email]
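The letter above is boilerplate aside from a handful of fields, so filling it in for each recipient lends itself to a short script. A minimal sketch using Python's string.Template; the body is abbreviated here, and you should paste in the full letter text and verify each recipient address before sending:

```python
from string import Template

# Abbreviated letter body; replace with the full template text above.
LETTER = Template("""Subject: Request for Access to Personal Data Under $law

Dear $company Data Protection Officer,

I am formally requesting access to all personal data your company has
collected, processed, or stored about me under the $law_full.

Full Name: $name
Email Address(es): $email

Best regards,
$name
$email""")

def draft_request(company, name, email, regime="GDPR"):
    """Fill the template for one company under one privacy regime."""
    laws = {
        "GDPR": "General Data Protection Regulation (GDPR) Article 15",
        "CCPA": "California Consumer Privacy Act (CCPA)",
    }
    return LETTER.substitute(
        law=regime, law_full=laws[regime],
        company=company, name=name, email=email,
    )

letter = draft_request("Acxiom", "Jane Doe", "jane@example.com")
```

Running draft_request once per broker and platform on the list above produces a ready-to-send letter for each; Template.substitute will raise an error if any placeholder is left unfilled, which guards against sending an incomplete request.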
[1] (2024). VirtualHR: AI-Driven Automation for Efficient and Unbiased Candidate Recruitment in Software Engineering Roles. International Research Journal of Modernization in Engineering Technology and Science.
[2] Rahman, S. M., Hossain, M. A., Miah, M. S., Alom, M., & Islam, M. (2025). Artificial Intelligence (AI) in Revolutionizing Sustainable Recruitment: A Framework for Inclusivity and Efficiency. International Research Journal of Multidisciplinary Scope.