International Journal of Modern Engineering and Management | IJMEM
Multidisciplinary
Open Access Journal
ISSN No: 3048-8230
Follows UGC–CARE Guidelines

Algorithmic Bias in AI-Driven Recruitment Systems in India

Author(s):

Namrata Singh Bhattacharya, Rajeev Krishnaswamy

Affiliation: Department of Labour Studies & Social Work, Banaras Hindu University, Varanasi, Uttar Pradesh, India; Department of Management Studies, Mahatma Gandhi Kashi Vidyapith, Varanasi, Uttar Pradesh, India

Page No: 22–25

Volume, Issue & Publishing Year: Volume 3, Issue 3, 2026/03/08

Journal: International Journal of Modern Engineering and Management | IJMEM

ISSN No: 3048-8230

DOI:

Abstract:

Artificial intelligence-based applicant tracking systems (ATS) and automated resume-screening tools have achieved widespread adoption in Indian corporate recruitment, with estimates suggesting that over 73% of large Indian enterprises and 56% of mid-sized firms now use some form of AI-assisted resume screening for entry- to mid-level positions. These systems promise to eliminate human cognitive biases (halo effects, affinity bias, similar-to-me bias) from resume shortlisting decisions and to improve screening efficiency in high-volume recruitment contexts. However, algorithmic systems trained on historical hiring data inherit, and may systematically amplify, the structural biases embedded in those historical patterns, with potentially serious implications for equitable access to employment across India's intersecting axes of caste, gender, and educational-institution prestige.


This explanatory sequential mixed-methods study employs a resume audit experiment (N = 2,400 fictitious resumes submitted to four commercial AI recruitment systems across four job categories) to quantify shortlisting-rate differentials across four demographic signals (gender-coded names, caste-coded surnames, educational institution tier, and residential address urbanicity), followed by in-depth interviews with 32 HR professionals and an algorithm audit of 8 AI vendors' documentation packages to explain the mechanisms underlying the observed biases. The institution-tier signal produces the largest shortlisting disadvantage (−21.3 percentage points for Tier-3 vs. Tier-1 institutions, pooled across AI systems, p < 0.001), followed by the caste signal (−12.7 pp for SC/ST-coded surnames vs. general-category surnames, p < 0.001), the gender signal (−8.4 pp for female-coded names, p < 0.001), and address urbanicity (−6.8 pp for rural addresses, p = 0.003). All four AI systems exhibit statistically significant bias on at least three of the four demographic signals.
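The percentage-point gaps reported above are differences in shortlisting rates between a disadvantaged group and a reference group, conventionally tested with a two-proportion z-test. A minimal sketch in Python of that calculation, using hypothetical counts chosen only to mirror the reported −21.3 pp Tier-3 vs. Tier-1 gap (the study's raw counts are not given in this abstract):

```python
import math

def shortlist_gap(x_ref, n_ref, x_grp, n_grp):
    """Shortlisting gap in percentage points (group minus reference)
    and a two-sided, two-proportion z-test p-value."""
    p_ref, p_grp = x_ref / n_ref, x_grp / n_grp
    diff_pp = (p_grp - p_ref) * 100.0
    # Pooled proportion under the null of equal shortlisting rates
    p_pool = (x_ref + x_grp) / (n_ref + n_grp)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_ref + 1 / n_grp))
    z = (p_grp - p_ref) / se
    # Two-sided p-value: 2 * (1 - Phi(|z|)) = erfc(|z| / sqrt(2))
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return diff_pp, z, p_value

# Hypothetical counts: 150/300 Tier-1 resumes shortlisted vs. 86/300 Tier-3
diff, z, p = shortlist_gap(150, 300, 86, 300)
print(f"gap = {diff:.1f} pp, z = {z:.2f}, p = {p:.1g}")
```

The same function applies to any of the four signals; only the shortlist counts per group change.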

Keywords:

algorithmic bias, AI recruitment, resume audit, applicant tracking system, gender discrimination, caste discrimination, educational institution prestige, India, mixed methods, human resource management, fairness, digital discrimination, ATS, structural inequality

