Algorithmic Racial Bias in Automated Video Interviews
Facial recognition software tends to be less accurate for—or biased against—Black and African American faces, according to research funded by the Society for Industrial/Organizational Psychology Foundation. Such bias could affect automated interview scores, creating systematic disadvantages for racial minorities if organizations adopt these algorithms.