(p. A23) The products of a company called HireVue, which are used by over 600 companies including Nike, Unilever and even Atlanta Public Schools, allow employers to interview job applicants on camera, using A.I. to rate videos of each candidate according to verbal and nonverbal cues. The company’s aim is to reduce bias in hiring.
But there’s a catch: The system’s ratings, according to a Business Insider reporter who tested the software and discussed the results with HireVue’s chief technology officer, reflect the previous preferences of hiring managers. So if more white males with generally homogeneous mannerisms have been hired in the past, it’s possible that algorithms will be trained to favorably rate predominantly fair-skinned, male candidates while penalizing women and people of color who do not exhibit the same verbal and nonverbal cues.
For the full story, see:
Joy Buolamwini. “The Hidden Dangers of Facial Analysis.” The New York Times (Friday, June 22, 2018): A23.
(Note: the online version of the story has the date June 21, 2018, and the title “When the Robot Doesn’t See Dark Skin.”)