Hiring Algorithms Are Not Neutral


By Gideon Mann & Cathy O'Neil
Monday, December 19, 2016 - 3:30pm

CAMPAIGN: Bloomberg: Global Diversity & Inclusion

CONTENT: Article

Originally posted on Harvard Business Review.

Gideon Mann is head of data science at Bloomberg LP.

More and more, human resources managers rely on data-driven algorithms to help with hiring decisions and to navigate a vast pool of potential job candidates. These software systems can be so efficient at screening resumes and evaluating personality tests that, in some cases, 72% of resumes are weeded out before a human ever sees them. But there are drawbacks to this level of efficiency. Algorithms are designed by people, and they are fallible; they may inadvertently reinforce discrimination in hiring practices. Any HR manager using such a system needs to be aware of its limitations and have a plan for dealing with them.
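One common safeguard is to audit a screen's outcomes rather than trust its design. The sketch below is a hypothetical illustration in Python, not any vendor's actual system: it compares each group's pass rate to the best-performing group's, following the EEOC's "four-fifths" guideline, under which a ratio below 0.8 is a signal to review the screen. The group labels and counts are invented for the example.

from collections import defaultdict

def adverse_impact_ratios(outcomes):
    """Compare each group's selection rate to the highest group's rate.
    Under the EEOC four-fifths guideline, a ratio below 0.8 is a warning
    sign that the automated screen may have adverse impact."""
    passed = defaultdict(int)
    total = defaultdict(int)
    for group, screened_in in outcomes:
        total[group] += 1
        if screened_in:
            passed[group] += 1
    rates = {g: passed[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical screening outcomes: (group label, passed the automated screen?)
outcomes = ([("A", True)] * 60 + [("A", False)] * 40 +
            [("B", True)] * 35 + [("B", False)] * 65)

for group, ratio in adverse_impact_ratios(outcomes).items():
    flag = "  <- below 0.8, review the screen" if ratio < 0.8 else ""
    print(f"group {group}: impact ratio {ratio:.2f}{flag}")

On these made-up numbers the audit flags group B (ratio of roughly 0.58), which is the kind of disparity a human reviewer should investigate before relying on the screen.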

Click here to read more.

Keywords: Diversity & Inclusion | Author | Careers | Education | Gender | Race | algorithms | data science | demographics | diversity
