Fay Cobb Payton Addresses Bias in AI Recruiting Tools
After Amazon's artificial intelligence (AI) hiring tool failed two years ago to evaluate candidates without gender bias, many experts are taking a closer look at how hiring software can perpetuate existing bias. Fay Cobb Payton, professor of information technology and business analytics in Poole College, has examined how each phase of the hiring software process should be monitored and double-checked by humans.
“I believe that human-in-the-loop should not be at the end of the recommendation that the algorithms suggest,” said Payton. “Human-in-the-loop means in the full process of the loop from design to hire, all the way until the experience inside of the organization.”
Read more on IEEE Spectrum.
This article was originally published on the Poole College of Management News website.