How to Be Careful With AI-Enhanced Recruiting Software

June 30, 2022
By: Bill Yates

The keynote presentation delivered on Tuesday afternoon at Future of Work Expo 2022 featured a speaker with a different perspective – the government’s.

In “The Promise and Perils of AI in Employment Decision-Making,” lawyer Keith Sonderling discussed the government’s perspective on using AI-powered software to enhance your recruiting efforts.

Sonderling is a commissioner with the Equal Employment Opportunity Commission. “Just call it the EEOC,” he said.

The EEOC, an independent agency of the federal government, was established by the Civil Rights Act of 1964. The agency settles complaints and has the power to file civil suits against companies. You don’t want it on your case.

Sonderling talked about the potential for AI-influenced software to help companies reduce discrimination in hiring. Companies spent $17 billion on recruiting software in 2021, with tech companies tripling their spend, he said.

“Virtual recruiting is here to stay,” he said. “AI has the potential to remove the human from human resources.”

The first thing to know about AI-influenced recruiting software is that the user is responsible for the actions of the software. “There’s no defense saying the computer did it,” Sonderling said. “You’re responsible.”

As with all AI software, the results are only as good as the programming. Properly conceived recruiting software can “eliminate potential discrimination at the screening phase,” Sonderling said.

AI-based recruiting software is designed to eliminate personal biases. If poorly programmed, though, AI-influenced recruiting software can produce “far greater damage than any one individual could do.”

So the stakes are high. Poorly programmed software can cause harm in ways both intentional and unintentional, and the unintentional harms are harder to spot.

Sonderling described a case where a company used AI software to recruit applicants from only one zip code. The stated intention was to build a pool of employees who lived nearby and could easily commute.

The result was discriminatory because people who live in the same zip code tend to have similar backgrounds. In cases such as this, “AI can prevent you from ever seeing applicants,” he said.

Exclusionary practices represent some of the most difficult violations to spot, Sonderling said. But larger companies must rely upon AI-based recruiting software simply to process the volume of applications they receive.

For example, Sonderling noted that Walmart got more than 200,000 applications for a recent position it advertised. “How do you review all those applications?” he asked.

One option companies have employed is conducting initial interviews using AI-based software. An electronic voice questions the applicant and records the answers. Using AI-influenced software lets you “eliminate a lot of significant biases” such as name recognition and visual race identification, Sonderling noted.


Edited by Erik Linask