By Anthony Kaylin, courtesy of SBAM Approved Partner ASE
Artificial intelligence (AI) has its issues, but in the long run it does assist employers in today’s job marketplace. The technology is still in its early stages, yet ten years from now AI will likely be the expected form of communication between job applicants and employers. With so many resumes submitted for every opening, sifting through them in a timely manner while short-handed, and still maintaining a positive applicant experience, is a quagmire most recruiting organizations try to avoid.
A new tool being used by employers today is similar to Tinder, or, as one writer puts it, a corporate Tinder. These corporate Tinders conduct an automated video interview and guide applicants through a conversation with their computer screen. As one writer explains, “the applicant stares at the webcam distortion of their face (instructed to emote normally like they would if speaking with an actual person), tries to explain why they want the job and then once more sends the information back into the abyss, often without being able to review their video first. The software will then produce a report and likely a ranking that will be used to determine if they get an interview with an actual person.”
Many recruiters like the automated video interviewing process because it helps them quickly narrow the applicant pool. And whether true or not, many suppliers of these programs tell recruiting groups that their approach is scientific, with less bias and fewer errors than the human touch. A number of companies provide these services, including HireVue, Modern Hire, Spark Hire, myInterview, Humanly.io, Willo, and Curious Thing. Although flawed, this approach will likely be the future. For example, HireVue announced in March 2021 that its platform had hosted more than 20 million video interviews since its inception.
Why is it flawed? First and foremost, what algorithm is controlling the process? As various studies have shown, how the algorithm is written, and by whom, could lead to discrimination claims. Amazon shut down one such project after testing in 2015 found that only males were making it through the screening process. In other words, Amazon realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way. Who wrote the program? Men. What were the comparative factors for setting the guideposts for selection? Male resumes (which dominated the software developer field).
Machine-learning tools generally are not audited or regulated, and they are known to recreate or amplify existing biases. Therefore, an HR group that does not do its due diligence is knowingly taking on risk, which could lead to punitive damages in a lawsuit.
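To make the due-diligence point concrete, here is a minimal sketch, using hypothetical group names and pass counts, of the kind of adverse-impact check an HR group could run on an AI screening tool’s outcomes. It applies the familiar four-fifths rule of thumb and is only a first-pass indicator, not a legal analysis.

    # Hypothetical adverse-impact (four-fifths rule) check on AI screening pass rates.
    # Group names and counts below are illustrative only, not real data.
    screening_results = {
        "group_a": (400, 120),  # (applicants screened, applicants passed to a human interview)
        "group_b": (400, 72),
    }

    # Selection rate = passed / screened for each group.
    selection_rates = {
        group: passed / screened
        for group, (screened, passed) in screening_results.items()
    }

    highest_rate = max(selection_rates.values())
    for group, rate in selection_rates.items():
        impact_ratio = rate / highest_rate
        flag = "review" if impact_ratio < 0.8 else "ok"  # four-fifths rule of thumb
        print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")

Running a check like this periodically, and keeping the results, is one simple way to document that the tool’s outcomes are being monitored rather than taken on faith.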
Another issue was pointed out in EEOC and U.S. Department of Justice guidance on the use of AI in recruitment, especially video evaluation: applicants may have disabilities or other characteristics that can skew video evaluations. HireVue no longer uses any facial analysis in its tool, following a complaint filed with the FTC. HireVue stated that it “made the decision to not use any visual analysis in our pre-hire algorithms going forward. We recommend and hope that this decision becomes an industry standard.”
Then there is the issue of speech and speech patterns – another area of concern. Who says that the best candidate is a smooth operator? “We only score by the way the words people say that are transcribed, not the way they sound or the way they look. That is a hard line that we draw and have always drawn; my mentality and our mentality as a company is that we should only be scoring information that candidates consciously provide to us,” said Eric Sydell, the executive vice president of Innovation at Modern Hire. “There are organizations that use that information. I think it’s wrong. I only give you express permission to use my responses; that’s the right way that we need to proceed.”
Moreover, there are questions the EEOC will need to answer about whether these tools must be validated under the Uniform Guidelines on Employee Selection Procedures, because the tools are used to weed out applicants. Because these tools assess applicants, they should be validated; if they are not, an employer using them could face liability exposure, particularly if it is not tracking who did and did not get through, and why. And as with any assessment tool, be careful of validation claims: a good validity coefficient generally runs between .2 and .4, and no tool is ever perfectly correlated with job performance. Many testing companies tend to oversell their validation results and their products.
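As a rough illustration of what a validation check involves, the sketch below uses made-up numbers to compute a criterion-related validity coefficient, that is, the correlation between assessment scores and later job-performance ratings for the people who were hired. Vendors quoting coefficients far above the .2 to .4 range deserve extra scrutiny.

    # Hypothetical criterion-related validation: correlate assessment scores with
    # later performance ratings for hires. All data below is made up.
    from statistics import correlation  # Pearson's r, available in Python 3.10+

    assessment_scores   = [62, 74, 81, 55, 90, 68, 77, 84, 59, 71]
    performance_ratings = [3.1, 3.4, 3.9, 2.8, 4.2, 3.0, 3.6, 3.8, 3.2, 3.3]

    validity = correlation(assessment_scores, performance_ratings)
    print(f"Validity coefficient: {validity:.2f}")
    # Coefficients in the .2-.4 range are typical for selection tools; values near
    # 1.0 in a vendor's marketing materials should raise questions.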
Finally, HR recruiters need to recognize where the future of work is heading: fewer recruiters will be needed, even given today’s dearth of recruiters, and HR costs will therefore be less of a factor in an automated future. In other words, HR could eventually be outsourced, with only a minimal in-house group, even in very large organizations. The future of work will greatly impact HR, both positively and negatively, in the coming years.