AI has its uses, but it should not be treated as a solution for everything

Whilst we recruit Machine Learning and Data experts for other companies, the use of Machine Learning for recruiting itself is highly problematic. It may seem strange that recruiters working within this sector would question ML, but it is precisely because we know the subject that our scepticism makes sense.

Yes, technology helps us recruiters in some ways, but the idea of a one-size-fits-all solution is a misreading of the situation. Tech can help us find new ways to reach the right candidates, but when it comes to finding and selecting the person who is perfect for the role, it lacks the natural element: the understanding that comes from interacting with candidates, working with businesses and researching the roles. It lacks a human touch, that extra sense that is sometimes indefinable.

Would you buy a house with no human interaction whatsoever? Yes, you can look at the pretty pictures; you can even walk around it. But if no one talked to you about the house, if you couldn’t meet the owners and gauge them, if you couldn’t get a sense of what it is like to actually live there, something just wouldn’t feel right. Finding a job, or finding the right individual to fill a role, is a big deal. No one goes into it lightly, and rightfully so. Selections made by an automated algorithm can seem right, but scratching beneath the surface reveals a more complicated situation.

The idea that most people have, and the one that drives the push to let AI dominate the recruiting process, is that machines have no bias. It is true that machines have no emotions, and if that were the only factor, taking that route with your recruitment would be understandable. The problem is that machines have to be trained on data supplied by humans. Biased humans produce biased data, and biased data produces biased AI decisions.

The Amazon gender bias story from 2018 highlights this. Amazon were seeking a system to simplify the process: feed in a large number of CVs and have the machine select the top five based on a rating system. The problem was that the machine could only base its future predictions on past decisions. Amazon realised that, because applications over the previous ten years had been dominated by men, the AI had learned a bias against female applicants.
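To make that mechanism concrete, here is a minimal sketch in Python. Everything in it is synthetic and hypothetical: the features, the bias term and the numbers are our own invention to illustrate how a model trained on skewed history reproduces that skew, not a reconstruction of Amazon's actual system.

```python
# A minimal sketch of how historical bias leaks into a screening model.
# All data is synthetic and every name is hypothetical; this illustrates
# the mechanism, not any real company's system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Features: years of experience, a skills score, and a flag standing in
# for any proxy of gender a CV can leak (name, clubs, societies).
experience = rng.normal(5, 2, n)
skills = rng.normal(0, 1, n)
is_female = rng.integers(0, 2, n)

# Historical hiring decisions: driven by skill and experience, but with
# a human bias term that made it harder for women to be hired.
logits = 0.8 * skills + 0.3 * (experience - 5) - 1.5 * is_female
hired = (logits + rng.normal(0, 1, n)) > 0

X = np.column_stack([experience, skills, is_female])
model = LogisticRegression().fit(X, hired)

# The model faithfully learns the bias: the coefficient on the gender
# proxy is strongly negative, so equally skilled women score lower.
print(dict(zip(["experience", "skills", "is_female"],
               model.coef_[0].round(2))))
```

The point of the sketch is that nothing malicious happens at training time; the model simply learns whatever pattern, fair or not, is baked into the past decisions it is shown.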

The Amazon debacle happened with correctly entered data. If the data is entered incorrectly, or formatted in a way the system cannot read, that information will be missing from the final decision. There is no universal format for a CV, so you can imagine the problems that could ensue. Sometimes people simply are not very good at writing their CV. They may leave out something the AI is searching for. There may be a spelling error. They may possess extraordinary skills but few qualifications. The AI can only view the information in the way it has been instructed to. It is fascinating to think of everything you pick up from a CV instinctively; if you had to explain your thought process to someone else, you would be there for a week, because you would have to impart the stories of all the places and situations in which you learned those things. Suffice it to say, there isn’t enough time to teach a machine what is inside your head.
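As a toy illustration of that rigidity, consider a naive keyword screener. The keyword list and both CVs below are invented; the point is only how easily exact-match rules discard a strong candidate over wording.

```python
# A toy keyword screener, showing how rigid parsing drops information.
# The rule set and CV snippets are invented for illustration only.
REQUIRED_KEYWORDS = {"python", "machine learning", "sql"}

def score_cv(cv_text: str) -> int:
    """Count how many required keywords appear verbatim in the CV."""
    text = cv_text.lower()
    return sum(1 for kw in REQUIRED_KEYWORDS if kw in text)

strong_candidate = """
Built ML pipelines in Pyhton, self-taught; queried Postgres daily.
"""  # a typo plus "ML" and "Postgres" instead of the exact keywords

weaker_candidate = """
Coursework in Python, machine learning and SQL.
"""

print(score_cv(strong_candidate))   # 0 -- rejected despite real skills
print(score_cv(weaker_candidate))   # 3 -- passes on exact wording alone
```

Real screening systems are more sophisticated than this, but the failure mode is the same: the machine only sees what its rules were written to see.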

And all of that is before you even get to the impersonality of automated selection, its blindness to personality, and the way machines struggle to process the ever-evolving and complicated rules of slang.

Here at Zenshin Talent we embrace AI and ML to help us target the right candidates more efficiently. The other element we always include is human understanding, which we use to nurture client and candidate relationships and ensure the correct match.

Curious about how Zenshin Talent can help your organisation? Contact us today for a no-strings conversation about your needs and our experience.