Famous for 15 minutes

We care about technology-enhanced human systems. So we thought we would share regular thoughts and opinions about why we think they matter so much.

Modern recruiting tools, neurodiversity and keeping it human

The 2nd of April was World Autism Awareness Day, and we wrote a little article about Recruiting Neurodiverse Talent. At the same time as we were pulling that together, I spotted an excellent programme on the BBC about modern recruitment approaches, 'Computer says no', which also looked at the challenges that neurodiverse people have with software that professes to level the playing field. The reality, however, as proven time and time again, is that this oft-trotted-out pitch is little more than a smokescreen to make the truth more publicly palatable - which is essentially that it's primarily about labour and/or time saving, i.e. cost. God knows those humans are damned expensive to run - especially the experienced ones. And we all know they're flawed too, right? So here's to progress - right?

For all the spin and science that such providers sprinkle on, the tech will be based on pretty binary decisions and arbitrary logic patterns, assessing elements that no business would ever put up front on an application form, because they'd soon find themselves hauled before a tribunal for discriminatory recruitment practices. But it's seemingly OK if you have a computer doing that on your behalf behind the scenes.

Well, I shared the link to the BBC programme around the business, and one of the developer team came back with the following, which I wanted to share more broadly. Of course there are different perspectives - that's what makes life so interesting - but in a world where the VC/PE-fuelled market grab means AI solutions are getting a helluva lot of air time, I wanted to add a small voice to bring some balance. And when one of our devs pipes up, it's generally with some fascinating insights to boot:

A really good AI is usually based on an artificial neural network: weighting incoming values and deciding what output to return.

That being said: it's still a computer, a process system. If you know how it weights those incoming values, if you know the rules by which it processes the output then you can game the system.

Most neural networks have maybe ten to twenty inputs and a few hundred individual nodes processing them down to maybe ten outputs. The human brain has about 80 billion individual nodes, each individually wired to other parts of the network - the idea that AI is anywhere near the level of replacing a human at a decision-making level is plain wrong.
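The weighting idea described above can be sketched in a few lines. This is a minimal toy illustration, not any vendor's actual model - the features, weights and bias are invented purely to show how a node scores its inputs, and why knowing the rules lets you game the system:

```python
# A single artificial "node": multiply each incoming value by a learned
# weight, sum, and squash into a 0..1 score with a sigmoid.
# All weights and features here are hypothetical, for illustration only.
import math

def score_candidate(inputs, weights, bias):
    """Weighted sum of inputs passed through a sigmoid, giving a 0..1 score."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# Hypothetical features: experience, keyword matches, gaps in the CV.
weights = [0.8, 1.2, -1.5]
bias = -1.0

honest = score_candidate([0.5, 0.4, 0.6], weights, bias)
# If you know how the values are weighted, you can game the output:
# pad the keywords, hide the gaps, and the score goes up.
gamed = score_candidate([0.5, 1.0, 0.0], weights, bias)

print(f"honest: {honest:.2f}, gamed: {gamed:.2f}")
```

The point isn't the arithmetic; it's that the whole decision reduces to a handful of fixed rules, and anyone who learns them can optimise against them.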

I use Google's face recognition technology on my personal site to pick out the faces of wrestlers from pictures. It will still pull out images that are clearly not faces but match the criteria: skin tone, two circular shapes in rough alignment and some sort of mouth or nose. As an example, it thinks the lower left image is a face - we can clearly see it's a hand and arm, but Google's AI is 26% confident that it's a face... only Sheamus in the bottom right ranks higher than 70%.
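The practical upshot is that everything hinges on an arbitrary confidence threshold. A toy sketch (the labels and numbers below are invented, loosely mirroring the examples in the text - this is not Google's API):

```python
# Hypothetical detector output: (what the region actually is, the
# detector's confidence that it is a face). Values are illustrative only.
detections = [
    ("hand and arm", 0.26),
    ("blurry crowd shot", 0.55),
    ("Sheamus", 0.74),
]

def keep_faces(detections, threshold):
    """Keep only detections at or above the chosen confidence threshold."""
    return [label for label, conf in detections if conf >= threshold]

strict = keep_faces(detections, 0.70)      # only the genuinely confident hit
permissive = keep_faces(detections, 0.20)  # the hand and arm sneak through

print(strict)
print(permissive)
```

Move the threshold down and you flood the results with hands and shadows; move it up and you start throwing away real faces. Either way, someone picked a number.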

To me technology should not be a replacement for a human, it should enable a human to be more effective in the areas they are needed. I think AI can provide insight to a recruiter but it shouldn't be used to arbitrate and make decisions.

Video interviews can be an enabling technology, but certainly not for everyone - and surely being flexible in trying to find the best candidate is a no-brainer. Either way, the idea that a computer can pick up all the nuances of facial expression and vocal patterns that our brains have evolved over millions of years to process is plain arrogant.

We're barely a decade into understanding what the amygdala does; a computer can't tell the difference between yawning and screaming - it just knows there is a large dark circle at the bottom that might be a mouth, and it might be open.

Side note: pigeons are better than computers at spotting tumours. 

Every technology that comes along has its flaws, and the biggest is usually a human's over-dependence on it.

So we hope you found something of interest in there, even if it just makes you a little more curious. Ideally, if you're someone involved in establishing or managing a recruitment process, maybe it'll help convince you not to lose your human touch.

One final real world tale. I know someone very well who works for a very large company - and he manages a team in the logistics and distribution side of the business. Fairly typically they have a roster of agency staff that they can throttle up and down as needed based on demand, and when a full time position becomes available then they generally recruit from there.

But recently this company has introduced a pan-company AI-powered recruitment solution. So it seemingly doesn't matter that someone has been doing EXACTLY the job they're applying for, as agency staff, for the best part of 18 months, with good attendance and no issues - if you can't pass the new computer-based test then, well, you're not getting the job. But that same person is good enough to go back to moving the inventory around the warehouses as agency staff, never coming into contact with any computer interface in a day-to-day role that fundamentally hasn't changed for decades.

Computers can do programmed logic - they don't do compassion. It seems some of the people who put these systems in place as a blanket solution aren't that far removed from the systems they plug in.

Stay human, people - it's our differences that make us all the more wonderful, and that goes for the businesses we work in too.

Article written by Alex Hens, HARBOUR's CEO 
