Artificial Intelligence (AI) and Machine Learning (ML) technologies can deliver productivity gains by automating high-volume, time-consuming activities and by enhancing customer and stakeholder engagement.
In talent acquisition, AI has been heralded as having great potential to improve the recruitment process, doing the job better and faster than a human recruiter.
This view was founded on the notion that AI could be ‘trained or programmed’ using actual historical data of past successful candidates for a specific job or jobs and would excel in accelerating the sourcing, screening, and assessment of in-demand candidates. AI was also expected to eliminate unconscious bias and discrimination based on gender, race, age, or history in the recruitment process.
This sounds reasonable, and indeed many companies, some of them global giants, currently use AI in their recruitment processes.
But what if the raw data used to build and program the AI system contains unintended bias from the outset? What could unintended consequences look like?
Amazon’s experience provides a salutary lesson.
It has been widely reported that several years ago Amazon established a team to develop an AI program to mechanize its search for top talent. Developers trained the system on information and patterns drawn from applications submitted to the company over a 10-year period. Because the tech industry has traditionally attracted and employed more men than women, most of that data naturally came from male applicants.
In time, Amazon recognised that its system had effectively taught itself to penalize non-male references in candidate applications and resumes, such as female-oriented educational institutions, activities, interests, and memberships. The AI system had essentially ‘learned’ from the historical ‘evidence’ that male candidates were preferred.
Hence, unintended bias was built into the system from the start. It is worth remembering that AI algorithms are only as good as the data used to train them, and the data is only as good as the people who collect it and build it into the system.
Amazon ultimately decided to abandon the project over concerns that the AI might not be able to remain gender neutral (Australian Financial Review, 11 October 2018).
The other aspect to consider in the use of AI in recruitment is whether candidates and clients would prefer to interact with a system or with a human recruitment professional. We believe it is the latter.
From a talent attraction professional’s point of view, while we recognise the potential of AI in the recruitment process, we believe that at this point in time, AI should primarily be used as a tool in the hands of a suitably trained and experienced human practitioner. We prefer to assess each application and resume submitted to us personally and to communicate with candidates and clients directly.
To find out more about our approach to talent acquisition please call Allan Rae on 9607 8444 or 0418 323 743.