As organizations increasingly adopt artificial intelligence in hiring, it’s essential that they understand how to use the technology to reduce bias rather than exacerbate it.

Mahe Bayireddi, CEO, Phenom

August 9, 2022

3 Min Read

Even with signs of a recession, hiring continues to be a top priority and challenge for several industries, including healthcare, hospitality, manufacturing, and transportation. There are approximately 11.4 million unfilled jobs in the US, according to recent reports from the US Bureau of Labor Statistics. With the current ratio of one qualified talent professional for every eight open roles, talent teams must find ways to be more efficient and effective.

AI enables a quick, efficient hiring process. It also can be a powerful tool to uncover hidden hiring biases and drive change, prompting organizations to assess their historical hiring data and improve recruiting and hiring processes.

The Meaning of Bias

Data scientists will tell you all data is biased because bias is about finding patterns in data. Some bias is desirable, such as the bias used to recommend jobs to candidates based on their preferred location, skills, interests, and job title. Building a bias toward these preferences into your algorithmic models helps match candidates with jobs they want and recruiters with best-fit individuals to fill roles. In this case, bias benefits both job seekers and employers.

Social bias, however, concerns HR professionals and regulatory groups. This bias excludes job candidates based on gender, sex, age, ability, or other demographic attributes. Social bias also includes a discriminatory preference toward candidates who attended certain universities or who listed specific previous employers on their resumes.

AI bases its predictions on the data it receives. It may appear as if AI creates bias, but that’s not the case. AI only amplifies the bias already present in an organization’s historical data.

When left unchecked, AI’s amplification of existing bias can reduce diversity in the candidate pipeline because technology will recommend candidates with backgrounds similar to past hires and exclude people who don’t fit those criteria.

Organizations finding success with AI in hiring use this technology to shine a light on biased hiring trends while understanding that implementing AI doesn’t directly solve diversity or inclusion issues. Instead, it offers insight into diversity deficits so organizations can work to mitigate them.

Humans in the Loop

The best way to combat bias is to catch it before training your models. Early exploratory data analysis helps identify and remove social bias from a company’s hiring data before it’s included in training sets or goes into production.
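One common form this exploratory analysis takes is comparing historical selection rates across demographic groups. The sketch below is a minimal, hypothetical example: the column names and data are illustrative, and the 80% threshold follows the EEOC's "four-fifths" rule of thumb for flagging potential adverse impact.

```python
# Hypothetical exploratory check for adverse impact in historical hiring data.
# The group labels and records are illustrative, not from any real dataset.
from collections import defaultdict

def selection_rates(records):
    """Compute the hire rate for each demographic group from
    (group, was_hired) pairs."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, was_hired in records:
        total[group] += 1
        hired[group] += int(was_hired)
    return {g: hired[g] / total[g] for g in total}

def four_fifths_check(rates):
    """Flag groups whose selection rate falls below 80% of the
    highest group's rate (the EEOC "four-fifths" rule of thumb).
    True means the group passes; False means it warrants review."""
    best = max(rates.values())
    return {g: rate / best >= 0.8 for g, rate in rates.items()}

# Toy historical data: group A hired at ~67%, group B at 25%.
records = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False), ("B", False)]
rates = selection_rates(records)
print(four_fifths_check(rates))  # group B falls below the threshold
```

A check like this on historical data helps a team decide which records or features to exclude or re-weight before any model training begins.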

Once your organization starts using AI, a "human in the loop" -- a team dedicated to assessing the AI's output -- is essential to monitoring the technology's progress. The role doesn't require highly technical skills. Any team member passionate about advocating for diversity can learn to check for bias indicators.

The ongoing process can be as simple as scanning your organization’s AI dashboard weekly. Teams should compare the system’s hiring recommendations against organization standards. It's one important way to ensure no demographic groups are over- or underrepresented. Insights from the human in the loop help engineering teams adjust models to meet their goals.
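The weekly comparison can be as simple as checking whether each group's share of AI-recommended candidates tracks its share of the applicant pool. The sketch below is a hypothetical illustration; the 10% tolerance and the group labels are assumptions, and a real organization would substitute its own standards.

```python
# Illustrative weekly check: compare each group's share of AI-recommended
# candidates against its share of the applicant pool, and flag large gaps.
# The 10% tolerance is a hypothetical threshold, not an established standard.
from collections import Counter

def representation_gaps(pool_groups, recommended_groups, tolerance=0.10):
    """Return groups whose share of recommendations differs from their
    share of the applicant pool by more than `tolerance`, mapped to
    (pool_share, recommended_share) for review."""
    pool = Counter(pool_groups)
    recs = Counter(recommended_groups)
    n_pool = len(pool_groups)
    n_recs = len(recommended_groups)
    flagged = {}
    for group, count in pool.items():
        pool_share = count / n_pool
        rec_share = recs.get(group, 0) / n_recs
        if abs(rec_share - pool_share) > tolerance:
            flagged[group] = (pool_share, rec_share)
    return flagged

# Toy data: a pool split 50/50, but recommendations skewed 80/20.
pool = ["A"] * 50 + ["B"] * 50
recommended = ["A"] * 40 + ["B"] * 10
print(representation_gaps(pool, recommended))  # both groups flagged
```

Flagged gaps like these are exactly the insights a human in the loop can pass to engineering teams so they can adjust the models.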

An organization’s end users also contribute to refining algorithmic models. When recruiters or hiring managers notice indications of bias in the candidate recommendations they receive, they should raise the issue with a product manager. Then, a data scientist can investigate further to understand what happened and make recommendations on how to avoid it in the future.

Better Models with Constant Iteration

Some companies think of AI as an autonomous algorithm they simply “set and forget.” But AI doesn’t stand still — it keeps learning from your data. Your organization must treat AI as an ongoing process. Better-trained algorithmic models generate AI predictions more aligned with your hiring goals.

A keen understanding of the relationship between bias and data, combined with a passionate, diversity-minded team member will equip your organization to reap the most from AI in hiring processes. With AI supporting recruiters and hiring managers, they’ll have time to focus on the high-level, human elements of HR that no algorithm can replace.

About the Author(s)

Mahe Bayireddi

CEO, Phenom

Mahe Bayireddi is the CEO of one of Inc. 5000's fastest-growing private companies, Phenom. As a leader in the HR technology space, Phenom is known for pioneering the first and only Talent Experience Management (TXM) Platform. Bayireddi is a passionate entrepreneur leading Phenom in the mission to help a billion people find the right job opportunity. He previously co-founded several successful technology companies, including SnipSnap, BijaHealth, and BHSP Nexus Software.
