
Linda Rosencrance
Contributing Writer

How AI can give companies a DEI boost

feature
Jun 09, 2022 | 7 mins
Artificial Intelligence | Diversity and Inclusion | Hiring

Artificial intelligence has a spotty past when it comes to diversity, equity, and inclusion, but more carefully crafted tools can point out, rather than perpetuate, bias in the workplace.


As artificial intelligence (AI) makes inroads into the enterprise, the adopters who’ve seen the most success are taking a holistic approach to AI, according to PwC’s 2022 AI Business Survey. Comprising 36% of survey respondents, these “AI leaders,” as PwC calls them, are using AI to target business transformation, enhanced decision-making, and systems modernization simultaneously rather than addressing one area at a time.

These and other organizations are also beginning to use AI to solve more complex business decisions around diversity, equity, and inclusion (DEI). In fact, 46% of AI leaders are using AI to make workforce decisions that include DEI, compared with 24% of other companies, according to PwC.

“Companies are using AI for recruiting and hiring as well as around retention and engagement,” said Bret Greenstein, PwC partner for data analytics and AI and co-author of the report.

AI’s harmful past in hiring

Although many companies are experimenting with AI as a tool to assess DEI in these areas, Greenstein noted, they aren’t fully delegating those processes to AI, but rather are augmenting them with AI. Part of the reason for their caution is that in the past, AI often did more harm than good in terms of DEI in the workplace, as biased algorithms discriminated against women and non-white job candidates.

“There has been a lot of news about the impact of bias in the algorithms looking to identify talent,” Greenstein said. For example, in 2018, Amazon was forced to scrap its secret AI recruiting tool after the tech giant realized it was biased against women. And a 2019 analysis published in Harvard Business Review concluded that AI-enabled recruiting algorithms introduced anti-Black bias into the process.

AI bias is caused, often unconsciously, by the people who design AI models and interpret the results. If an AI is trained on biased data, it will, in turn, make biased decisions. For instance, if a company has hired mostly white, male software engineers with degrees from certain universities in the past, a recruiting algorithm might favor job candidates with similar profiles for open engineering positions.
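
To make the mechanism concrete, consider a minimal, synthetic sketch (all data and feature names below are fabricated for illustration, not drawn from any real system): a model trained on historical hiring decisions that favored a particular profile learns to reward that profile regardless of skill.

```python
# Minimal synthetic illustration of how biased training data yields a
# biased model. All data here is fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Two features: a job-relevant skill score, and a proxy attribute
# correlated with the historically favored group (e.g., a degree from
# one of a handful of "pipeline" universities).
skill = rng.normal(0, 1, n)
proxy = rng.integers(0, 2, n)  # 1 = resembles past hires

# Historical labels: past recruiters favored the proxy profile
# independent of skill, so the labels encode that bias.
hired = (0.8 * skill + 1.5 * proxy + rng.normal(0, 1, n)) > 1.0

model = LogisticRegression().fit(np.column_stack([skill, proxy]), hired)
print("learned weights (skill, proxy):", model.coef_[0].round(2))
# The proxy weight comes out large and positive: the model has learned
# to prefer candidates who merely resemble past hires.
```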

As AI developers become more aware of the potential for bias being built into recruiting and hiring software, they can work to safeguard against it. Indeed, 45% of organizations that PwC identifies as AI leaders said they have plans to address issues of fairness in their AI systems in 2022.

“I think using AI [for DEI] will move from experiment to production for recruiting and hiring as people get better at understanding and identifying bias and understanding how to assess future performance better,” Greenstein said.

Using AI to highlight bias

According to Gartner, 62% of human resources leaders report using DEI data as an input to talent processes such as recruiting and performance management. However, few are using it to effectively influence leaders’ decisions around workers. To create a diverse, equitable, and inclusive workforce, HR leaders have to better integrate DEI data strategies into day-to-day employee experience practices, said Emily Strother, senior principal, research at Gartner.

Organizations are increasingly embedding AI technology into their talent acquisition and management processes to highlight potential biases, Strother said. “In particular, we see this in how [they] handle recruiting and how [they] work with performance management. This is one of the places organizations are worried about bias the most, but AI can help.”

For example, some companies are using AI-powered tools to identify biased language recruiting managers might use during candidate interviews. Corrective measures could include building bias reminders into the interview process or alerting managers when their language is biased or reflects a potentially unfair judgment, Strother said.
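
As a rough illustration of how such an alert might work, the sketch below scans interviewer notes against a small phrase list. The phrases and reminders are hypothetical; production tools rely on trained language models rather than simple keyword matching.

```python
# Hypothetical sketch of a bias reminder over interviewer notes.
# The phrase list is illustrative only, not any vendor's lexicon.
FLAGGED_PHRASES = {
    "culture fit": "vague; often masks similarity bias",
    "not a good fit": "unexplained judgment; ask for a job-related reason",
    "too aggressive": "trait language often applied unevenly across groups",
    "overqualified": "can encode age bias",
}

def review_notes(notes: str) -> list[str]:
    """Return reminders for flagged phrases found in interviewer notes."""
    lowered = notes.lower()
    return [f"'{phrase}': {reason}"
            for phrase, reason in FLAGGED_PHRASES.items()
            if phrase in lowered]

for reminder in review_notes("Strong coder, but not a good fit for the team."):
    print("Reminder:", reminder)
```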

Managers’ biases can also creep in when it comes to setting goals for employees. AI can help by comparing employees’ goals against others with the same tenure and then alerting managers if they’re consistently assigning fewer or less important goals to certain workers.

“This helps managers realize some of their unintended biases in goal setting and helps them correct their behaviors,” Strother said.
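
In practice, a check like this boils down to comparing each employee's assigned goals against their tenure cohort. The sketch below shows one hypothetical way to do it; the column names, sample data, and 40% threshold are assumptions for illustration, not any vendor's implementation.

```python
# Hypothetical goal-equity check: flag employees assigned markedly
# fewer (or fewer high-visibility) goals than peers with equal tenure.
import pandas as pd

goals = pd.DataFrame({
    "employee": ["Ana", "Ben", "Caro", "Dev", "Eli"],
    "tenure_years": [2, 2, 2, 5, 5],
    "goal_count": [6, 6, 2, 8, 7],
    "stretch_goals": [2, 3, 0, 4, 3],  # higher-visibility assignments
})

cols = ["goal_count", "stretch_goals"]
peer_avg = goals.groupby("tenure_years")[cols].transform("mean")
gap = (peer_avg - goals[cols]) / peer_avg  # fraction below cohort average

# Alert the manager when someone sits well below their cohort on both.
alerts = goals[(gap["goal_count"] > 0.4) & (gap["stretch_goals"] > 0.4)]
print(alerts[["employee"] + cols])  # flags Caro in this toy data
```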

AI can also help organizations ensure that their job postings are as free of bias as possible. “We see organizations using AI to review some of the job sites, such as LinkedIn or Indeed, to ensure the language they’re using when they post [open jobs] is accurate or in line with the skills [needed for the job] versus anything that might [indicate bias],” Strother said.

Kay Formanek, founder and CEO of diversity education company KAY Diversity and Performance and author of Beyond D&I: Leading Diversity with Purpose and Inclusiveness, offers an example: “If a company says, ‘We’re looking for a driven leader, we’re looking for someone who’s ambitious, we’re looking for someone who’s going to deliver results,’ we call that a masculine job frame, and research has shown that women will tend to opt out” even when they’re well-qualified for the job, she said.

According to Formanek, women are looking for more feminine-leaning language, such as: “We’re looking for a leader who, together with the team, supports the growth agenda of the business. We’re looking for someone who creates a team.”

AI can help companies remove any biased language from their job posts and send alerts when the language may be biased in terms of gender or aligned to specific skill sets that may exclude qualified applicants from more diverse or underrepresented backgrounds, according to Strother.

“That’s very important,” Formanek said. “Because if you don’t do that, you’re going to turn off people who are very important for your diversity.”
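
Tools in this vein work roughly like the sketch below, which scores a posting against short masculine- and feminine-coded word lists in the spirit of gender-decoder research. The lists here are abbreviated samples; real tools use much larger, research-validated lexicons.

```python
# Illustrative "job frame" check. Word lists are short samples only.
import re

MASCULINE_CODED = {"driven", "ambitious", "competitive", "dominant", "decisive"}
FEMININE_CODED = {"supports", "together", "collaborative", "nurturing", "team"}

def frame_report(posting: str) -> dict:
    words = set(re.findall(r"[a-z']+", posting.lower()))
    m, f = words & MASCULINE_CODED, words & FEMININE_CODED
    lean = ("masculine" if len(m) > len(f)
            else "feminine" if len(f) > len(m) else "neutral")
    return {"masculine_hits": sorted(m), "feminine_hits": sorted(f), "lean": lean}

print(frame_report("We're looking for a driven, ambitious leader who delivers results."))
# -> flags 'driven' and 'ambitious'; this posting leans masculine-coded.
```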

Using AI to identify disengaged employees

One area where PwC’s Greenstein sees great potential for AI is worker retention. Retaining employees is key to the success of a business, he said, and the factors that drive people out often come down to workers feeling marginalized, disconnected, or disengaged.

Companies can use AI to identify departments or roles with a high risk of attrition, workers who are dissatisfied or not engaged, and even people who feel isolated because they’re working remotely, Greenstein said.

“Generally, working remotely has had a bigger impact on diverse employees, because there [are] higher degrees of isolation. Less connection can be more harmful in most cases,” he said.

AI tools can help managers understand if some employees are more at risk than others, Greenstein said. “Managers can use AI to look for indicators in the data of how people interact to identify the degree of isolation people feel, as well as to look for triggers to determine when people seem to be more disconnected.”

While there aren’t standard tools for this purpose yet, PwC is seeing clients identify the data they think matters most (travel, locations, calendar, performance, compensation, workload, and so on) to explore the impact that isolation has had on engagement and, ultimately, attrition, Greenstein said. After integrating the potentially relevant data in cloud-based data lakes or data warehouses, companies use mostly bespoke, cloud-native analytics tools to look for correlation and causation, create predictive models, and determine the best actions, he said.
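
A simplified version of such a bespoke pipeline might look like the sketch below: engagement signals go into one table, a basic model is fit, and the highest-risk employees are surfaced for follow-up. The signals, column names, and data are all assumptions for illustration.

```python
# Illustrative attrition-risk pipeline on synthetic engagement data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1_000
df = pd.DataFrame({
    "meetings_per_week": rng.poisson(8, n),
    "messages_per_day": rng.poisson(25, n),
    "after_hours_load": rng.normal(0, 1, n),
})

# Synthetic ground truth: fewer interactions (more isolation) and a
# heavier after-hours load raise the odds that someone leaves.
logit = (0.3 * df["meetings_per_week"] + 0.05 * df["messages_per_day"]
         - 0.8 * df["after_hours_load"] - 2)
df["left_company"] = rng.random(n) < 1 / (1 + np.exp(logit))

features = ["meetings_per_week", "messages_per_day", "after_hours_load"]
model = LogisticRegression().fit(df[features], df["left_company"])
df["attrition_risk"] = model.predict_proba(df[features])[:, 1]

# Surface the highest-risk employees for a human conversation,
# not an automated decision.
print(df.sort_values("attrition_risk", ascending=False).head(3))
```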

Once companies identify employees who feel disconnected or marginalized, the onus is on them to take action to make those workers feel respected and included. But knowing who feels left out is an important first step.

The dynamics of attrition and talent acquisition have changed drastically over the last two years and continue to evolve, so companies that have a handle on their data — and staffers with the analytics skills to interpret it — have an advantage, Greenstein said. “I think these tools can help us be better as managers and as partners and as peers for our people.”