As artificial intelligence (AI) enters the enterprise, the companies seeing the most success are taking a holistic approach to AI, according to PwC’s 2022 AI Business Survey. Comprising 36% of survey respondents, these “AI leaders,” as PwC calls them, use AI for business transformation, improved decision-making, and systems modernization all at once, rather than addressing one area at a time.
These and other companies have begun to use AI to address more complex business decisions around diversity, equity, and inclusion (DEI). In fact, according to PwC, 46% of AI leaders use AI to make workforce decisions that include DEI, compared with 24% of other companies.
“Companies are using AI for recruiting and hiring, as well as for retention and engagement,” said Bret Greenstein, data, analytics, and AI partner at PwC and co-author of the report.
AI’s troubled past in recruitment
Although many companies are experimenting with AI as a tool to assess DEI in these areas, Greenstein noted that they are not handing those processes over to AI entirely, but rather augmenting them with it. One reason for their caution: in the past, AI has often done more harm than good for DEI in the workplace, with biased algorithms discriminating against women and non-white job seekers.
“There’s been a lot of news about the impact of bias in talent-identification algorithms,” Greenstein said. For example, in 2018, Amazon was forced to scrap its secret AI recruiting tool after the tech giant realized it was biased against women. And a 2019 Harvard Business Review analysis concluded that AI-enabled recruiting algorithms had introduced anti-Black bias into the process.
The causes of AI bias are often the people, acting unconsciously, who design AI models and interpret their results. If an AI is trained on biased data, it will in turn make biased decisions. For example, if a company has historically hired mostly white, male software engineers with degrees from certain universities, a hiring algorithm may favor candidates with a similar profile for an open engineering position.
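To make that failure mode concrete, here is a minimal Python sketch of how a model trained on biased hiring history reproduces the bias. The feature names, data, and model choice are all invented for illustration; this is not any vendor’s actual system.

```python
# Sketch: a classifier trained on biased hiring history learns the bias.
# All data and feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Two features: years of experience, and a proxy attribute
# (e.g., "attended one of a few favored universities").
experience = rng.normal(5, 2, n)
favored_school = rng.integers(0, 2, n)

# Biased history: past hires skewed heavily toward the proxy group,
# independent of actual skill.
hired = (0.3 * experience + 2.0 * favored_school + rng.normal(0, 1, n)) > 2.5

X = np.column_stack([experience, favored_school])
model = LogisticRegression().fit(X, hired)

# Two candidates with identical experience, differing only in the proxy:
candidates = np.array([[5.0, 1], [5.0, 0]])
print(model.predict_proba(candidates)[:, 1])
# The favored-school candidate scores far higher despite equal skill,
# because the model has simply learned the historical pattern.
```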
As AI developers become more aware of the potential for bias in recruiting and hiring software, they can work to guard against it. In fact, 45% of the organizations PwC identifies as AI leaders say they plan to address issues of fairness in their AI systems in 2022.
“I think the use of AI [for DEI] in recruiting and hiring will move from testing to production as people get better at understanding and identifying bias, and at understanding how to better evaluate future performance,” Greenstein said.
Using AI to highlight biases
According to Gartner, 62% of HR leaders report using DEI data as an input to talent processes such as recruiting and performance management. However, very few are using that data to effectively influence the day-to-day decisions leaders make. To create a diverse, equitable, and inclusive workforce, HR leaders need to better integrate DEI data strategies into everyday employee experience practices, says Emily Strother, senior principal of research at Gartner.
Companies are increasingly embedding AI technology in their talent acquisition and management processes to highlight potential bias, Strother said. “Specifically, we see this with how [they] handle recruiting and how [they] work with performance management. This is where companies are most concerned about bias, but it’s also where AI can help.”
For example, some companies are using AI-powered tools to identify biased language that hiring managers may use during candidate interviews. Corrective measures may include building in reminders, or warning managers during the interview process when their language is biased or potentially reflects an unfair judgment, Strother said.
Managers’ biases can also creep in when it comes to setting goals for employees. AI can compare the goals assigned to an employee with others’ goals over the same period, and then warn managers if they consistently assign fewer or less important goals to specific employees.
“It helps managers understand some of their unintentional biases in goal setting and helps them correct their behaviors,” Strother said.
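A rough sketch of what such a goal-comparison check could look like, assuming each goal carries a simple importance score; the data, scale, and threshold below are hypothetical:

```python
# Sketch: flag employees whose assigned goals are consistently less
# important than the team average. Data and threshold are invented.
from statistics import mean

goals = [  # (employee, goal importance on a 1-5 scale)
    ("ana", 4.5), ("ana", 4.0),
    ("ben", 2.0), ("ben", 1.5),
    ("chris", 4.0), ("chris", 3.5),
]

by_employee = {}
for emp, score in goals:
    by_employee.setdefault(emp, []).append(score)

team_avg = mean(score for _, score in goals)
THRESHOLD = 0.75  # flag below 75% of team average (assumed cutoff)

for emp, scores in by_employee.items():
    if mean(scores) < THRESHOLD * team_avg:
        print(f"Review goal assignments for {emp}: "
              f"avg {mean(scores):.1f} vs. team avg {team_avg:.1f}")
```

A real system would also control for role and seniority before flagging anyone; this sketch only shows the comparison idea.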
AI can also help companies make sure their job postings are as unbiased as possible. “We see companies use AI to review certain job sites, such as LinkedIn or Indeed, to make sure the language they use when they post [open jobs] is accurate or consistent with the skills [needed for the job] versus anything that could [indicate bias],” Strother said.
Kay Formanek, founder and CEO of diversity education company KAY Diversity and Performance and author of Beyond D&I: Leading Diversity with Purpose and Inclusiveness, provides an example. “If a company says, ‘We’re looking for a driven leader, we’re looking for someone who is ambitious, we’re looking for someone who is going to deliver results,’ we call that a masculine job frame, and research has shown that women will tend to opt out,” she said, even when they are qualified for the job.
According to Formanek, women look for more feminine framing, such as: “We’re looking for a leader who, together with the team, supports the growth agenda of the business. We’re looking for someone who can build a team.”
AI can help companies strip biased language from their job posts and send warnings when the language may be gender-biased or tied to a specific skill set in a way that could exclude qualified applicants from more diverse or underrepresented backgrounds, according to Strother.
“It’s very important,” Formanek said, “because if you don’t, you’re going to shut out people who are very important for your diversity.”
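As a rough illustration of the kind of scan described above, the sketch below flags masculine-coded terms in a posting. The word list is a tiny hypothetical sample seeded with Formanek’s examples, not any real product’s lexicon:

```python
# Sketch: flag masculine-coded wording in a job posting.
# The word list is illustrative only.
import re

MASCULINE_CODED = {"driven", "ambitious", "competitive", "dominant", "decisive"}

def flag_gendered_language(posting: str) -> list[str]:
    """Return masculine-coded words found in the posting text."""
    words = set(re.findall(r"[a-z]+", posting.lower()))
    return sorted(words & MASCULINE_CODED)

posting = ("We're looking for a driven, ambitious leader who will "
           "deliver results in a competitive market.")

hits = flag_gendered_language(posting)
if hits:
    print("Warning: possibly masculine-coded terms:", ", ".join(hits))
```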
Using AI to identify isolated employees
One area where PwC’s Greenstein sees a lot of potential for AI is employee retention. Retaining employees is key to a business’s success, he said, and the factors that drive people out of a business have a lot to do with workers feeling marginalized, disconnected, or unengaged.
Greenstein said companies can use AI to identify departments or roles with a high risk of attrition, employees who are dissatisfied or disengaged, and even those who feel isolated while working remotely.
“In general, remote work has had a big impact on diverse employees, because there [is a] higher degree of isolation, and lower connectivity can be detrimental in many cases,” he said.
AI tools can help managers understand whether some employees are at greater risk than others, Greenstein said. “Managers can use AI to look for indicators in the data about how people communicate, to identify levels of isolation as well as triggers for when people seem most disconnected.”
Although there is as yet no standard tool for this purpose, PwC has been helping clients identify the data they deem most important (travel, location, calendar, performance, compensation, work stress, and so on) to explore the impact isolation has on engagement and, ultimately, attrition, Greenstein said. By consolidating the potentially relevant data into cloud-based data lakes or data warehouses, companies are using mostly bespoke, cloud-native analytics tools to find relationships and causality, create predictive models, and determine the best actions to take, he said.
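As one example of a signal such bespoke analytics might look for, the sketch below flags employees whose communication volume drops sharply against their own recent baseline. The data, field names, and threshold are invented; this is not PwC’s methodology:

```python
# Sketch: flag a sharp drop in communication volume as a possible
# isolation signal. All data and the threshold are hypothetical.
from statistics import mean

# Weekly counts of outbound messages/meetings per employee.
messages_per_week = {
    "dana":  [30, 28, 31, 29, 12, 10],  # sharp recent drop
    "eli":   [22, 25, 20, 24, 23, 21],  # steady
    "farah": [15, 14, 16, 15, 13, 14],  # steady, lower volume by habit
}

BASELINE_WEEKS = 4  # weeks used to establish each person's norm
DROP_RATIO = 0.6    # flag if recent activity < 60% of baseline (assumed)

for emp, counts in messages_per_week.items():
    baseline = mean(counts[:BASELINE_WEEKS])
    recent = mean(counts[BASELINE_WEEKS:])
    if recent < DROP_RATIO * baseline:
        print(f"Possible isolation signal for {emp}: "
              f"recent {recent:.0f}/wk vs. baseline {baseline:.0f}/wk")
```

Comparing each person with their own baseline, rather than with coworkers, avoids flagging people who simply communicate less by habit.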
Once companies identify employees who feel disconnected or marginalized, it is up to them to take action to engage and include those employees. But knowing who is being left out is an important first step.
The dynamics of attrition and talent acquisition have changed dramatically over the past two years and are still evolving, so companies that have a handle on their data, and employees with the analytical skills to interpret it, have an advantage, Greenstein said. “I think these tools can help us become better as managers and as partners and as collaborators with our people.”
Copyright © 2022 IDG Communications, Inc.