FEATURE-Can artificial intelligence help close gender gaps at work?

The real key to change is opening difficult, honest conversations about bias that can challenge misconceptions, said Allyson Zimmermann, a director of women's workplace rights organization Catalyst. But AI tech can help to upend those preconceptions and open opportunities, she added, citing the case of a young woman who got an interview after being selected using technology that "blinded" recruiters to her gender and age.


Reuters | Updated: 17-11-2021 15:49 IST | Created: 17-11-2021 15:29 IST
(Representative image. Image credit: Max Pixel)

* AI-powered human resource tools target gender inequality

* Software used to evaluate CVs, performance appraisals

* Algorithms can 'lock in' existing bias, experts warn

By Sonia Elks

Why didn't she get the job? Is it because she is a mother? Or perhaps she is perceived as lacking ambition or leadership qualities? Gender stereotypes continue to hold women back at work, but a handful of tech firms say they have developed artificial intelligence (AI) systems that can help break biases in hiring and promotion to give female candidates a fairer chance.

Employers and the wider economy could stand to gain, too. "We are at this moment in artificial intelligence, that we either can hardwire our biases into the future or ... to hardwire equity," said Katica Roy, chief executive of Colorado-based software firm Pipeline Equity.

"A lot of the time that we talk about equity, we talk about it as a social issue or the right thing to do, which it is, but it's a massive economic opportunity."

Organizations are increasingly turning to AI to help make hiring decisions, prompting concern among digital rights experts who warn that algorithms can perpetuate biases.

An AI hiring tool developed by Amazon had to be scrapped after it taught itself that male candidates were preferable to female ones. But women's rights groups and digital experts said well-designed tech that targets bias can "shine a light" on the hidden factors holding women back.

"Bias is as old as human nature, and traditional hiring practices have been shot through with several different biases," said Monideepa Tarafdar, a professor in the Isenberg School of Management at the University of Massachusetts Amherst. "I think AI can be part of the solution. Definitely. But I do not think it can be the only solution."

INCLUSIVE ALTERNATIVES

These equality-focused technology firms use AI to review or automate decisions such as CV screening and pay rises, and to offer personalized, data-based advice. Software developed by Pipeline Equity, a startup founded in 2014, has several human resources uses - from checking for biased language in performance reviews to offering advice on hiring and promotions.

Textio also uses AI to analyze companies' corporate statements and job postings to identify whether they are adopting a masculine tone that will alienate women or members of minority groups, and to suggest more inclusive alternatives. Pymetrics, another leading firm in the space, offers gamified assessments that it says evaluate potential hires more fairly than reading CVs.

Studies have found that businesses led by diverse teams tend to be more profitable, while boosting women's presence and role in the workplace could be worth billions of dollars to national economies. But COVID-19 has spurred a "shecession" that has seen a disproportionate number of women pushed out of the labor force. The International Labour Organization found that gender gaps have widened and that women's employment is set to recover more slowly.

Meanwhile, companies are struggling to fill open positions with record numbers quitting in the United States in what has been dubbed "the great resignation". "Businesses have so many roles that they're unable to fill, I mean, empty seats can't do your work for you," said Kieran Snyder, chief executive of Textio.

"You need to hire great people if you're going to have any kind of success."

HELPING OR SPYING?

But AI will not be a silver bullet in creating fairer workplaces, women's rights advocates and researchers said, warning that the technology could raise as many problems as it solves. The idea that technology offers some kind of unbiased factual truth or objectivity is an illusion, said Manish Raghavan, a postdoctoral fellow at the Harvard Center for Research on Computation and Society.

"All AI has to learn from data in some way; it has to learn from past decisions," he said. "That's not to say it's impossible to use technology to mitigate your own implicit biases, I think it just has to be very, very carefully designed. And I honestly just don't think we're at that point yet where we're able to do that."

A lack of transparency about how most commercial algorithms work makes it hard to scrutinize their performance, he added. Tarafdar, who is leading a research project to analyze how AI can lead to unintentional workplace bias, said effective solutions cannot just pinpoint key hiring decisions but must also look at the wider workplace culture.

Bosses should also carefully consider how much data they can gather on workers before their actions slip from helping towards surveillance, she added. The real key to change is opening difficult, honest conversations about bias that can challenge misconceptions, said Allyson Zimmermann, a director of women's workplace rights organization Catalyst.

But AI tech can help to upend those preconceptions and open opportunities, she added, citing the case of a young woman who got an interview after being selected using technology that "blinded" recruiters as to her gender and age. "When she showed up for the interview, they just burst out laughing. And it wasn't, you know, a rude kind of laughing. They were so shocked that she was this young woman," she said.

"It opened their eyes; they thought they would have a middle-aged man coming in ... She went into the interview, she got the job. She told me it was an extremely positive experience."

(This story has not been edited by Devdiscourse staff and is auto-generated from a syndicated feed.)
