Should I Worry About... Gender-biased Artificial Intelligence?

GP writes: My first entry in a CGTN Europe series, Should I Worry About…? (The answer’s usually “Yes”.)

What’s the problem?

Artificial Intelligence is artificially male. The World Economic Forum’s latest Global Gender Gap Report notes that only 22 percent of AI professionals are female. And while AI may seem futuristic to some, it’s already gone from science fiction to societal fact: Adobe says that 88 percent of European companies will use AI for customer analytics by 2020, while Servion predicts that AI will drive a whopping 95 percent of customer interactions by 2025.

Sadly, research firm Gartner calculates that by 2022, 85 percent of AI projects will deliver erroneous results due to bias in data, in algorithms, or in the teams that manage and build them. The problem lies not so much with AI itself, which is agnostic, as with how it is built by humans, who carry their own in-built biases. AI is humanity’s child, but its parents are overwhelmingly male.

What’s the worst that could happen?

A world built on an infrastructure of gender inequality. Some may say this already exists, but having it run by a biased AI will only exacerbate it. “If we don’t build better and more transparent systems, we’ll build in bias and we’ll build in unfairness and that will disadvantage people,” says Gina Neff, a professor at the Oxford Internet Institute at the University of Oxford. “It will look like it’s technologically neutral and transparent but it won’t be.”

Think, for example, of virtual assistants such as Alexa, Google Home and Siri. By default, they’re usually ‘female’. University of Southern California sociology professor Safiya Umoja Noble calls this “a powerful socialization tool that teaches us about the role of women (...) to respond on demand.”

What do the experts say?

“An algorithm is an opinion expressed in code,” says Ivana Bartoletti, founder of the Women Leading in AI network. “If it’s mostly men developing the algorithm, then of course the results will be biased… you’re teaching the machine how to make a decision.” Neff warns that women are less likely “to be represented in news articles, to be the subject of political news, to be contacted as sources of information, to be on Wikipedia, to be in medical data. And so when you’re building technologies that scan for massive sources of information, we’re simply leaving out information about women and information created by women.”

So should I worry?

Yes, particularly if you’re a woman or have ever known one. “We need to do social systems analysis for how data systems and AI will continue to influence people for decades,” says Neff, who fears that even seeking gender equality in AI appointments wouldn’t fully solve the problem. “Having a woman in the room is a good start, but it’s not enough. We really need to be building systems for all people, across a whole array of divisions in society. If we don’t do these kinds of social, structural, systemic, holistic analysis of information and data we’re going to end up with biased systems, and those systems are going to fail people.”

Originally published by CGTN Europe, 19 Sep 2019
