6 best TED talks on gender bias and artificial intelligence

People are filled with biases and prejudices. All of our decisions and actions are shaped by our imperfect view of the world we live in. But can artificial intelligence be biased as well? Yes, it can. While AI can make decisions more efficiently and, in principle, more impartially than we do, it is not a clean slate.

A 2018 study by MIT and Microsoft researchers on gender and race discrimination in machine learning algorithms found that prominent facial recognition software has higher error rates when presented with images of women, and the error rates climb even higher when the subject is a woman with darker skin. Beyond facial recognition, there are numerous other use cases, such as voice and speech recognition, where AI applications perform worse for women.

Artificial intelligence is only as good as its data and as the way its creators have programmed it to think, decide, learn, and act. It may inherit or even amplify the prejudices of its makers, who are often unaware of them, or it may learn from biased data. The effects of such technology can change lives: even when bias is not deliberately written into the code, an AI trained on data that reflects past discrimination can widen the existing gaps in recruiting and promoting women and people of color in the workplace.
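
To make that concrete, here is a minimal, purely illustrative Python sketch. The dataset, feature names, and numbers are all invented; the point is only that a model trained on biased hiring history reproduces that bias, even though nothing in the code expresses a preference explicitly.

```python
# Minimal sketch with invented data: a model trained on biased hiring
# history reproduces the bias, even though the code itself is "neutral".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical features: years of experience and gender (1 = woman, 0 = man).
experience = rng.uniform(0, 10, n)
gender = rng.integers(0, 2, n)

# Historical labels: past hiring favored experienced men only.
hired = ((experience > 5) & (gender == 0)).astype(int)

model = LogisticRegression().fit(np.column_stack([experience, gender]), hired)

# Two equally experienced candidates who differ only by gender.
candidates = np.array([[7.0, 0], [7.0, 1]])
print(model.predict_proba(candidates)[:, 1])  # the woman scores far lower
```

Nobody "programmed" the discrimination here; the model simply learned the pattern that was already present in its training labels.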

Shall we look at some recent examples of gender bias in AI?

In 2018, an employer advertised a job opening in a male-dominated industry on Facebook, and the social media giant’s ad algorithm showed the ad almost exclusively to men in order to maximize the number and quality of applicants. [Facebook accused of bias against women in job ads]

Amazon spent years building an AI hiring tool by feeding it the resumes of top candidates. The tool’s job was to review applicants’ resumes and recommend the most promising ones. Because the industry is male-dominated, most of the resumes used to train the AI came from men, which ultimately led the tool to discriminate against women: it downgraded resumes that included the word “women’s” or mentioned an education at a women’s college. After several unsuccessful attempts to correct the algorithm, the company finally had to scrap the tool because the bias could not be “unlearned.” [Amazon Reportedly Killed an AI Recruitment System Because It Couldn’t Stop the Tool from Discriminating Against Women]
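
The mechanism is easy to reproduce in miniature. The toy sketch below uses four invented resumes (not Amazon’s data or code) to show how a text classifier trained on male-dominated historical outcomes can end up assigning a negative weight to the word “women’s” entirely on its own.

```python
# Toy sketch with invented resumes: a text screener trained on
# male-dominated historical outcomes learns a negative weight for "women".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "software engineer chess club captain",          # historically hired
    "software engineer rugby team captain",          # historically hired
    "software engineer women's chess club captain",  # historically rejected
    "software engineer women's coding society",      # historically rejected
]
labels = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
clf = LogisticRegression().fit(X, labels)

# Inspect the learned weight for the token "women".
weights = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
print(weights["women"])  # negative: the word alone lowers a resume's score
```

This also hints at why such bias is so hard to “unlearn”: removing one obvious word can simply push the model toward other terms that correlate with gender.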

Explanations of AI bias often stop at biased training data. In reality, however, bias can creep in long before the data is collected and at many other stages of the deep learning pipeline, starting with how engineers frame the problem and decide what the model should be looking for.
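
One of those later stages is evaluation: deciding what to measure. The hypothetical sketch below (the group sizes and accuracy figures are made up) shows how a single overall accuracy number can hide a large gap between groups, the same kind of gap the facial recognition study above uncovered.

```python
# Hypothetical sketch: an aggregate accuracy number can hide very
# different error rates per group, so the choice of metric matters.
import numpy as np

rng = np.random.default_rng(1)

# Invented evaluation set: 900 images from group A, 100 from group B.
group = np.array(["A"] * 900 + ["B"] * 100)

# Imagine a classifier that is right 99% of the time for group A
# but only 70% of the time for group B.
correct = np.concatenate([
    rng.random(900) < 0.99,
    rng.random(100) < 0.70,
])

print("overall accuracy:", correct.mean())  # ~0.96, looks fine on paper
for g in ("A", "B"):
    print(f"accuracy for group {g}:", correct[group == g].mean())  # the gap appears
```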

To learn more about this topic, we recommend watching these videos.

Technologist Kriti Sharma explores how the lack of diversity in tech creeps into our AI and offers three ways we can begin to create more ethical algorithms.

While working with facial analysis software, Joy Buolamwini noticed a problem: the software did not detect her face, because the people who coded the algorithm had not taught it to identify a broad range of skin tones and facial structures. She is now on a mission to fight bias in machine learning.

Documentary filmmaker Robin Hauser argues that we need a conversation about how to govern AI and about who is responsible for overseeing the ethical standards of these supercomputers. “Now we have to figure this out,” she says, “because once skewed data enters deep learning machines, removing it is very difficult.”

Social scientist Safiya Umoja Noble investigates the bias hidden in search engine results and explains why we should be skeptical of the algorithms we rely on every day.

Techno-sociologist Zeynep Tufekci explains how intelligent machines can fail in ways that do not fit human error patterns, in ways we will neither expect nor be prepared for.

Last but not least, let’s hear about gender and racial bias and inclusiveness in AI from Gunay Kazimzade. This talk was given at a TEDx event organized independently by a local community using the TED conference format.