Episode 12: Hey Siri, Does AI have gender and diversity bias?
This pod’s topic is a unique one, mixing technology with human bias. For decades the words “Artificial Intelligence” evoked aliens, machines taking over the human race, and even action movies like Avengers: Age of Ultron. In the tech world we’ve seen Siri and Alexa gain their fair share of the market, and corporations are scrambling to strategize how they can best use AI. But has technology really become smart enough to let machines make decisions meant for humans?
Join us with special guest Rishi Behari as we discuss this in detail.
Inspiration:
We talked a lot this episode about gender bias, but AI can pose a problem for applicants with disabilities as well. We’re linking an article from MIT Technology Review that dives into the challenges that can arise, including requesting reasonable accommodations during the application process.
At the end of 2020, a group of senators wrote a letter to the chair of the Equal Employment Opportunity Commission asking about its oversight authority over AI screening and hiring tools. Specifically, they argued that “The Commission is responsible for ensuring that hiring technologies do not act as built-in headwinds for minority groups.” This led to the EEOC affirming that AI hiring tools do require oversight, as reported by Bloomberg Law.
Related Resources:
Rishi mentioned a playbook created by the University of Chicago that dives deeper into algorithmic bias. Check out the full playbook on their site here.
We’re wondering why we never noticed that all our digital assistants, like Alexa and Siri, have female voices. Looks like Amazon noticed the trend too; we’re happy to see there’s a new masculine voice option for the Alexa virtual assistant (although we’re not sure how to feel about the name Ziggy…).
About our Special Guest Rishi Behari: