Gender Bias in Artificial Intelligence
Everyone reading this uses a mobile phone or a laptop; chances are you are viewing this post on one right now. Despite all our modernization, technical advancement, and progressive societies, gender bias and “otherness” remain quite obvious in Artificial Intelligence. For example, the voice assistants that are now part of most of our lives propagate the same objectification observed in advertisements and movies. “Siri,” “Alexa,” “Cortana”: familiar with these applications? All use a female voice to assist you with daily tasks. We can all agree that artificial intelligence has shaped our lives and made them easier, yet a self-fulfilling prophecy of gender bias is propagated through it. The way AI bots and voice assistants reinforce harmful gender stereotypes is one area that needs further consideration. Gendered identities, voices, and looks are used by many customer-facing automated systems around the world, including computerized hotel personnel, waiters, bartenders, security officers, and caregivers. In the United States, Siri, Alexa, Cortana, and Google Assistant, “which together account for 92.4 percent of the smartphone assistant market share in the United States, have generally used female-sounding voices” (BBC).
Because of their early acceptance and widespread use in the mass consumer market, voice assistants serve as a practical illustration of how AI bots elicit basic critiques of gender representation, and of how software firms have handled these issues. The history of voice assistants, gender prejudice, the diversity of the IT workforce, and recent developments in the gender depictions of voice assistants are all open for discussion.