AI Voice Assistants Reinforce Gendered Stereotypes
In 2024, the number of AI voice assistants in use worldwide exceeded 8 billion, and most ship with female voices and names, researchers and activists report. Studies from 2020–2025, along with incidents such as Microsoft’s Tay and Brazil’s Bradesco chatbot, document high rates of verbal and sexual abuse directed at these assistants, suggesting that such design choices normalize gendered subordination. Advocates urge gender-impact assessments, stronger regulation, and accountable design to keep these systems from reinforcing real-world misogyny.



