
Exploring AI: Combating Racial Bias in Image-Generating Technology

This week in AI, Google paused its AI chatbot Gemini’s ability to generate images of people after a segment of users complained about historical inaccuracies. Google’s ginger treatment of race-based prompts in Gemini didn’t avoid the issue, per se — but disingenuously attempted to conceal the worst of the model’s biases. Yes, the data sets used to train image generators generally contain more white people than Black people, and yes, the images of Black people in those data sets reinforce negative stereotypes. That’s why image generators sexualize certain women of color, depict white men in positions of authority and generally favor wealthy Western perspectives. Whether they tackle — or choose not to tackle — models’ biases, they’ll be criticized.