Google halts AI system Gemini’s image generation of humans following historical errors


Google has announced a temporary suspension of part of its flagship generative AI tool, Gemini, while it works to improve the tool’s historical accuracy. Specifically, the company has paused Gemini’s ability to generate images of people following recent issues with historical inaccuracies.

“We’re already working to address recent issues with Gemini’s image generation feature,” the company said. “While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”

Google officially launched the Gemini image generation tool earlier this month, but in recent days it has faced criticism for producing historically incongruous images, including depictions of the Founding Fathers as American Indian, black, or Asian, which drew backlash and even ridicule.

In a post on LinkedIn, venture capitalist Michael Jackson joined the criticism, labeling Google’s AI as “a nonsensical DEI parody”, with DEI standing for ‘Diversity, Equity, and Inclusion’.


In a statement released on X, Google acknowledged the inaccurate historical depictions and promised to address them immediately. It also noted that the tool is designed to generate a wide range of people, which it described as generally a good thing, but conceded that it had missed the mark in this case.

Generative AI tools such as Gemini produce their outputs from training data and other parameters. They have often been criticized for biased and stereotypical results, such as sexualized imagery of women or a consistent tendency to portray white men in high-status job roles.

A previous AI image classification tool created by Google sparked outrage in 2015 when it misclassified black men as gorillas. The company promised to fix the issue, but its solution was simply to block the technology from recognizing gorillas at all.

Dylan Williams

Dylan Williams is a multimedia storyteller with a background in video production and graphic design. He has a knack for finding and sharing unique and visually striking stories from around the world.

