
Google explains how it got Gemini image generation ‘wrong’

Google announced yesterday that it disabled Gemini's ability to generate images of people following criticism over historically inaccurate depictions. The company today issued a more detailed explanation.

Following two statements over the course of this week, Google is now out with a longer blog post penned by Prabhakar Raghavan, who is Senior Vice President of Search, Assistant, Ads, Geo (Maps), and other related product areas.

Google says it does “not want Gemini to refuse to create images of any particular group,” or “create inaccurate historical — or any other — images.”

…if you prompt Gemini for images of a specific type of person — such as “a Black teacher in a classroom,” or “a white veterinarian with a dog” — or people in particular cultural or historical contexts, you should absolutely get a response that accurately reflects what you ask for.

Google identified two issues as being responsible:

  • “…our tuning to ensure that Gemini showed a range of people failed to account for cases that should clearly not show a range.”
  • “…over time, the model became way more cautious than we intended and refused to answer certain prompts entirely — wrongly interpreting some very anodyne prompts as sensitive.”


This led Gemini’s image generation feature, which is powered by Imagen 2, to “overcompensate in some cases, and be over-conservative in others.”

As previously stated, Google “will work to improve it significantly before turning it back on” for people-related prompts and conduct “extensive testing.” 

Of note in this blog is how Google distinguishes Gemini and Search: “we recommend relying on Google Search, where separate systems surface fresh, high-quality information on these kinds of topics from sources across the web.”

The blog post ends on the following note:

I can’t promise that Gemini won’t occasionally generate embarrassing, inaccurate or offensive results — but I can promise that we will continue to take action whenever we identify an issue. AI is an emerging technology which is helpful in so many ways, with huge potential, and we’re doing our best to roll it out safely and responsibly.


You’re reading 9to5Google — experts who break news about Google and its surrounding ecosystem, day after day.


Author

Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet. Tips/talk: abner@9to5g.com
