
Google says the wild mistakes in AI Overviews are just ‘isolated examples’

Google launched its AI Overview experience in Search to the general public last week and, already, results are going off the rails. Google, though, says that these are just “isolated examples.”

AI Overviews in Google Search work by using generative AI to craft a summary or direct answer to a user’s query. The basic idea is to condense the information from many different sources to give the end user something quick and easy to digest.

But, almost immediately, it’s gone wild.

Scrolling through social media, it’s very easy to find examples of Google Search AI Overviews confidently spitting out provably incorrect information. This includes telling users to put glue on their pizza, claiming chicken only needs to be cooked to 102 degrees Fahrenheit, saying blinker fluid is a real thing, telling people they can put out an oil fire by adding more oil, and asserting that no country in Africa starts with the letter “K.” And it’s doing all of that while lifting content directly from other websites without even linking back to them.

Of course, as with anything, the bad examples aren’t always the most common ones. For every bad result we’re seeing from AI Overviews, it’s very likely many other good ones are coming out. That’s what Google says, anyway.

Speaking to The Verge about these failures, Google said the mistakes circulating online are “generally very uncommon queries, and aren’t representative of most people’s experiences,” adding that the company is using the “isolated examples” to refine the new experience. Hopefully, that will mean these results get better with time.


Update: In a statement to 9to5Google, Google reiterates that “uncommon queries” are producing these results in AI Overviews and adds that it “appreciate[s] the feedback” from users.

The vast majority of AI Overviews provide high quality information, with links to dig deeper on the web. Many of the examples we’ve seen have been uncommon queries, and we’ve also seen examples that were doctored or that we couldn’t reproduce. We conducted extensive testing before launching this new experience, and as with other features we’ve launched in Search, we appreciate the feedback. We’re taking swift action where appropriate under our content policies, and using these examples to develop broader improvements to our systems, some of which have already started to roll out.


AI Overviews are now the default experience in Google Search, though they don’t appear for every query you run.

9to5Google’s Take

AI in Search was always going to produce problems like these. Between pulling answers from sarcastic Reddit threads and the hallucination so common to all generative AI, the examples floating around were inevitable. Still, to see such glaring errors so quickly, it’s hard not to feel that making AI Overviews the default experience was a huge misstep, especially given they literally cannot be turned off.

At least there will be ads to break up the misinformation soon.


Follow Ben: Twitter/X, Threads, and Instagram


Author

Ben Schoon

Ben is a Senior Editor for 9to5Google.

Find him on Twitter @NexusBen. Send tips to schoon@9to5g.com or encrypted to benschoon@protonmail.com.

