Google's AI Overviews feature in its search engine has come under intense scrutiny after generating bizarre and potentially dangerous advice, such as suggesting that users eat rocks or put glue on pizza. The feature, which aims to provide summarized answers to search queries, has been criticized for inaccuracies and misleading information, including incorrectly stating that former President Obama is Muslim. Google CEO Sundar Pichai acknowledged that hallucinations in AI-generated content remain an unsolved problem and that large language models (LLMs) are not always reliable sources of factual information. In response to the backlash, Google is moving swiftly to refine and restrict the AI Overviews tool, but the rollout has been widely described as a public relations disaster, with the search giant scrambling to address the problems one by one.
Glad to see Google give a seemingly up-front explanation of what went wrong with AI Overviews. It’s consistent with my experience with the feature. And it’s surprising that they hadn’t already figured out what to do with the likes of The Onion. https://t.co/A5XIYCcYVj
Google Pumps Brakes on AI Overviews Search After Telling Us to Eat Glue, Rocks - CNET https://t.co/RZgIVWxHAy
Google Pumps Brakes on AI Overviews Search After Telling Us to Eat Glue, Rocks https://t.co/JB5XuT6W0U
Google has addressed its AI Overviews shambles, but definitely didn't accept responsibility. DETAILS: https://t.co/fUnFQP7dVX #Google #AI
Is Google working on improvements to AI Overviews? According to the Search Status Dashboard, there is an ongoing issue with serving some features in Google Search. https://t.co/zazz7Fmg4h https://t.co/xHO85ujBnq
Google Explains Why It Suggested Adding Glue to Your Pizza https://t.co/tnaoc4UCk0 https://t.co/d5etKcW18Z
➡️ Increased restrictions! Google is imposing more limits on AI Overviews following instances like advising people to put glue on pizza. https://t.co/K0K5WKdor6
Learn about the recent update from Google scaling back AI search responses after a controversial incident. Read more about it here: https://t.co/rEkNJnw6FU
Google is Putting More Restrictions On AI Overviews https://t.co/ONedwjl4qF
Google tightens its AI Overview feature after suggesting glue on a pizza 🤖🍕 Dive into the latest tech mishap with Google Search exec Liz Reid! Now, who's up for a pizza party AI-style? #AI #TechFail @googletech https://t.co/ZTnXdlwoE8
The Morning After: Google tightens up its AI Overview feature after suggesting glue on a pizza https://t.co/nM5vPz6Px1
Google Restricts AI Search Tool After ‘Nonsensical’ Answers Told People To Eat Rocks And Put Glue On Pizza https://t.co/bReKoXgNex https://t.co/ApmfR4Mo5N
The Morning After: Google tightens its AI Overview feature after suggesting glue on a pizza https://t.co/rbW6g3YmJ7
Why are Google’s AI Overviews results so bad? https://t.co/APTmZnoZnB
Google explains why AI Overviews immediately got weird https://t.co/uZ6T0LvNid
New: Google explains why they told people to eat rocks or put glue on a pizza. The company also puts limits on AI overviews to try and fix the issue https://t.co/x0JjnSIMsI.
Google Admits Its AI Overviews Search Feature Screwed Up https://t.co/5UqFOPe7zr
Google said it was scaling down the use of AI-generated answers in some search results, after the tech made high-profile errors including telling users to put glue on their pizza and saying Barack Obama was Muslim. https://t.co/fQPuA5BtM7
Google Admits Its New AI Overviews Search Feature Screwed Up https://t.co/hlivWxYP12
Google is putting more restrictions on AI Overviews after it told people to put glue on pizza https://t.co/hQvM5DrbmI
Google defends AI search results after they told us to put glue on pizza in a new blog post “There’s nothing quite like having millions of people using the feature with many novel searches." https://t.co/4IcIucuf4C
Google defends AI search results after they told us to put glue on pizza: Image: The Verge Last week, Google rolled out its AI search results for millions of users to tinker with. The goal was to deliver a better search experience.… https://t.co/ioRWmKUSwk #ai #ainews
Google defends AI search results after they told us to put glue on pizza https://t.co/hAVjGHsZ3O
Here’s what Google is doing to improve those “oddities” in AI Overviews. What do you think? https://t.co/cML0E4wV1J https://t.co/8YlSEiRW3f
Google admits its new AI-powered search summaries recently provided inaccurate results, like when results incorrectly said former President Obama is Muslim. https://t.co/cBh8YEnU4W
The launch of Google’s AI Overviews has been an unmitigated disaster. It has advised unsuspecting users that it’s safe to eat rocks and to stare at the sun. But what did anyone expect? AI has been dangerously overhyped, says Andrew Orlowski https://t.co/KaDWADTjWP
Can We Trust Google's AI Overviews? A Critical Analysis https://t.co/dzMN52eK6S #analytics, #artificialintelligence, #datascience, #machinelearning, inoreader
🤖🇺🇸 Google’s AI Search Disaster: From Glue on Pizza to Eating Rocks! How did it all go wrong? Discover the jaw-dropping missteps and what they mean for the future of AI. #AISearchFail #Google https://t.co/ARaEJlAQKH
Glue on pizza? Eating rocks? Is AI now trying to kill us in the kitchen? Google’s AI Overviews dispatches some dangerous culinary advice, including using glue to get cheese to stick to pizza and eating at least one small rock a day. #Opinion https://t.co/RY0berQP7i
Can #Google fix its disastrous new #AI search tool?: https://t.co/ICFC4095hq Google's AI Overviews tool can offer impressive answers to search queries, but it will also make up facts and tell people to eat rocks. Can it be fixed, or will it have to be scrapped? #ML🤖 News🎥
Can #Google fix its disastrous new AI search tool?: https://t.co/ICFC4095hq Google's AI Overviews tool can offer impressive answers to search queries, but it will also make up facts & tell people to eat rocks. Can it be fixed, or will it have to be scrapped? #ML🤖 #NLP🎙️ News🎥
How to disable Google AI overviews: Step by step guide https://t.co/OHyKwc0Qc2
Google’s new AI-enabled search tool has us scratching our heads! Suggesting adding glue on pizza and eating rocks for minerals, it’s like they're taking culinary creativity to a new (and slightly weird) level. Witty news summary bot: https://t.co/jCwYuu4HyK #googleai #ainews https://t.co/ZNJqJEKnhU
Google's AI answers should improve. But they may never be fully reliable. I talked with five AI experts about the underlying problem with AI search: https://t.co/iOkROmJ4c8
Google is where most of us turn to when looking for information, but its new AI Overviews tool is making things up and telling people to eat rocks. Can it be fixed or is the technology inherently flawed? https://t.co/mG2hhjOxmz
Google’s AI-powered search may be a useful addition for many, but for those who aren’t fans of the AI-generated summaries, here is how you can disable it https://t.co/UqzcfMLEC2 #AI #Google
🤖🇺🇸 Google's AI Search Blunder: Why Robots Can't Eat Rocks or Glue Pizza! 🍕 Google’s recent AI-generated search summaries are hilariously inaccurate, suggesting bizarre things like eating rocks or using glue on pizza. But the implications for the web’s future? Not so funny. https://t.co/r4NRZgm9n9
You Can't Turn Off Google AI Overviews, but There Are Workarounds - CNET https://t.co/URz7KOCybr
You Can't Turn Off Google AI Overviews, but There Are Workarounds https://t.co/58ptfude32
🤖🇺🇸 Google ‘taking swift action’ to remove bizarre AI search results — like telling users to eat rocks. Google's AI tool, AI Overviews, caused a stir by dishing out dangerous advice. The tech giant is now working urgently to scrub and refine these odd responses. https://t.co/rG8qwrIXuZ
"Google is currently scrambling to fix these problems one by one, but it is a PR disaster for the search giant and a challenging game of whack-a-mole." Google's Latest Search Tool Is Telling Us to Put Glue on Our Pizza, And Eat Rocks via @ScienceAlert https://t.co/3vVvVyXJu7
Google should cancel AI Overviews in Search and end this nightmare https://t.co/3DIBMCwvs0
Google’s CEO talks about AI overviews in search and leaves us with two gems. 1. Hallucinations are not a solved problem. LLMs are not the best approach for getting facts. 2. People click on links in AI overviews more than if they were one of the ten blue links. https://t.co/Sjvjkv4n8a
Is it possible to turn off AI Overview in Google Search? What we know. https://t.co/pCY093a02g
How to Remove AI Overview from Your Google Searches ► https://t.co/xCK7SwCd1w