AI Is Overhauling Google Search: How It Works and How to Opt Out

People turn to the internet to run billions of search queries each year. These range from keeping tabs on world events and celebrities to learning new words and getting DIY help.

Authors

  • T.J. Thomson

    Senior Lecturer in Visual Communication & Digital Media, RMIT University

  • Ashwin Nagappa

    Postdoctoral Research Fellow, Queensland University of Technology

  • Shir Weinbrand

    PhD Candidate, Digital Media Research Centre, ADM+S Centre, Queensland University of Technology

One of the most popular questions Australians recently asked was: "How to inspect a used car?"

If you asked Google this at the beginning of 2024, you would have been served a list of individual search results, and the order would have depended on several factors. If you asked the same question at the end of the year, the experience would have been completely different.

That's because Google, which controls about 94% of the Australian search engine market, introduced "AI Overviews" to Australia in October 2024. These AI-generated search result summaries have revolutionised how people search for and find information. They also have significant impacts on the quality of the results.

How do these AI search summaries work, though? Are they reliable? And is there a way to opt out?

Synthesising the internet

Legacy search engines work by evaluating dozens of different criteria and trying to show you the results that they think best match your search terms.

They take into account the content itself, including how unique, current and comprehensive it is, as well as how it's structured and organised.

They also consider relationships between the content and other parts of the web. If trusted sources link to content, that can positively affect its placement in search results.

They try to infer the searcher's intent - whether they're trying to buy something, learn something new, or solve a practical problem. They also consider technical aspects such as how fast the content loads and whether the page is secure.

All of this adds up to an invisible score for each webpage that affects its visibility in search results. But AI is changing all this.
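Before looking at how, here's a purely illustrative sketch, in Python, of the traditional scoring idea. The signal names, weights and ratings below are invented for this example and bear no relation to Google's actual signals or algorithm.

    # Toy example only: combine a few invented ranking signals into one score.
    # Real search engines weigh hundreds of signals with far more nuance.
    def toy_page_score(page):
        weights = {
            "content_quality": 0.4,  # uniqueness, currency, comprehensiveness
            "trusted_links": 0.3,    # links from trusted sources elsewhere on the web
            "intent_match": 0.2,     # fit with what the searcher is trying to do
            "technical": 0.1,        # load speed, security and similar factors
        }
        return sum(weights[signal] * page.get(signal, 0.0) for signal in weights)

    # A hypothetical page rated from 0 to 1 on each signal; higher totals rank higher.
    print(toy_page_score({"content_quality": 0.8, "trusted_links": 0.6,
                          "intent_match": 0.9, "technical": 1.0}))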

Google is the only search engine that prominently displays AI summaries on its main results page. Bing and DuckDuckGo still use traditional search result layouts, offering AI summaries only through companion apps such as Copilot and Duck.ai.

Instead of directing users to one specific webpage, generative AI-powered search looks across webpages and sources to try to synthesise what they say. It then tries to summarise the results in a short, conversational and easy-to-understand way.
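As a rough illustration of that "retrieve, then summarise" pattern, here is a minimal Python sketch. The sites and snippets are made up, and no real model is called; in an actual system, the assembled prompt would be handed to a large language model that writes the conversational summary shown above the links.

    # Toy example only: hard-coded stand-ins for pages a search engine might retrieve.
    snippets = [
        ("example-mechanics-site.com", "Check the tyres, brakes and service history."),
        ("example-motoring-blog.com", "Look for rust and take a test drive before buying."),
    ]

    # Combine the retrieved sources into one prompt for a summarisation model.
    prompt = "Using the sources below, answer: how do I inspect a used car?\n"
    for site, text in snippets:
        prompt += f"- {text} (source: {site})\n"

    # In a real AI search pipeline this prompt would now be sent to the model;
    # here we simply print it to show what the model would be asked to condense.
    print(prompt)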

In theory, this can result in richer, more comprehensive, and potentially more unique answers. But AI doesn't always get it right.

How reliable are AI searches?

Early examples of Google's AI-powered search from 2024 suggested users eat "at least one small rock per day" - and that they could use non-toxic glue to help cheese stick to pizza.

One issue is that machines are poorly equipped to detect satire or parody, and can end up drawing on such material instead of fact-based sources when answering.

Research suggests the rate of so-called "hallucinations" - instances of machines making up answers - is getting worse even as the models driving them are getting more sophisticated.

Machines can't actually determine what's true and false. They cannot grasp the nuances of idioms and colloquial language and can only make predictions based on fancy maths. But these predictions don't always end up being correct, which is an issue - especially for sensitive medical or health questions or when seeking financial advice.

Rather than just present a summary, Google's more recent AI overviews have also started including links to sources for key aspects of the answer. This can help users gauge the quality of the overall answer and see where AI might be getting its information from. But evidence suggests sometimes AI search engines cite sources that don't include the information they claim they do.

What are the other impacts of AI search?

AI search summaries are transforming the way information is produced and discovered, reshaping the search engine ecosystem we've grown accustomed to over two decades.

They are changing how information-seekers formulate search queries - moving from keywords or phrases to simple questions, such as those we use in everyday conversation.

For content providers, AI summaries introduce significant shifts - undermining traditional search engine optimisation techniques, reducing direct traffic to websites, and impacting brand visibility.

Notably, 43% of AI Overviews link back to Google itself. This reinforces Google's dominance as a search engine and as a website.

The forthcoming integration of ads into AI summaries raises concerns about the trustworthiness and independence of the information presented.

Where to from here?

People should always be mindful of the key limitations of AI summaries.

Asking for simple facts such as, "What is the height of Uluru?" may yield accurate answers.

But posing more complex or divisive questions, such as, "Will the 2032 Olympics bankrupt Queensland?", may require users to open links and delve deeper for a more comprehensive understanding.

Google doesn't offer a clear option to turn this feature off entirely. Perhaps the simplest way is to click on the "Web" tab under the search bar on the search results page, or to add "-ai" to the search query. But this can get repetitive.

A more technical option is to manually create a site search filter through Chrome's settings, as sketched below. But this still requires an active step from the user.
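For example, at the time of writing, Google's "Web" filter corresponds to the udm=14 URL parameter, so one commonly shared workaround is to add a custom site search in Chrome (Settings > Search engine > Manage search engines and site search) along these lines and, optionally, set it as the default. Menu labels and the parameter itself may change, so treat this as a sketch rather than an official setting:

    Name:     Google (web results only)
    Shortcut: @web
    URL:      https://www.google.com/search?q=%s&udm=14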

As a result, some developers are offering browser extensions that claim to strip AI Overviews from results pages. Other users are switching search engines entirely, turning to providers such as Bing and DuckDuckGo that don't display AI summaries on their main results pages.


T.J. Thomson receives funding from the Australian Research Council. He is affiliated with the ARC Centre of Excellence for Automated Decision Making & Society.

Ashwin Nagappa receives funding from the Australian Research Council. He is a Postdoctoral Research Fellow in the QUT node of the ARC Centre of Excellence for Automated Decision Making & Society.

Shir Weinbrand receives funding from the Australian Research Council. She is a PhD candidate in the QUT node of the ARC Centre of Excellence for Automated Decision Making & Society.

Courtesy of The Conversation. This material from the originating organization/author(s) may be of a point-in-time nature and may have been edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).