Google unveiled its AI-powered Search Generative Experience (SGE) at its I/O developer conference last week. It’s shaping up to be a real game-changer for Search.
It puts AI firmly front and center in Search and will dramatically change the most visited page on the internet - the search results page.
Google describes SGE as an “early step” in using generative AI to transform Search, helping you ask entirely new types of questions you didn’t think search could answer, get a quick overview of a topic with links to explore further, and ask follow-ups in conversational mode.
SGE is currently powered by Google’s MUM and a next-generation large language model (LLM) called PaLM 2.
The new Search experience
SGE will “where appropriate” show an AI-powered snapshot above the search results listings. The snapshot is clearly labelled as experimental.
Here’s an example of a question you’d normally have to break down into separate queries: "what's better for a family with kids under 3 and a dog, bryce canyon or arches?".
With this query you’re shown an AI-powered snapshot of key information with links to dig deeper and next steps. Links to search results come below the snapshot.
Below the snapshot you’ll be able to click to ask follow-up questions, either one of your own or a suggested one such as “How long to spend at Bryce Canyon with kids?”. This then takes you into conversational mode.
Conversational mode carries over context from previous questions to reformulate the query and better reflect intent, and web links below SGE will change throughout to remain relevant as the conversation progresses.
Corroboration and links
One of the big concerns about Google Bard was that it didn’t provide sources or links for the information it presented. SGE shows a thumbnail and clickable link to resources that corroborate the information in the snapshot. There’s also a toggle button at the top right that expands the snapshot to show more sources you can click on.
Google says Search ads will continue to appear with SGE, in dedicated ad slots throughout the page.
Vertical searches: shopping and local
SGE for shopping is built on Google’s Shopping Graph, which has over 35bn product listings and over 1.8bn updates every hour, meaning fresh, up-to-date information.
Google says SGE will help uncover insights to aid complex purchase decisions. For product searches it will generate a snapshot of noteworthy factors to consider and a range of product options. Product descriptions include relevant, up-to-date reviews, ratings, prices and product images.
Here’s SGE in action on a search for a bluetooth speaker for a pool party.
For a local search, Google gave the example of a comparison between two popular places for lunch in New Orleans.
The snapshot shows a direct comparison, with key information from Google Business Profile such as reviews, and a cost guide.
You can then also add a third venue to the comparison through a follow-up, which is pretty neat.
Google is using color to signal to people that SGE is a new way to interact with Search, and the color of the snapshot’s container will change dynamically. Google says this use of color will evolve over the next few months “to better reflect specific journey types and the query intent itself”.
Google acknowledges there are known limitations with generative AI and LLMs. While SGE is also built on LLMs, it has been trained to carry out tasks specific to Search.
“By constraining SGE to these specific tasks, including corroboration, we’re able to significantly mitigate some of the known limitations of LLMs, like hallucination or inaccuracies.
“We further mitigate these challenges by using our existing Search quality systems and our ability to identify and rank high-quality, reliable information.”
SGE has been trained not to show potentially harmful, hateful or explicit content and there are topics for which it won’t show a response. These are typically YMYL (Your Money or Your Life) topics such as finance, health or civic information.
“Just as we do on Search, for YMYL topics, SGE places even more emphasis on producing informative responses that are corroborated by reliable sources.”
SGE won't generate a response for explicit or dangerous topics, or where there is an information gap — that is, where quality, reliable information isn’t available.
Google has made a conscious choice of factuality over fluidity with SGE. It found giving the models leeway to create fluid, human-sounding responses resulted in a higher likelihood of inaccuracies. At the same time, human evaluators were more likely to trust fluid and conversational responses, and miss errors. For this reason, Google has deliberately constrained conversationality in SGE, meaning you’ll find it more factual than free-flowing and creative.
A radical change in search
Clearly, SGE marks a massive development in search, with the potential for transformational change and a big overhaul of Google’s search results page. The AI-generated snapshot, while it does have links, is still an amalgamation of myriad sources across the web which can’t be unpicked, even with the deeper dive toggle.
The snapshot takes up the main real estate on the page, pushing organic links further down — even more so on mobile.
There’s a lot to like in having complex queries answered quickly and simply, and then going deeper or in a slightly new direction with follow-ups. But, and it’s a big but, it’s taking us further into a world where Google is providing the answers without us accessing primary sources ourselves. And, as Google itself admits, Search doesn’t always get it right.
You can sign up for the waitlist for early access to SGE at Google Search Labs. Google says access will start in the coming weeks.
For more detail, see Google's overview of SGE.