Google introduced its MUM technology earlier this year. Billed as 1,000 times more powerful than BERT, it’s the next generation of tech in Search, aiming to answer complex queries with fewer searches for the user and working in a multimodal way, across multiple formats.
Up to now, we’ve had few details of exactly what MUM can do, but in its Search On event last week, Google showcased some capabilities, giving us a glimpse of what will be possible with MUM and how it’s likely to change the way users search.
Google says a new way to search visually, with the ability to add questions to what you see, will be rolling out in the next couple of months.
“By combining images and text into a single query, we’re making it easier to search visually and express your questions in more natural ways.”
They gave a couple of examples of this in action.
In this example, you’d like socks with the same pattern as the shirt, which might be tricky to find. Just tap on the shirt image in Lens and type in your query.
Here, you’re looking for a spare part for your bike. Again, tap on the image, enter your query, and you can find the exact moment in a video that shows you how.
Redesigned search page
Google says MUM is being used to redesign Google Search.
A new Things to know feature makes it easier to understand and explore new topics. In Google’s example, say you’re interested in finding out how to create acrylic paintings.
When you search on this term, “Google understands how people typically explore this topic, and shows the aspects people are likely to look at first”. For example, Google says its systems can identify more than 350 topics related to acrylic painting and help you find the right path.
Google says MUM will unlock deeper insights in the future, showing you content you wouldn’t have otherwise found.
Refine and broaden your search: new features help you further explore ideas, making it easy to zoom in and out of a topic.
These features will launch in the coming months.
Identify related topics in video
Another application of MUM technology is in video, where Google is using it to identify related topics, enabling you to dig deeper.
“Using MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of information in the video.”
Google’s example shows that while the video doesn’t say “macaroni penguin’s life story,” its systems understand that it’s a related topic and so will suggest the query.
The first version of this feature will roll out in the coming weeks, with more visual enhancements being added over time.
Google is changing search
These previews show SEOs and marketers what to expect from MUM in search, and highlight different opportunities for visibility in the SERPs.
Our separate blog post, Google’s new technologies LaMDA and MUM, explores the implications of Google’s new AI and what this means for search. With these previews we have a better idea of how MUM will change search and the way users query and access information online. Google frames this as being more helpful, surfacing useful information including topics the searcher may not even have thought of. At the same time, however, it also puts more control in Google’s hands to manage and shape the process of search and the delivery of results.