The Google BERT update and what it means for your rankings

Posted by Owen Powis on 25 Oct, 2019
Google has announced a major update introducing Google BERT, “representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”

So just what is Google BERT?

Here at Wordtracker we've been integrating Natural Language Processing into our toolset for some time with Inspect. This is powered by Open Calais, which allows us to better understand the meaning of content and so match keywords to it even when they don't appear directly in the content itself.

Natural Language Processing is really important for search, and Google BERT is a massive step forward in this area. BERT stands for Bidirectional Encoder Representations from Transformers and is a new Natural Language Processing technique that Google has developed. The ‘T’ is the key here.

The transformer in BERT is a model that processes words as part of a whole sentence. Each word is looked at in the context of the whole sentence, which provides a much better overall understanding than looking at the words one by one.
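
To make that a little more concrete, here's a minimal sketch using the open-source BERT model Google has released, via the Hugging Face transformers library (an illustration only, not Google's live ranking system). It shows that BERT gives the same word a different vector depending on the sentence around it, which is the bidirectional context described above.

```python
# Minimal sketch: contextual word vectors from the open-source BERT model.
# Uses the Hugging Face `transformers` library and the public
# bert-base-uncased checkpoint; an illustration, not Google's ranking stack.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence, word):
    """Return the contextual embedding BERT assigns to `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    position = tokens.index(word)
    return outputs.last_hidden_state[0, position]

# The same word, two very different meanings.
stand_physical = word_vector("please stand to the side", "stand")
stand_furniture = word_vector("the lamp is on the stand", "stand")

# Well below 1.0: BERT does not treat the two 'stand's as the same thing.
print(torch.cosine_similarity(stand_physical, stand_furniture, dim=0).item())
```

A simple bag-of-words model would give 'stand' the same representation in both sentences; the bidirectional model lets the surrounding words change it.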

This matters for Search because Google can now better interpret the intent, or true meaning, of a query and so display more relevant results. In other words, it can understand what someone wants even if they don't quite know the best way to ask for it.

BERT is being applied to both ranking results and featured snippets. Meanwhile, Google is coming under increasing pressure over its use of featured snippets, which may explain some of its reluctance to comply.

Google expects this to impact around 1 in 10 searches, so as much as 10% of rankings could change as the update rolls out. However, it seems to be targeted at more complex queries, so it is unlikely to affect core terms driving large volumes of traffic. You are more likely to be affected if you rely on question-driven terms or on terms driven by long-tail searches.

“Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”

https://www.blog.google/products/search/search-language-understanding-bert/

Here’s an example search and how it will be affected:

The old result is just finding information about the query, whereas the new result is more focused on trying to solve the searcher's potential problem.

Here’s another example of BERT better grasping the nature of the searcher's request:

This is impressive, as BERT has linked the term ‘stand’ to the physical activity and so shown a highly relevant result. Interestingly, there does not appear to be a matching result currently available:

So where Google cannot find a keyword match, BERT manages to understand what someone might have been looking for and displays what it thinks is most likely to be a relevant result.

In fact, with a little rewording we can get the same result that BERT does:


The same result is in the top position for both queries. Content that already ranks well may see additional traffic from these previously ‘lost’ searches. Your article may always have provided the right answer; Google simply lacked the ability to link it to the searcher's question.

In some cases it appears that BERT is simply doing a better job of understanding the importance of the keywords within a search. Here’s an example:

Previously, Google saw the word ‘curb’ as the important part of the search and so focused the results on that word. With BERT, Google can better understand the meaning of the whole sentence and therefore which words actually matter, which stops it from focusing on the wrong keywords.
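
As a rough illustration of that idea, the sketch below again uses the open-source bert-base-uncased model (an assumption for demonstration, not Google's live system) and mean-pools BERT's token vectors into a single vector for the whole query. Removing the word ‘no’ from ‘parking on a hill with no curb’ shifts that vector, whereas a simple keyword match on ‘curb’ would treat the two queries as identical.

```python
# Rough sketch: whole-query vectors from the open-source BERT model,
# showing that small function words like "no" change what the model sees.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def query_vector(text):
    """Mean-pool BERT's token embeddings into one vector for the whole query."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

no_curb = query_vector("parking on a hill with no curb")
with_curb = query_vector("parking on a hill with a curb")

# High but not 1.0: the queries share keywords, yet the model still
# registers that one of them is about the absence of a curb.
print(torch.cosine_similarity(no_curb, with_curb, dim=0).item())
```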

You can find out more about BERT in Google's announcement:
https://www.blog.google/products/search/search-language-understanding-bert/

You can find much more in-depth technical information in the open-sourcing announcement for BERT:
https://ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html

Expect to see the BERT update rolling out over the next week. We'll be keeping a close eye on it as it does, and there will definitely be more on this to come.


Update: Article edited and updated for accuracy on 30 October 2019
