Google BERT: an update for understanding natural language

Written by Merche Martínez

During the week of 21 October 2019, Google announced that it was rolling out a new algorithm update called BERT. This update represents one of the biggest advances in the history of search engines, and the biggest leap forward in the last five years.

What is Google BERT?

The BERT acronym stands for Bidirectional Encoder Representations from Transformers.

This is a neural-network-based technique for pre-training natural language processing (NLP) models. BERT models can interpret the full meaning of a word by analysing the words that come before and after it, which is highly useful for understanding a user's search intent when they type a query. Until now, words were processed in a linear way, one after the other, focussing on each term without paying attention to the context as a whole.
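To make the bidirectional idea concrete, here is a minimal, hypothetical sketch using the open-source Hugging Face transformers library and a public pretrained BERT model (an assumption for illustration only; Google Search does not expose its own models). It asks BERT to fill in a hidden word, and the prediction changes depending on the words that come *after* the blank, which a strictly left-to-right model could not take into account:

```python
# A minimal sketch using the open-source Hugging Face transformers library
# (an assumption for illustration; this is not Google's production system).
from transformers import pipeline

# "fill-mask" asks a pretrained BERT model to guess the hidden [MASK] token.
fill = pipeline("fill-mask", model="bert-base-uncased")

# Only the words AFTER the blank differ between the two sentences, yet the
# prediction changes, because BERT reads context in both directions.
for sentence in [
    "I went to the [MASK] to deposit my paycheck.",
    "I went to the [MASK] to catch some fish.",
]:
    best = fill(sentence)[0]  # highest-scoring candidate
    print(f"{sentence} -> {best['token_str']} ({best['score']:.2f})")

# Expected output (approximate): "bank" for the first sentence and
# "river" or "lake" for the second, driven purely by the right-hand context.
```

This same bidirectional reading is what lets BERT weigh small words, such as prepositions, that older keyword matching tended to skip over.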

Pandu Nayak (Google Fellow and Vice President of Search) explains that traditional hardware was not enough to make this change happen, so Google turned to its latest Cloud TPUs to serve BERT models and retrieve the most relevant information as quickly as possible.

Why is Google BERT useful?

15% of the searches Google receives every day are new, so there is no previous data on whether the displayed results match the user's search intent. Sometimes users don't know exactly which terms to use to find what they're interested in: they want to learn, but they don't know where to start.

For that reason, BERT tries to understand human language, work out what the user is looking for, and show them information they'll find useful. It focusses especially on queries containing prepositions such as "to", "from" and "for", or the negation "no", when these are important to the meaning.

Where will we notice its impact?

In the search results and featured snippets.

Barry Schwartz reported on 25 October that the update affected 10% of English-language queries in the United States. In time it will spread to more languages and countries, although for the time being we don't know when it might arrive in Spain.

With regard to featured snippets, Pandu Nayak says that a BERT model is already being used to improve them in 24 countries. Languages where significant improvements have already been seen include Korean, Hindi and Portuguese.

Google BERT search result examples

To better understand the update's real-world application, Google provides a few before-and-after examples of search results for several queries.

Google BERT update before and after example

The search query "2019 brazil traveler to usa need a visa" has a very clear intent: someone from Brazil wants to know whether they need a visa to travel to the US in 2019. Before the update, Google didn't grasp the importance of the preposition "to" and displayed results about US citizens travelling to Brazil. With BERT, the results for this query are now relevant.

Google BERT update example

For the search query "do estheticians stand a lot at work", up until now the keyword system returned results matching the term "stand-alone" whenever "stand" appeared, even though this isn't the correct association given the context of the question. Google BERT now understands that in this case "stand" relates to the physical demands of the job.
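As a rough illustration of how this kind of disambiguation can work, the hypothetical sketch below (again using the Hugging Face transformers library and a public BERT model, which is an assumption; it is not Google's system) compares the contextual vectors BERT produces for "stand" in different sentences. The same word gets a different representation depending on its neighbours:

```python
# A hypothetical sketch: compare BERT's contextual embeddings of "stand".
# Assumes the Hugging Face transformers and PyTorch libraries are installed.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

query    = embed("do estheticians stand a lot at work", "stand")
physical = embed("nurses stand on their feet all day", "stand")
isolated = embed("the program runs as a stand alone tool", "stand")

cos = torch.nn.functional.cosine_similarity
print("physical sense:  ", cos(query, physical, dim=0).item())
print("stand-alone sense:", cos(query, isolated, dim=0).item())
# One would expect the "physical" similarity to be the higher of the two,
# because BERT encodes "stand" differently depending on its context.
```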

Google BERT before and after example query

In this query, “can you get medicine for someone pharmacy”, BERT understands the importance of “for someone” and displays results accordingly.

Google BERT update before and after maths query

Another good example is the query "math practice book for adults". Before BERT, the top result was an Amazon product page for a maths textbook filed under the young adult book category. Now the algorithm understands the term "adult" and returns a much more useful search result.

Featured snippet example post-Google BERT

Featured snippets post Google BERT update

For "parking on a hill with no curb", up until now the search engine only took the term "curb" into account, disregarding the preceding "no". Google BERT, however, focusses on the context and displays useful results for this query.

What should we do to optimise our website for Google BERT?

There's nothing new we can do, really: just keep writing our website's content with our users in mind. Google's researchers are the ones working hard to make machines understand human language better and better, so that the most relevant result is returned for each query.


Where do keywords stand now?

For some time now, we've seen pages rank for specific keywords without including the exact terms in their content or in prominent areas like their title or description. This doesn't mean that the keywords users enter are no longer taken into account, but rather that we must strive to make our landing pages useful within the context of those queries.

References

  • https://searchengineland.com/welcome-bert-google-artificial-intelligence-for-understanding-search-queries-323976
  • https://www.blog.google/products/search/search-language-understanding-bert/
  • https://medium.com/@lola.com/nlp-vs-nlu-whats-the-difference-d91c06780992
Author: Merche Martínez
SEO consultant at the Human Level online marketing agency. She's an expert in search engine optimisation at both national and international levels. She's also a certified Google AdWords user.
