In late October 2019, Google announced the use of a new open-source neural network for natural language processing called BERT. It is an update to Google's algorithm that entered a testing phase last year. In the words of Pandu Nayak, vice president of Google Search, it is the "greatest advance of the last five years" and one "of the greatest in the history of Google Search".
As Nayak explains in a blog post, 15% of queries are being made for the first time and have never been seen before. The search engine has therefore developed ways to return results for queries it cannot anticipate. And although users believe the search engine understands the keyword strings they type, "sometimes we still don't quite get it right."
How does Google Search handle queries in which the words do not combine well because the user does not know exactly how to phrase the query? Bearing in mind, as noted, that of the billions of daily queries, 15% are being made for the first time.
It does so with BERT. Until December 9, this advance was live only in the English-language version of Google Search in the United States. As of December 9, it has rolled out to more than 70 countries, as announced on the official Twitter account @searchliaison.
Nayak states that Google Search has achieved "significant improvements" in languages such as Korean, Hindi and Portuguese, because understanding language and the nuances of queries "remains a continuous challenge and keeps us motivated to keep improving the search engine."
BERT's objective and Google's algorithm: understanding the intent behind queries
As Nayak explains, the essence of Google Search is to understand the language. That is, the job of the Google Search department is to “decipher what you are looking for and find useful information on the web, no matter how you type it or combine the words in your query.”
Google's goal is to provide "useful" information in response to user queries, and for users to be able to write those queries in the most natural way possible. That is not always easy, especially with long, conversational queries, or ones in which prepositions such as "for" or "to" have many possible meanings. For these cases, BERT and the Google algorithm help the search engine "understand the context of the words in the query". But these models are not infallible.
"No matter what you are looking for, or what language you speak, we hope you can let go of some of your keyword-ese and search in a way that feels natural to you. But you will still stump Google from time to time. Even with BERT, we don't always get things right," says Nayak.
Examples of how the Google algorithm update understands query intent and context
Pandu Nayak, vice president of Google Search, gives some examples of how the English-language search engine returns results by trying to understand the intent of the query. Here is one of them:
Query: "2019 brazil traveler to usa need a visa"
Take the search "2019 brazil traveler to usa need a visa". The word "to" and its relationship with the other words in this query are "particularly important for understanding the meaning," says Nayak. This is about a Brazilian traveling to the United States, and not the other way around. Previously, "our algorithms could not have understood the importance of this connection, and we returned results about U.S. citizens who wanted to travel to Brazil," he explains. However, he adds, "with BERT, the Google search engine is able to understand the nuance and know that the very common word 'to' is in fact very important here, and we can return a much more relevant result for this query."
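A toy sketch (not Google's actual algorithm) makes the problem concrete: a purely keyword-based matcher reduces a query to an unordered bag of words, so the two opposite travel directions look identical. The word order, which a contextual model like BERT can exploit, is what carries the meaning.

```python
def bag_of_words(query: str) -> set:
    """Reduce a query to an unordered set of keywords,
    the way a naive keyword matcher would."""
    return set(query.lower().split())

q1 = "brazil traveler to usa need a visa"
q2 = "usa traveler to brazil need a visa"

# The keyword sets are identical, so a keyword-only matcher
# cannot tell who is traveling where.
print(bag_of_words(q1) == bag_of_words(q2))  # True

# The word sequences differ: the position of "to" relative to
# "brazil" and "usa" is the signal that encodes the direction.
print(q1.split() == q2.split())  # False
```

This is why, as Nayak notes, a common function word like "to" can be decisive: once order and context are discarded, the distinction disappears entirely.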
This change in Google's machine-learning algorithm, which is directly related to natural language processing, seeks to understand what words mean in a sentence while taking into account all the nuances of the context. As a result, keyword searches tend to become more refined and fully focused on returning the best results for what the user is looking for. Thus, where a root keyword used to suffice, it is now the contextual keyword that ranks better in Google's first results. With BERT, ranking will be based on the whole context, and content must match that context if it is to appear in the results.
What does this change mean for business?
For companies, it represents a significant challenge in staying well positioned and relevant in the search engine, since many of the formulas that were used in SEO must give way to an approach more oriented toward content and the technical side.
The opportunity lies in taking advantage of this change and starting organic-positioning work that makes a real difference for the business and its web traffic, keeping that traffic stable or growing it significantly over time.