Google BERT Algorithm Update Trying to Improve Search Quality by 10%
As of this writing, Google has been rolling out the BERT algorithm update, which it says could change up to 10% of result rankings. BERT is based on natural language understanding. Pandu Nayak, a Google Fellow and Vice President of Search, said in a recent article, “At its core, a search result is about understanding language. It’s our job to figure out what you are searching for and surface helpful information from the web.”
BERT Algorithm Update – Better Understanding of Sentence and Query Meaning
Google claims that this update will improve the quality of search results by understanding how the words in a sentence relate to each other. Pandu also stated, “last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system.”
Old Google Versus the New Algorithm Update
Before Google’s BERT Algorithm Update, the search algorithm treated sentences as a jumble of words, looking at the essential terms such as “tool” or “mechanic” and returning SERP results. The new BERT Algorithm Update allows Google to better understand the context of words in any sentence or query. Pandu states, “by applying BERT models to both rankings and featured snippets in search, we’re able to do a much better job helping you find useful information.” Pandu also states, “In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.”
What is a BERT Update?
BERT stands for Bidirectional Encoder Representations from Transformers. BERT looks at all of the words in a sentence and interprets their meaning. By doing this, it can determine the meaning of the words that make up the sentence and decide whether individual words are critical to the overall purpose of the sentence or query.
BERT learns which words to pay attention to through self-supervised training. Google took millions of English sentences and randomly removed some of the words, then had BERT figure out what the missing words might be. This training has turned out to be very useful in teaching Google’s natural language models how to understand the context of a sentence.
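The masking procedure described above can be sketched in a few lines of Python. The 15% default masking rate and the `[MASK]` token mirror the published BERT pretraining setup; the function name and structure here are illustrative, not Google’s actual code.

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_rate=0.15, seed=None):
    """Randomly hide a fraction of tokens behind [MASK],
    returning the masked sequence plus the hidden answers
    the model would be trained to recover."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(MASK_TOKEN)
            targets[i] = tok  # the training objective: predict this word
        else:
            masked.append(tok)
    return masked, targets

sentence = "google removed some words and had bert guess them".split()
masked, targets = mask_tokens(sentence, mask_rate=0.3, seed=42)
print(masked)   # sentence with some positions replaced by [MASK]
print(targets)  # {position: original word} pairs the model must predict
```

During real pretraining, the model’s predictions for each masked position are compared against these targets, which is what forces it to use the surrounding context on both sides of the gap.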
Has Google Tested This Change Enough?
Whether their testing is sufficient is hard to know; only time will tell. This is a major change, but according to Google, they have run extensive testing to ensure they are improving results rather than degrading them. Hundreds of human raters scored search result quality, and Google also ran live A/B tests.
BERT Should Not Affect Many Queries, According to Google
BERT will not affect many queries, and it’s the latest tool Google is using to better rank search results. As everyone knows, how it all works together is never fully disclosed to the public. Google is intentional about this to keep people from gaming the system. In any case, when computers use machine learning to make decisions, understanding those choices can be nearly impossible.
As an example, taken from Pandu’s article, consider a search for “2019 brazil traveler to usa need a visa.” Pandu says, “The word ‘to’ and its relationship to the other words in the query are particularly important to understanding the meaning.” He further states, “It’s about a Brazilian traveling to the US and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection.” BERT can supposedly grasp this nuance of sentence structure and recognize that the word “to” is essential, so Google can improve the quality of the query result.
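The limitation Pandu describes is easy to demonstrate: a bag-of-words view of a query discards word order, so the two opposite travel directions become indistinguishable. A minimal sketch (the queries are simplified from the example above):

```python
from collections import Counter

def bag_of_words(query):
    """Represent a query as word counts only, ignoring word order,
    roughly how a pre-BERT 'jumble of words' matcher might see it."""
    return Counter(query.lower().split())

brazil_to_usa = bag_of_words("brazil traveler to usa need a visa")
usa_to_brazil = bag_of_words("usa traveler to brazil need a visa")

# The two opposite meanings collapse to the same representation.
print(brazil_to_usa == usa_to_brazil)  # prints True
```

A model that reads the whole sentence in context, as BERT does, can tell these two queries apart precisely because it keeps track of how “to” connects “traveler” and “usa.”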
Unknown Effects of the BERT Update
Machine learning is excellent until it is not. If the algorithm misunderstands a query and the results are not helpful, it can be challenging to know why. Furthermore, while Google believes it has ensured BERT will not increase bias, those training the model are themselves prone to bias. Because BERT is trained on millions of sentences, each with the possibility of bias, it is something of which we should be aware.
Google has claimed that it does not believe there will be a significant difference in where they direct traffic. However, when Google makes a change to its search algorithm of this magnitude, everyone involved takes notice. Many companies live and die by Google’s SERP ranking changes.
As web professionals, we should all be aware of the potential impacts of this or any change. We believe this update is one of the most significant changes Google has rolled out in at least the last five years. You can guess that Google’s guidance will be the same as it always is: “If you are following our content guidelines, you should see no negative impact.” That is hard to take at face value when they are claiming to improve results by 10%, since that likely means someone will lose out on the related searches. In the end, however, we hope this truly means that the results returned for a given query are more meaningful.