This week, marketers saw large fluctuations in organic rankings as Google quietly rolled out one of the largest updates to its search algorithm in years.
Google is calling this update "BERT," and like many marketers, we wanted to better understand what it means for our clients.
What is BERT?
BERT stands for Bidirectional Encoder Representations from Transformers, a natural language processing (NLP) technique.
How does BERT operate?
BERT helps Google better understand the context of words in a query by looking at the words that come both before and after them, making stronger connections between entities and actions.
Imagine, for instance, that you type a conversational query containing elements Google would interpret as vague. For example:
“Where can I find my furnace?”
What the user actually wants to know is “Where should my furnace be located in my home?”
We see this kind of vague search all the time in voice search, where users say "it" or "them" when referring to a specific entity.
Now imagine if Google could better understand the relationship between that pronoun and the entity it refers to, using technology like the Google Cloud Natural Language API.
Let’s test this with a query like:
“What is a good SEER rating for my home?”
Previously, Google may have made a connection between "SEER" and "home," but it may have missed that SEER ratings are directly tied to HVAC systems.
By making this connection, BERT recognizes the relationships between entities, making a relevant result much more likely to appear.
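To make the idea concrete, here is a deliberately simplified sketch (not actual BERT, which uses a deep neural network) of why looking at context on *both* sides of an ambiguous word helps. The cue word "rating" comes *after* "SEER," so a model reading only left-to-right would not have seen it yet; a bidirectional model can use it. The sense labels and cue-word sets below are made up for illustration.

```python
# Toy illustration of bidirectional context (NOT how BERT actually works).
# We pick the sense of an ambiguous word by overlapping cue words with
# the full context -- words on BOTH sides of the target.

def disambiguate(tokens, target, senses):
    """Return the sense whose cue words overlap most with the context."""
    i = tokens.index(target)
    context = set(tokens[:i] + tokens[i + 1:])  # left AND right context
    return max(senses, key=lambda sense: len(context & senses[sense]))

# Hypothetical senses for the ambiguous word "seer":
senses = {
    "hvac_efficiency": {"rating", "hvac", "air", "conditioner", "efficiency"},
    "prophet": {"oracle", "visionary", "prophecy"},
}

query = "what is a good seer rating for my home".split()
print(disambiguate(query, "seer", senses))  # "hvac_efficiency"
```

Here the deciding cue ("rating") sits to the right of "seer," which is exactly the kind of signal a bidirectional model can exploit and a purely left-to-right one cannot.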
Another sample query we can test here is “Average alarm response time.”
For a security business, this is a big one. Imagine if Google could differentiate between a user searching in the context of a commercial alarm versus a residential alarm.
Google knows this query is related to security, but what if Google also knew it could be answered with something like "from the time the alarm sounds to the dispatch call, the average security company should be able to respond within XX minutes" — even though "security company" never appears in the search?
How will it impact local search?
- As of this post, BERT applies only to US English queries.
- BERT will impact featured snippets across multiple languages.
- BERT is similar to Google's RankBrain algorithm in that it is machine-learning based. It will continue to evolve, so the early results of this rollout likely won't mirror what we see in the future.
- BERT is neural-network based, so optimizing for it directly isn't really possible. Instead, ask: "Are we optimized for humans?"
- Overall, this update allows Google to better understand the context of words within searches and to make stronger connections between entities and actions.
What Rocket expects in the coming months
This advancement places a greater emphasis on brands and their ability to produce relevant, valuable content. Because our approach has always been to produce timely, informative content that serves users, it won't really change based on this update.
If you are worried this update has impacted your strategy, do not hesitate to contact us.
For more resources on BERT, check out these trusted publications: