Google recently announced the most significant change to its search algorithm in five years: the Google BERT update, its biggest change since the release of RankBrain. Google stated that the update would impact 10% of searches.
BERT started rolling out last week (announced on 24th October 2019) and will be fully live shortly. It is rolling out for English queries in the US first and will expand to other languages and countries over time. For featured snippets, Google said BERT is already being used globally, in all languages where the feature is available.
What you need to know:
- What is BERT?
- How BERT Works
- How BERT Affects RankBrain
- How BERT Affects You
- How to Optimise for BERT
What is Google BERT?
Google BERT is the term being used for a new Google algorithm update based on a new language representation model called BERT (Bidirectional Encoder Representations from Transformers). BERT is designed to pre-train deep bidirectional representations from unlabelled text by jointly conditioning on both left and right context in all layers.
Want to skip the Googledygook and go straight to how it affects you? Jump ahead to the "How Google BERT Affects You" section below.
Let’s break it down. BERT is conceptually simple but empirically powerful. Previous techniques restricted pre-trained representations by being unidirectional: OpenAI GPT, for example, uses a left-to-right architecture, so each word can only draw on the words that come before it. This restricts what the model can learn. BERT solves this by incorporating context from both directions.
What does that mean?
Essentially, BERT is better able to understand the context of user search queries. By being bidirectional, BERT can better understand longer-tail and more conversational queries. This is especially true for search queries where prepositions such as “for” or “to” change the meaning. Unlike directional models, which read text sequentially, BERT reads the entire text at once. This means it can learn the context of a word from all of its surroundings (both left and right of the word).
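To make the bidirectional idea concrete, here’s a minimal sketch using the open-source Hugging Face transformers library and the publicly released bert-base-uncased model. This is just an illustration, not Google’s production search system: it shows that BERT can only fill in a masked word by reading the words on both sides of the gap.

```python
# A minimal sketch of bidirectional context, using the open-source
# Hugging Face "transformers" library (pip install transformers torch).
# This is NOT Google's production system; it only illustrates the idea.
from transformers import pipeline

# bert-base-uncased is the publicly released BERT model
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT must use the words on BOTH sides of [MASK] to fill the gap:
# "deposit my paycheck" appears AFTER the blank, yet it's what makes
# "bank" far more likely here than "park" or "shop".
for prediction in fill_mask("I went to the [MASK] to deposit my paycheck."):
    print(prediction["token_str"], round(prediction["score"], 3))

# A strictly left-to-right model reading only "I went to the ..." has
# no way of knowing what comes next, so it can't use that context.
```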
Some of the BERT models used to understand queries are so complex that, for the first time, Google is serving search results using its Cloud TPUs, processors purpose-built for machine learning in the cloud.
“We’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search.”
– Google Search Vice President, Pandu Nayak.
Learn more about BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
How Google BERT Works
Since Google BERT works by understanding search queries bidirectionally, Google is now better able to understand the nuance and context of words in searches and match those queries with more relevant results.
“Well by applying BERT models to both rankings and featured snippets in search, we’re able to do a much better job helping you find useful information.”
– Google Search Vice President, Pandu Nayak.
So how does Google BERT work? Google provided a few examples:
Example 1
For a search query such as “2019 brazil traveller to usa need a visa”, the word “to” is critical to the results the user is seeking. Before BERT, Google would not have understood the importance of this word and would have displayed results for US citizens travelling to Brazil.
“With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query,” Google explained.
Example 2
In another example Google gave, a search for “do estheticians stand a lot at work” would have returned results with terms around “stand-alone”. Thanks to BERT, Google can now understand that “stand” is related to the concept of physical demands of a job and display more relevant results.
Example 3
For the query “parking on a hill with no curb”, Google now returns a more relevant featured snippet. Prior to BERT, this query would have returned results for parking on a hill with a curb.
Google said, “We placed too much importance on the word “curb” and ignored the word “no”, not understanding how critical that word was to appropriately responding to this query.”
Example 4
In a briefing with journalists, Google gave the example query: “Can you get medicine for someone pharmacy?” According to VP of Search Pandu Nayak, the old Google search would have treated the sentence as a “bag of words”, placing importance on the words “medicine” and “pharmacy” and returning local results.
Now, Google understands that this is a question about whether you can pick up someone else’s prescription.
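If you want to see the “bag of words” problem for yourself, here’s a minimal sketch, again using the open-source Hugging Face transformers library purely as an illustration (this is not Google’s ranking pipeline). The two queries below contain exactly the same words, so a bag-of-words model treats them as identical, while BERT’s contextual vectors register that the word “to” reverses who is travelling where.

```python
# A minimal sketch: same bag of words, different contextual meaning.
# Uses the open-source bert-base-uncased model as an illustration only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text):
    # Mean-pool the last hidden states into a single sentence vector.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze()

# Identical words, opposite meanings: a bag-of-words model sees no
# difference at all between these two queries.
a = embed("2019 brazil traveler to usa need a visa")
b = embed("2019 usa traveler to brazil need a visa")

# High, but below 1.0: the contextual vectors differ because "to"
# changes who is travelling where.
similarity = torch.cosine_similarity(a, b, dim=0).item()
print(f"cosine similarity: {similarity:.3f}")
```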
How BERT Affects RankBrain
So how does BERT affect RankBrain? Simply put, it doesn’t replace it. RankBrain looks at both queries and the content of web pages to better understand the meaning and relationships of words. BERT adds another method for understanding language, and the two will work together to create a more relevant search experience for users.
How Google BERT Affects You
Google BERT is not here to penalise you or undo your SEO efforts. It simply improves how Google understands search queries and intent. What does this mean for you? Organic traffic is going to be more relevant than ever. If you do notice a drop in traffic, it probably won’t coincide with a drop in conversions: the visitors you lost were likely never part of your target audience in the first place.
If your site already has great content, then BERT might actually help you. For example, if you run an eCommerce site, the unique details on product pages could now drive more organic search traffic with high purchase intent. On the other hand, BERT will not do any favours for websites with poorly-written content or content that tries to game Google.
How to Check if BERT Affected You
Most likely, you won’t notice any big changes. As BERT mainly affects longer, more conversational queries, rank-tracking tools, which primarily track shorter “head” queries, won’t register much of a change. Site owners also often don’t track many long-tail queries, so it’s unlikely you’ll notice much of a gain or drop in traffic caused by the rollout.
FYI, I wouldn’t expect to see much impact of BERT in MozCast — the daily tracking set is mostly shorter phrases and so-called head terms. They’re not the kind of phrases that are likely to require NLP at the level BERT acts (from my own, limited understanding).
— Dr. Pete Meyers (@dr_pete) October 25, 2019
Virtually all tools showed a similar level of fluctuation in the past week, minor in comparison to core search algorithm updates like Panda and Penguin.
While it will be difficult to track fluctuations via third-party tools, site owners should check their analytics and Search Console data to identify fluctuations in total impressions and clicks. We recommend checking your data now for any changes in traffic, making sure to account for seasonality and other industry trends.
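If you’d like to pull that data programmatically, here’s a minimal sketch using the Google Search Console API via the google-api-python-client library. It assumes you have already set up OAuth credentials and a verified property; creds, the site URL, and the dates below are placeholders for your own values.

```python
# A minimal sketch using the Google Search Console API
# (pip install google-api-python-client). Assumes "creds" holds
# OAuth credentials for a verified property; the site URL and
# dates are placeholders.
from googleapiclient.discovery import build

def daily_totals(creds, site_url, start_date, end_date):
    """Print daily clicks and impressions across a date range."""
    service = build("searchconsole", "v1", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": start_date,  # e.g. "2019-10-14" (pre-rollout)
            "endDate": end_date,      # e.g. "2019-11-04" (post-rollout)
            "dimensions": ["date"],
        },
    ).execute()
    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"])
```

Comparing the pre- and post-rollout windows (while allowing for weekday and seasonal patterns) will tell you far more than any third-party fluctuation tracker.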
How to Optimise for Google BERT
Most SEOs will tell you that you can’t optimise for BERT, but that’s not the whole truth. It’s not new advice, but you can “optimise” for BERT by providing users with highly relevant, well-written content. As always, optimising for BERT really means optimising for your audience, NOT bots.
Hopefully, you’re already writing expert content for your audience instead of trying to play the algorithm. If you’re not, start now. Google’s algorithm is only going to get better at understanding human language, so both present and future success rely on having a website optimised for people, not bots.
There’s nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.
— Danny Sullivan (@dannysullivan) October 28, 2019
We recommend continuing to answer user questions and give your audience exactly what they’re looking for. That’s not only how to optimise for Google BERT but also how to make more conversions and dollars.