What Is BERT? Does it affect your SEO Strategy?

Krishna Murthy
4 min read · Dec 31, 2020


Google regularly rolls out updates to its search algorithms. BERT was rolled out during the week of October 21, 2019, and the related changes impact Google’s search capabilities in over 70 languages.

[Image: BERT, a Natural Language Processing algorithm]

What is BERT?

BERT (Bidirectional Encoder Representations from Transformers) was introduced in a 2018 paper by Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova of Google. It is a neural network-based technique for Natural Language Processing (NLP) pre-training. Google has leveraged BERT’s capabilities in its recent algorithm updates!

Past algorithmic changes have significantly impacted SEO rankings and forced marketers to keep up with revised guidelines to maintain their Google search rankings. In that sense, BERT may be an exception: it may require little to no change in direction from marketers to stay on top of Google search. Surprised? Read on for more details about BERT and how it may, or may not, affect your SEO strategy!

BERT has created a buzz in the Machine Learning (ML) community by presenting very encouraging results on a range of NLP tasks, including Natural Language Inference and Question Answering.

How is BERT different from previous Natural Language Processing (NLP) algorithms?

The technique employed in BERT is a significant departure from previous efforts. Its transformer encoder reads the entire sequence of words at once, rather than training left to right or with a combination of left-to-right and right-to-left passes.
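To make this concrete, here is a minimal sketch (not from the original article) showing that a pre-trained BERT encoder consumes a whole query in a single pass and produces one contextual vector per token. It assumes the Hugging Face transformers library and PyTorch are installed; the model name bert-base-uncased and the example sentence are only illustrative choices.

```python
# Minimal sketch: a BERT encoder reads the entire token sequence at once and
# returns one contextual vector per token (assumes transformers + PyTorch).
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a whole query; [CLS] and [SEP] tokens are added automatically.
inputs = tokenizer("Parking on a hill with no curb", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per token, each computed from the full sentence.
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 9, 768])
```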

BERT is pre-trained with two strategies: the Masked Language Model (MLM) and Next Sentence Prediction (NSP).

Masked Language Model (MLM)

Masked language modeling is a fill-in-the-blank task, where a model uses the context surrounding a [MASK] token to try to guess what the [MASK] should be.
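As an illustration, here is a small fill-in-the-blank sketch using the Hugging Face transformers library with the pre-trained bert-base-uncased model; the example sentence is my own and not something from Google’s update.

```python
# Masked language modeling as fill-in-the-blank: BERT guesses the [MASK] token
# from the words on both sides of it (assumes the transformers library).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Print the top candidate words for the blank, with their scores.
for prediction in unmasker("The car was parked on a steep [MASK] with no curb."):
    print(prediction["token_str"], round(prediction["score"], 3))
```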

Next Sentence Prediction (NSP)

To learn the relationship between two sentences, BERT’s training process also uses next sentence prediction: the model receives pairs of sentences as input and learns to predict whether the second sentence actually follows the first in the original text.
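Here is a rough sketch of what that looks like with the Hugging Face transformers library; the sentence pair and the model name bert-base-uncased are illustrative assumptions, not part of the original article.

```python
# Next sentence prediction: score whether sentence_b plausibly follows
# sentence_a (assumes transformers + PyTorch).
import torch
from transformers import BertForNextSentencePrediction, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

sentence_a = "She opened the fridge to look for something to eat."
sentence_b = "There was nothing left but a jar of pickles."

encoding = tokenizer(sentence_a, sentence_b, return_tensors="pt")
with torch.no_grad():
    logits = model(**encoding).logits

# Index 0 means "sentence_b follows sentence_a"; index 1 means it is unrelated.
probs = torch.softmax(logits, dim=-1)
print(f"P(is next sentence) = {probs[0, 0].item():.3f}")
```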

What is new with BERT?

Old, conventional language models use only the previous tokens to predict the next token, whereas BERT uses both the previous and the following tokens. “Bidirectional” refers to BERT using both the left and right sides of a token’s context during training.
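A quick way to see the effect of right-hand context is to keep the words before [MASK] fixed and change only the words after it: a bidirectional model can use those later words, while a purely left-to-right model cannot. The snippet below is a small sketch along those lines, again assuming the Hugging Face transformers library and the bert-base-uncased model; the sentences are made up for illustration.

```python
# Only the words to the RIGHT of [MASK] change, so the top prediction
# typically changes too, showing that BERT uses context from both directions.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

sentences = [
    "I took my dog to the [MASK] to play fetch.",
    "I took my dog to the [MASK] to get its vaccinations.",
]
for sentence in sentences:
    top = unmasker(sentence)[0]
    print(sentence, "->", top["token_str"])
```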

Why does BERT matter to SEOs?

Search intent is a significant factor in Google’s search results. BERT is designed to better understand the “intent” behind a user’s query than previous versions of Google’s algorithms. Google has said that 10% of all searches are impacted by the BERT update.

Can you expect different search results from BERT?

Yes, in some cases!

Before BERT, Google search returned the same SERP for “Parking on a hill with no curb” and “Parking on a hill.” The word “no” is an essential part of the search phrase, but the pre-BERT algorithm missed it!

BERT knows grammar! The pre-BERT algorithm had a severe limitation in interpreting prepositions and pronouns: search phrases like “over the moon” and “on the moon” returned the same results.

Let’s say you Google the phrase “2019 Brazil traveler to USA need visa”.

In the pre-BERT search, the U.S. Google results page would return a page about whether American citizens need a visa to go to Brazil.

But that’s not the intent of the query. The word “to” is a significant token indicating “people traveling to the USA,” and pre-BERT algorithms ignored it. With BERT, the top result is the U.S. Embassy in Brazil, which tells Brazilian nationals who want to go “to the USA” how to apply for the relevant visa.

Do you have questions about Google updates and how they affect your Digital Marketing strategy? We will be happy to discuss this. Please set up an appointment today!

Do you need to Optimize for BERT?

“There’s nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.” — Danny Sullivan, Google (@dannysullivan) October 28, 2019

Google emphasizes that it will continue rewarding great content that helps users. So, keep producing useful, high-quality content!

In summary, according to Google, nearly 10% of all searches are impacted by BERT. Long-tail keywords now return more meaningful results compared to the pre-BERT algorithm because BERT can better interpret the search intent. BERT will also impact featured snippets.

