Google has rolled out another update, this time named BERT. According to Google, BERT helps the search engine understand users’ search queries better; by the company’s own estimate, it improves the interpretation of about 10 percent of English-language searches.
As early as October 23, the SEO community on Twitter reported a ranking and algorithm update on Google. The changes may have been related to BERT.
Examine saw a nice uptick during the September core update, but that paled in comparison to where they were. But starting yesterday, they absolutely surged with many top 10 rankings returning. I have to dig in a bit more to see the full impact, but search visibility is spiking. pic.twitter.com/NnEqsKDOXF
— Glenn Gabe (@glenngabe) October 24, 2019
What is BERT?
BERT is an abbreviation for Bidirectional Encoder Representations from Transformers, a model that processes a sentence as a whole and captures the relationships between all of its words. With it, Google aims to understand natural language better by taking word dependencies and search intent into account.
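The “bidirectional” part can be illustrated with a deliberately tiny sketch. Everything here is made up for illustration: the candidate words and their hand-written associations stand in for BERT’s learned neural network, which works very differently. The point is only that a model reading left-to-right misses clues that appear after a word, while a bidirectional model sees both sides.

```python
# Toy illustration of left-to-right vs. bidirectional context.
# The "model" is just a lookup of hand-made word associations --
# a placeholder for BERT's learned representations, not a real predictor.

# Candidate fillers for a masked word, with context words they fit.
candidates = {
    "bank": {"deposit", "money", "river"},
    "bench": {"sat", "park"},
}

def predict(left_context, right_context, bidirectional):
    """Score each candidate by how many context words it is associated with."""
    context = set(left_context)
    if bidirectional:
        context |= set(right_context)  # BERT-style: also look at what follows
    scores = {word: len(assoc & context) for word, assoc in candidates.items()}
    return max(scores, key=scores.get)

# Sentence: "she sat by the [MASK] to deposit her money"
left = ["she", "sat", "by", "the"]
right = ["to", "deposit", "her", "money"]

print(predict(left, right, bidirectional=False))  # "bench" -- left side misleads
print(predict(left, right, bidirectional=True))   # "bank"  -- right side disambiguates
```

Reading only the left context, “bench” looks like the better fit; once the words after the gap are visible, “bank” wins. That, in caricature, is the advantage of bidirectional encoding.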
What is the difference between BERT and RankBrain?
Google launched RankBrain in 2015. The algorithm uses machine learning to understand users’ search queries better. This is necessary because a significant share of the queries Google receives have never been searched before; Google estimates this share at around 15 percent.
Like the earlier Hummingbird algorithm, RankBrain is built around understanding human language. The same applies to BERT.
BERT will probably not replace RankBrain but complement it. The question is not either-or: the two will most likely work side by side.
How does BERT work?
As shown in the example above, in the search query “2019 Brazil traveler to the USA need a visa,” the word “to” is highly relevant. Previously, Google interpreted the search as if an American wanted to travel to Brazil. Now Google understands the dependencies between all the words and answers the user’s actual question, which is about entering the US.
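Why a small word like “to” matters can be sketched with a toy comparison. The matching logic below is illustrative only, not how Google actually ranks: a bag-of-words view that discards stop words cannot tell who is traveling where, while a view that keeps word order and the preposition preserves the direction.

```python
# Toy sketch: why dropping a word like "to" loses the query's meaning.
# A bag-of-words model treats both queries as identical word sets,
# while an order-aware view keeps the direction of travel.

def bag_of_words(query):
    # Drop word order and "stop words" such as "to" -- the old failure mode.
    stop_words = {"to", "a"}
    return frozenset(w for w in query.lower().split() if w not in stop_words)

def travel_direction(query):
    # Keep word order: "<origin> traveler to <destination>".
    words = query.lower().split()
    origin = words[words.index("traveler") - 1]
    destination = words[words.index("to") + 1]
    return (origin, destination)

q1 = "brazil traveler to usa need a visa"
q2 = "usa traveler to brazil need a visa"

print(bag_of_words(q1) == bag_of_words(q2))  # True: both queries look identical
print(travel_direction(q1))                  # ('brazil', 'usa')
print(travel_direction(q2))                  # ('usa', 'brazil')
```

To a bag-of-words model the two queries are indistinguishable, which is exactly the mistake the old interpretation made; keeping the dependency on “to” recovers who is traveling to which country.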
Impact on SEO
Like RankBrain, BERT cannot be optimized for directly, since it works on interpreting search queries in order to deliver more relevant results. Content should continue to be written for users: natural, unique, and easy to understand.
Does BERT affect voice search?
Because BERT is aimed primarily at more complex search queries, the algorithm should be particularly important for voice search. Such queries are usually more complex than typed searches and are frequently phrased as complete sentences, which was likely another reason for Google to develop BERT. Google Assistant, Google’s digital assistant, will undoubtedly benefit from BERT.
- BERT does not replace RankBrain; it complements it.
- BERT currently only works in English but will be extended to other languages in the future.
- BERT tries to understand word combinations in context and deliver more relevant results.