Google BERT Update – What it Means
Google announced what it calls its most important update in five years. The BERT update affects roughly 10% of search queries. What is BERT, and how will it impact SEO?
BERT is a Major Google Update
According to Google, this update affects complicated search queries that depend on context.
This is what Google said:
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
What is the BERT Algorithm?
Search algorithm patent expert Bill Slawski (@bill_slawski of @GoFishDigital) described BERT like this:
“Bert is a natural language processing pre-training approach that can be used on a large body of text. It handles tasks such as entity recognition, part of speech tagging, and question-answering among other natural language processes. Bert helps Google understand natural language text from the Web.
Google has open sourced this technology, and others have created variations of BERT.”
The BERT algorithm (Bidirectional Encoder Representations from Transformers) is a deep learning technique for natural language processing. It helps a machine understand what the words in a sentence mean, with all the nuances of context.
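The "bidirectional" part is the key idea: unlike a left-to-right model, BERT can use words on both sides of a term to work out its meaning. The toy sketch below (not BERT itself, just an illustration with hypothetical names like `disambiguate` and a hand-made cue list) shows why context to the *right* of a word can be decisive:

```python
# Toy illustration (NOT BERT): why bidirectional context matters.
# A left-to-right model only sees words before the target word;
# a bidirectional model also sees the words after it.

def disambiguate(tokens, index, sense_cues, window=3, bidirectional=True):
    """Pick a sense for tokens[index] by counting cue words
    found in the surrounding context window."""
    left = tokens[max(0, index - window):index]
    right = tokens[index + 1:index + 1 + window] if bidirectional else []
    context = set(left + right)
    scores = {sense: len(context & set(cues))
              for sense, cues in sense_cues.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "ambiguous"

# Hypothetical cue lists for two senses of the word "bank".
cues = {
    "riverbank": {"river", "water", "shore"},
    "finance": {"money", "deposit", "loan"},
}

sentence = "she sat by the bank of the river".split()
i = sentence.index("bank")

# The deciding cue ("river") comes AFTER "bank", so only the
# bidirectional pass can use it.
print(disambiguate(sentence, i, cues, bidirectional=True))   # riverbank
print(disambiguate(sentence, i, cues, bidirectional=False))  # ambiguous
```

Real BERT learns these contextual relationships from huge text corpora rather than from hand-written cue lists, but the direction of the lookup is the same intuition.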
BERT And On Page SEO
I asked search algorithm expert Dawn Anderson (@dawnieando on Twitter) what that means for SEOs, and she responded that it won't help websites that are poorly written.
According to Dawn:
“BERT and family improve the state of the art on 11 natural language processing tasks. Even beating human understanding since linguists will argue for hours over the part of speech a single word is.
But what if the focus of a page is very weak? Even humans sometimes will be like “what’s your point?” when we hear something.
And pronouns have been very problematic historically but BERT helps with this quite a bit. Context is improved because of the bi-directional nature of BERT.
There will still be lots of work for us to do since we need to emphasise importance, utilise clear structures, help to turn unstructured data into semi structured data, utilise cues on content light pages (e.g. image heavy but not text heavy eCommerce pages) using such things as internal linking.”
Read more: https://www.searchenginejournal.com/google-bert-update/332161/