Google Algorithm Update: Hey There, BERT
For most of my childhood, I heard Ernie from Sesame Street greet his best friend Bert with the same phrase day after day: “Hey there, Bert.” When Google named its latest algorithm update BERT, that greeting was the first thing that popped into my head.
Today, we all get to say “Hey there, BERT,” because it’s now an integral part of Google’s overall algorithm, impacting about 1 in 10 search queries. What is it, and how does it impact you?
These Are the Algorithms in Your Neighborhood
Sesame Street was a mix of everything. People and Muppets of all races, colors and sexual orientations worked together to make their neighborhood a better place. The Google algorithm is similar in that it has many parts that work together to get you the best search results.
BERT is one of those new pieces and works to understand the context of searches better. It stands for Bidirectional Encoder Representations from Transformers. Sadly, it has nothing to do with Optimus Prime and the Autobots.
People don’t follow a rigid rule or syntax created by Google when they type search queries; they phrase their keywords conversationally. Because of this, the algorithm doesn’t always fit a keyword to the right context. BERT helps with that.
It’s a neural network that recognizes patterns and uses that information to create better search results.
Google, You’re the One. You Make Searching Lots of Fun.
Originally, Google’s algorithm determined context by reading the words from left to right and placing importance based on the order of words. With this logic, the words at the end of the search query matter less even if they are the most important part of the query.
For example, “cookbook recipes for children” could return results for cookbook recipes that aren’t necessarily for children, or push the children’s results further down the page. This applies to both regular search results and featured snippets. The reason is that the algorithm discounts the importance of “for children” at the end of the query.
BERT is able to read the search query as a whole and discern the context. It would provide results for cookbook recipes designed for children because it understands the context of the query. This is the bidirectional aspect of the algorithm in that it recognizes the words before and after the subject equally.
As such, it’s best used with longer search queries and those that use prepositions such as for, of, to, etc.
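To make the difference concrete, here’s a toy sketch in Python — not Google’s actual ranking code, just an illustration of the idea. The queries, page texts, and weights are all made up for the example; the point is that discounting words at the end of a query can change which page “wins.”

```python
def score(query_tokens, weights, page_text):
    """Sum the weight of each query token that also appears on the page."""
    page_words = set(page_text.split())
    return sum(w for tok, w in zip(query_tokens, weights) if tok in page_words)

query = "cookbook recipes for children".split()

# Old-style, left-to-right weighting: earlier words dominate, tail words fade.
ltr_weights = [1.0, 0.7, 0.4, 0.1]

# Whole-query weighting: every word counts equally, like BERT reading
# the full query in context.
equal_weights = [1.0, 1.0, 1.0, 1.0]

generic_page = "the ultimate cookbook with recipes"   # hypothetical page
kids_page = "easy recipes for children"               # hypothetical page

# Position-weighted scoring favors the generic cookbook page...
assert score(query, ltr_weights, generic_page) > score(query, ltr_weights, kids_page)

# ...while whole-query scoring surfaces the children's page instead.
assert score(query, equal_weights, kids_page) > score(query, equal_weights, generic_page)
```

The generic page matches the “important” head words (cookbook, recipes), so it wins under left-to-right weighting even though the kids’ page matches more of the query overall.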
Hey BERT, How Does This Impact Me?
As I mentioned above, BERT impacts about 10 percent of total searches. It may cause some shift in rankings, but as of November, the impact hasn’t been massive. It’s designed to provide a better understanding of search queries, so if you lose rankings on queries BERT now interprets differently, those queries likely weren’t leading to conversions or desired traffic anyway, because Google had the context wrong.
Instead, you’ll likely see improvements based on better AI understanding of the search query. It will pull you up because it finally recognizes your site as a better match than the sites currently at the top of the search page.
If you want to take advantage of BERT, then your pages need to have a focus. Google continues to hammer home that poorly written content and content-light pages are bad for ranking. BERT is another push in that direction.
It’s more important than ever that websites also have structured data to provide the clearest context on your site. BERT reads the query and can match the structured data with the query for pinpoint accuracy.
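As an illustration, structured data is usually embedded in a page as JSON-LD inside a `<script type="application/ld+json">` tag. The sketch below builds a minimal schema.org Recipe snippet in Python; the recipe name and values are hypothetical, though `Recipe`, `keywords`, `audience`, and `suggestedMinAge` are real schema.org vocabulary.

```python
import json

# Minimal schema.org Recipe markup for a hypothetical children's recipe.
# The "audience" property signals explicitly that the recipe is for kids,
# giving the query "cookbook recipes for children" something to match.
recipe_markup = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Peanut Butter Banana Bites",  # hypothetical recipe
    "keywords": "easy recipes for children",
    "audience": {
        "@type": "PeopleAudience",
        "suggestedMinAge": 4,
    },
}

# This JSON string is what would go inside the ld+json script tag.
print(json.dumps(recipe_markup, indent=2))
```

Markup like this gives a query-understanding model clean, unambiguous context instead of forcing it to infer the audience from body copy alone.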
C Is for Content, That’s Good Enough for Google
Much like Google, Bert and Ernie are still around after all these years and will likely be here for many more. While Bert’s relationship with Ernie might be confusing, there’s no confusing BERT’s relationship with search results.
If you want to learn more about BERT and SEO, then feel free to contact us.