Being a broad core update, the January 2020 core update will impact all search results on a worldwide scale. It’s not an update that targets something specific that webmasters can improve upon, like the “Speed Update.”

Google’s guidance regarding this update remains the same as previous core updates. Google points to this blog post specifically.

When explaining how core updates work, my favorite analogy to reference is this one from Google:

“One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before.”

Let’s break down this analogy. If you ranked your top films a couple of years ago, you may have had Goodfellas in the number one slot. For argument’s sake, let’s say The Irishman, which came out last year, was a better film, and it now holds the number one position. Goodfellas has dropped in the rankings, but that doesn’t mean it’s a bad film.


As Google says: “The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them.”

With that in mind, there’s nothing necessarily wrong with pages that drop in rankings following a core update. They’re just being reassessed against content that has been published since the last update, or content that was previously overlooked.

Widely noticeable effects, including drops or gains in search rankings, are to be expected, so paying attention to your rankings in the days and weeks to come is paramount.

If your rankings drop, look at what is now ranking ahead of your content and consider how you can provide an even more comprehensive solution for searchers.

Google’s John Mueller answered a question about what to do to optimize for the BERT algorithm. Google has maintained that there is “nothing to optimize for.” Mueller gave a more nuanced answer that offered a little more information to publishers who are concerned about BERT.

BERT Algorithm

The BERT algorithm is a way to understand text. So rather than analyzing text in the sense of matching keywords, BERT helps Google understand the topics and concepts behind sentences, paragraphs and search queries.

So it’s kind of like the difference between matching words and understanding words.
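To make that difference concrete, here’s a toy sketch in Python. This is not how Google or BERT actually works; the tiny hand-made “word vectors” below are invented purely to illustrate the idea that literal keyword matching treats “car” and “automobile” as unrelated, while a vector-based (semantic) comparison sees them as close.

```python
import math

# Invented toy vectors for illustration only -- real systems learn
# these from huge text corpora; synonyms end up pointing the same way.
EMBEDDINGS = {
    "car":        [0.90, 0.10],
    "automobile": [0.88, 0.15],
    "banana":     [0.10, 0.95],
}

def keyword_match(query: str, page_word: str) -> bool:
    """Literal string matching: 'car' does not match 'automobile'."""
    return query == page_word

def semantic_similarity(a: str, b: str) -> float:
    """Cosine similarity between the toy vectors (1.0 = same direction)."""
    va, vb = EMBEDDINGS[a], EMBEDDINGS[b]
    dot = sum(x * y for x, y in zip(va, vb))
    norm_a = math.sqrt(sum(x * x for x in va))
    norm_b = math.sqrt(sum(x * x for x in vb))
    return dot / (norm_a * norm_b)

print(keyword_match("car", "automobile"))                  # False
print(round(semantic_similarity("car", "automobile"), 2))  # close to 1.0
print(round(semantic_similarity("car", "banana"), 2))      # much lower
```

The keyword matcher says “car” and “automobile” are unrelated; the similarity function sees they’re nearly the same concept. That, very loosely, is the shift from matching words to understanding them.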

Google’s blog post explanation, Understanding Searches Better than Ever Before, arguably falls short for publishers and those in the search community. Google’s post was written as an explanation to users of Google’s search engine and not so much for publishers.

How to Optimize for BERT?

Danny Sullivan suggested to the SEO community that there was nothing to optimize for other than to write content for users.

Google’s John Mueller expanded on Danny’s advice. He doesn’t contradict Danny but rather gives more details.

The question posed to John Mueller was:

Will you tell me about the Google BERT Update? Which types of work can I do on SEO according to the BERT algorithms?

Mueller began his explanation by describing the purpose of the BERT algorithm:

I would primarily recommend taking a look at the blog post that we did around this particular change.

In particular, what we’re trying to do with these changes is to better understand text.

Which on the one hand means better understanding the questions or the queries that people send us.

And on the other hand better understanding the text on a page.

The queries are not really something that you can influence that much as an SEO.

John then offered his explanation on what a publisher can do with text content:

The text on the page is something that you can influence. Our recommendation there is essentially to write naturally.

So it seems kind of obvious but a lot of these algorithms try to understand natural text and they try to better understand like what topics is this page about.

What special attributes do we need to watch out for and that would allow us to better match the query that someone is asking us with your specific page.

So if there’s anything that you can do to kind of optimize for BERT, it’s essentially to make sure that your pages have natural text on them…

Mueller then begins to say something but stops before completing his sentence:

“…and that they’re not written in a way that…”

Mueller stops for a second, takes a breath, then appears not to finish the above thought. He then begins a new sentence:

“Kind of like a normal human would be able to understand.  So instead of stuffing keywords as much as possible, kind of write naturally.”
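If you want a rough way to sanity-check your own copy for keyword stuffing, a simple density calculation can help. To be clear, this is a hypothetical heuristic for self-auditing, not a metric Google has published or a threshold it uses:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

natural = "We repair shoes by hand, using leather sourced from local tanneries."
stuffed = "Shoes shoes best shoes cheap shoes buy shoes online shoes."

print(f"{keyword_density(natural, 'shoes'):.0%}")  # a single-digit percentage
print(f"{keyword_density(stuffed, 'shoes'):.0%}")  # more than half the words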

Relevance Signals

I saw a Facebook post the other day where someone shared his opinion that recovering from algorithm updates often comes down to fixing issues with authority and relevance. I agree with that observation 100% because that’s been my experience for the past several years.

Most of the major algorithm updates where Google tells us what changed revolve around relevance. BERT, RankBrain and neural matching were all about understanding language, and that is all about relevance. They all touch on understanding what text means.

Yet according to Google there’s nothing to optimize for other than writing naturally. One takeaway that I find interesting is that John Mueller cautioned against keyword stuffing.

I know some will respond to Mueller’s advice with smug eye rolls as they reach for another keyword to SEO with.

But I’ve been doing SEO for twenty years, and in my opinion what Mueller said has some deeper implications, particularly for those who have lost rankings in recent updates.

In my work providing solutions for clients that lost rankings, relevance is a common issue. It becomes an issue because so many publishers focus too hard on keywords and not enough on relevance signals.