Google BERT is an AI-powered update to Google Search that has big implications for marketers.

BERT is an AI language model that Google now applies to search results and featured snippets. BERT stands for Bidirectional Encoder Representations from Transformers. But you don’t need to understand all the AI behind BERT to understand its impact.

All BERT does is help Google better understand the context around your searches.

It uses sophisticated AI to process every word in a search in relation to all the other words in the sentence. Previously, Google processed words one by one, in order.
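To make "in relation to all the other words" concrete, here's a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint. This is an illustration, not Google's production search stack: it shows that BERT gives the same word a different vector depending on its surrounding context.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Public BERT checkpoint for illustration; Google Search uses its own models.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# The same word, "bank", in two contexts yields two different vectors.
a = word_vector("she sat by the river bank", "bank")
b = word_vector("she deposited cash at the bank", "bank")
print(f"cosine similarity: {torch.cosine_similarity(a, b, dim=0).item():.2f}")
# Prints a value well below 1.0 -- the vectors differ with context.
```

A static, one-word-at-a-time model would assign "bank" the same vector in both sentences; the drop in similarity is the whole point of bidirectional context.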

The difference in results can be dramatic.

Google offers the example of a search like “2019 brazil traveler to usa need a visa.”

In the past, Google would have interpreted this search as a US traveler looking for a visa to Brazil. That’s because Google didn’t account for prepositions and context. In this example, Google would not have taken the word “to” into account. That changes the meaning of the search.

Compare that to BERT. BERT takes the whole sentence into account, including prepositions. In this example, BERT now understands the searcher is a Brazilian looking for a US visa — not the other way around.

A lot of people use natural language to search for information. This language includes plenty of context clues that change search meaning.

Thanks to BERT, Google now returns results that better reflect this context.

Google says BERT will affect 10% of all US searches, so it's a big deal. And the AI that powers BERT also works with non-English languages, so expect its impact to be even bigger over time.

But what is that impact exactly?

How is BERT’s AI-powered understanding going to change your marketing?

We asked three experts from top marketing technology companies for their take.

1. Don’t try to game the update.

“The introduction of BERT is very similar, in terms of implications for marketers, to the release of RankBrain,” says Matthew Howells-Barby, Director of Acquisition at HubSpot.

“Ultimately, BERT will dramatically increase Google’s ability to better understand the context behind queries being made in the search engine, thus allowing them to better serve results that match intent.”

But marketers shouldn’t be looking for ways to game the new update, he says.

“There’s no ‘optimizing for BERT’ in the same way that there was no ‘optimizing for RankBrain’ (despite the many articles claiming otherwise). The result of BERT being introduced has simply meant that Google’s natural language processing engine has reached new heights and you can expect much more granular answering of queries. If you’re already optimizing for intent, you’re in a good place—this is the essence of what our Topic Cluster methodology is all focused on.”

2. Focus on consumer intent.

“BERT is Google’s next iteration in its long-running effort to better map search results to search intent,” says Lemuel Park, CTO at BrightEdge. “It is Google’s neural network-based method for natural language processing (NLP) to understand queries that are more conversational in nature.”

“This is an algorithm change and not an update,” explains Park. “As a marketer this means increasing the specificity and depth of content and working further into the longtail, or queries using more than three words.”

“Longtail keywords present an opportunity for marketers to develop content that has lower volume per page and less competition but necessarily requires more breadth in content development, a strategy and tactic that was built into the BrightEdge platform in 2014.”

His advice for navigating BERT? Focus on consumer intent.

“The best thing you can do as a digital marketer is to maintain and increase your understanding of consumer intent by looking at conversational search themes and keywords, ensuring you have the right content on your website, optimized to address the question or query.”

3. Create comprehensive, topic-rich content.

“The ability of BERT to better understand the intent behind long conversational-style search phrases, and surface relevant content, is a welcome development,” says Jeff Coyle, Co-Founder and Chief Product Officer at MarketMuse.

“This is further evidence that marketers need to shift from keyword-driven articles towards crafting comprehensive and topically-rich content.”

But, Coyle cautions, BERT’s impressive capabilities come at a cost. The way BERT contextualizes sentences demands a lot of computational firepower, which can limit its use in products and services.

“With BERT being a technology that creates ‘contextualized’ vectors, I will have to feed both sentences into the BERT network,” says Coyle. “That means I need to do thousands of FLOPS, as BERT is a deep neural network with many layers and neurons.”

“This capability of contextualizing makes BERT and its related models state-of-the-art on many natural language understanding tasks, but it also makes them compute-intensive and hard to bring into production. That’s why it is powerful for a question or rewrite service but has limitations.”
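Here's a minimal sketch of the cost Coyle is describing, again using the public bert-base-uncased checkpoint from Hugging Face transformers (our illustration, not MarketMuse's or Google's code). Because the vectors are contextualized, each query-passage pair must pass through the full deep network at scoring time; nothing can be precomputed the way static word vectors can.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

query = "2019 brazil traveler to usa need a visa"
candidates = [
    "US visa requirements for Brazilian citizens",
    "Travel guide for Americans visiting Brazil",
]

for passage in candidates:
    # Query and passage are encoded together, so scoring N candidates
    # costs N full forward passes through the entire network.
    inputs = tokenizer(query, passage, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # The [CLS] vector summarizes the pair; a production ranker would put
    # a scoring head on top of it.
    cls_vector = outputs.last_hidden_state[0, 0]
    print(passage, "->", cls_vector.shape)

# Roughly 110M parameters participate in every one of those passes.
print(f"parameters: {sum(p.numel() for p in model.parameters()):,}")
```

That per-pair forward pass through ~110 million parameters is why contextualized models are state-of-the-art yet hard to serve at scale, exactly the trade-off Coyle points to.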