BERT updates in 2019—first for English, then 72 languages

Last October 25, Google announced this year’s critical update to Search: the application of BERT (short for “Bidirectional Encoder Representations from Transformers”) to search queries. Google has described the change as “the biggest leap forward in the past five years” for Search.

BERT is a natural language processing pre-training approach Google first open-sourced in 2018.

In its latest form, it handles language tasks such as named entity recognition (identifying the key terms in a string or phrase, such as names of people, places, and organizations, and categorizing them), part-of-speech tagging, and question answering, among other natural language processes.
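To make that concrete, here is a minimal sketch of two of those tasks, assuming the open-source Hugging Face transformers library and publicly available BERT fine-tunes (dslim/bert-base-NER and deepset/bert-base-cased-squad2). Google’s production setup is not public, so treat this as an illustration only:

from transformers import pipeline

# Named entity recognition: find the key terms in a phrase and
# categorize them (PER = person, ORG = organization, LOC = location).
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
print(ner("Google announced BERT at its Mountain View headquarters."))

# Question answering: pull the answer to a question out of a passage.
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")
print(qa(question="When was BERT open-sourced?",
         context="BERT is a pre-training approach Google open-sourced in 2018."))

Both of these are ordinary BERT checkpoints fine-tuned on task-specific data, which is exactly the pre-train-then-fine-tune pattern BERT popularized.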

Google is delving deeper and deeper into machine learning for language (the stuff of science fiction, except now it’s real) to produce natural language models such as BERT.

BERT, applied to Search in 2019, brings Google closer to decoding “longer, more conversational queries.” The algorithm learns to “read” and “interpret” a phrase in context by looking at the words both to the left and to the right of each term, including the most significant one (the “entity”), rather than processing words one at a time in order.
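That two-sided reading is the “bidirectional” in BERT’s name. A quick way to see it, again assuming the open-source transformers library rather than anything Google-internal, is masked-word prediction: a BERT masked language model guesses a hidden word from the context on both sides of it.

from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The words to the RIGHT of [MASK] change the prediction, which a
# strictly left-to-right model could not take into account.
for text in ["I deposited the check at the [MASK] this morning.",
             "We fished from the grassy [MASK] of the river."]:
    top = fill(text)[0]  # highest-scoring guess
    print(text, "->", top["token_str"], round(top["score"], 2))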

According to Google, the update affects about 1 in 10 English-language queries in the U.S. It also changes how results are generated, both for organic search and for the featured snippet content displayed above organic search results.

But that’s not all. This week, BERT rolled out to 72 languages (including Tagalog!), according to a tweet from Danny Sullivan, Google’s public Search Liaison.

This means two things for businesses aiming to get ahead of the curve. First, SEO marketing should put more emphasis on position zero, a.k.a. “Rank 0,” the featured snippet that increasingly supplies the answers read aloud by voice search. Second, it should emphasize own-language optimization: content written in the languages your customers actually search in.

We plan to write more about this in the near future. 
