What You Need To Know About Google BERT
The latest algorithm update is the biggest leap forward in Search.
Google recently made a significant algorithm update, known as Google BERT, to better understand searches and produce results for more natural language queries. The update also feeds natural language and search context into Google’s AI technology. Billions of searches every day help grow Google’s AI capabilities, which improve search results, sharpen voice search understanding, and deepen Google’s understanding of consumer behavior.
Say hello to Google BERT!
BERT is Google’s biggest search algorithm update since the introduction of RankBrain in 2015. In fact, Google states that this update represents “the biggest leap forward in the past five years, and one of the biggest leaps forward in Search.” BERT makes searches more focused by understanding user intent within queries that are more conversationally structured.
Let’s get to know BERT a little better and find out how it can help refine your searches.
What is BERT?
BERT, an artificial intelligence (AI) system, stands for Bidirectional Encoder Representations from Transformers. This search advancement was the result of Google’s research on transformers, which are models that process words in relation to all other words in a sentence, rather than one by one in order. In short, the update focuses on whole phrases instead of individual words.
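The “in relation to all other words” idea is called self-attention. Here is a toy, hypothetical sketch of that mechanism in plain Python — no learned weights and nothing like Google’s production system — just showing that each word’s new representation is a weighted mix of every word’s vector, not only the words that came before it:

```python
import math

def softmax(xs):
    """Turn raw scores into weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(vectors):
    """Each word's output vector is a weighted average over ALL word
    vectors in the sentence (toy version, no query/key/value projections)."""
    out = []
    d = len(vectors[0])
    for q in vectors:
        # Score this word against every word in the sentence, itself included.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in vectors]
        weights = softmax(scores)
        # Mix all word vectors according to those weights.
        mixed = [sum(w * v[i] for w, v in zip(weights, vectors))
                 for i in range(d)]
        out.append(mixed)
    return out

# Hypothetical 2-dimensional "embeddings" for a three-word sentence.
sentence = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(self_attention(sentence))
```

A unidirectional model would instead mask out the words to the right; here every position sees the full sentence, which is the essence of the bidirectional approach.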
When it comes to ranking results, BERT will impact about 1 in 10 search queries. The update is also being applied to make search better for people around the world: learnings from one language can be applied to improve results in many others. Google is using the BERT model to improve snippets in many countries, supporting over 70 languages, including Korean, Hindi, and Portuguese.
Yet BERT is more than just a search algorithm. It’s also a machine learning natural language processing framework, an evolving tool for computational efficiency, and an open-source research project and academic paper first published in October 2018 as “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.”
How It Works
The beauty of BERT is that no matter how words are spelled or the order in which they appear within a query, it figures out your search and surfaces pertinent information. BERT can train language models on the entire set of words in a sentence rather than on a traditional ordered sequence, such as left-to-right or combined left-to-right and right-to-left. Google can now disambiguate phrases made up of many words with multiple meanings.
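The difference between the traditional left-to-right view and BERT’s whole-sentence view can be sketched in a few lines of plain Python (a toy illustration of the idea, not how either model is actually implemented):

```python
sentence = "a quarter to nine".split()

def left_context(words, i):
    """A left-to-right language model sees only the words before position i."""
    return words[:i]

def full_context(words, i):
    """A bidirectional model like BERT conditions on words on BOTH sides."""
    return words[:i] + words[i + 1:]

i = sentence.index("to")
print(left_context(sentence, i))   # ['a', 'quarter']
print(full_context(sentence, i))   # ['a', 'quarter', 'nine']
```

Only with the right-hand word “nine” in view does “to” clearly read as part of a time expression rather than, say, a direction — which is exactly the kind of nuance described below.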
Also, there are subtle nuances of everyday language that computers don’t quite understand the way humans do. So when a search consists of a phrase, BERT interprets it and returns results based on how the sentence is constructed. This matters because even the simplest phrase can have a completely different meaning from a single word. For example, in the phrases “New York to LA” and “a quarter to nine”, the word “to” has different meanings, which can confuse a search engine. BERT distinguishes such nuances to deliver more relevant results.
RankBrain Is Still Kicking
RankBrain was Google’s first artificial intelligence method for understanding queries. It looks at both searches and the content of web pages in Google’s index to better understand the meanings of words. BERT does not replace RankBrain; it’s an extension that better understands content, natural language, and queries. RankBrain will still be used, but when Google determines that a query can be served better with the help of BERT, search will use the new model. Seems the adage is true...two search algorithms are better than one!
Smarter Search Results
As Google’s newest algorithmic update, BERT impacts search by better understanding natural language, particularly in conversational phrases. BERT will affect around 10% of queries, along with organic rankings and featured snippets. So this is a big deal for Google...and for all of us. With so many questions, finding relevant results that match our “normal” phrased queries is certain to make the search experience much easier. Happy searching!