Google today announced a change to its core Search algorithm that it says will better understand conversational search queries. Thanks to improvements in natural language analysis, the company says Search can now parse queries that reflect how people speak in real life and recognize the relevant context.
Particularly for longer, more conversational queries, or for searches where prepositions like “for” and “to” matter a lot to the meaning, Search should now understand the context of the words in a query, allowing users to search in a way that feels more natural.
The company attributes the improvements to a system it introduced last year called BERT, or Bidirectional Encoder Representations from Transformers, which lets Google analyze the context of a word in both directions within a sentence and return more pertinent results.
In a blog post announcing the change, Google offers the following example of BERT’s capabilities: a search for “2019 brazil traveler to usa need a visa.”
The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.
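The “bidirectional” part of BERT is what lets the model weigh a word like “to” against the words on both sides of it, rather than only the words that came before. The contrast can be sketched with a toy, stdlib-only Python example of scaled dot-product self-attention; the tiny hand-made word vectors here are purely illustrative assumptions, not anything from Google’s actual models:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(vectors, bidirectional=True):
    """Scaled dot-product self-attention over toy word vectors.

    With bidirectional=True (BERT-style), each word attends to words on
    BOTH sides of it; with bidirectional=False (a causal mask, as in
    left-to-right language models), a word can only see words to its left.
    """
    d = len(vectors[0])
    out = []
    for i, q in enumerate(vectors):
        scores = []
        for j, k in enumerate(vectors):
            if not bidirectional and j > i:
                scores.append(float("-inf"))  # mask out future words
            else:
                # dot product of query and key, scaled by sqrt(dimension)
                scores.append(sum(a * b for a, b in zip(q, k)) / math.sqrt(d))
        weights = softmax(scores)
        # each output vector is a weighted mix of all visible word vectors
        out.append([sum(w * v[t] for w, v in zip(weights, vectors))
                    for t in range(d)])
    return out

# Made-up 2-d "embeddings" for the words: brazil traveler to usa
tokens = ["brazil", "traveler", "to", "usa"]
vecs = [[1.0, 0.0], [0.5, 0.5], [0.1, 0.9], [0.9, 0.2]]

bi = self_attention(vecs, bidirectional=True)
causal = self_attention(vecs, bidirectional=False)
```

In the bidirectional pass, the representation of “to” (index 2) mixes in the vector for “usa”, which comes after it; in the causal pass it cannot, so the two representations differ. That ability to condition a word on its right-hand context is the mechanism the quoted example is describing.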
Google reckons BERT will help Search better understand one in 10 searches in the U.S. in English, and it plans to bring the capability to more languages and locales over time.
As far as web searches go, the changes usher in the “biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search,” said Pandu Nayak, a Google vice president.