Google has announced a change to its core search algorithms that it claims can better understand conversational queries and return more relevant results.

By applying improved natural language analysis, the tech giant claims it has improved its ability to analyse search queries and offer relevant results for as many as one in ten queries in US English, with support for other countries to come later.

The technology behind the new neural network is called "Bidirectional Encoder Representations from Transformers" (BERT), which Google first introduced last year. BERT models can consider the context of a word by looking at the words that come before and after it, which is particularly useful for understanding the intent behind search queries.

In essence, Google is claiming that it is improving search results by making the system better understand how each word in a sentence is related to the others. Previously, the search algorithm treated a sentence as a bag of words: it looked at the important words, like "pharmacy" or "medicines", and simply returned local results. The new algorithms are now able to understand the role of words like "for" and "to" to work out what the query actually means.

"By applying BERT models to both rankings and featured snippets in search, we're able to do a much better job helping you find useful information," Pandu Nayak, Google Fellow and Vice President, Search, said in a blog post on Friday. On average, Google sees billions of searches every day, and 15 per cent of those queries are ones it has not seen before.
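To illustrate the "bag of words" limitation described above, here is a minimal sketch. The queries and the helper function are hypothetical illustrations, not Google's actual pipeline: they simply show that a word-count view of a sentence discards exactly the relationships that words like "for" express.

```python
from collections import Counter

def bag_of_words(query):
    """Keyword-style view of a query: word counts only, order discarded."""
    return Counter(query.lower().split())

# Two hypothetical queries that contain the same words but mean
# opposite things (who is picking up medicine for whom):
picking_up = "can you get medicine for someone at a pharmacy"
being_helped = "can someone get medicine for you at a pharmacy"

# A bag-of-words view cannot tell them apart, because it ignores how
# "for", "you" and "someone" relate to the rest of the sentence.
print(bag_of_words(picking_up) == bag_of_words(being_helped))  # True
```

A model like BERT, by contrast, represents each word in light of its neighbours on both sides, so the two queries above would receive different representations despite sharing the same word counts.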