Microsoft Research, Bing Make Search A Little Smarter

Bing, with help from Microsoft Research, is using Natural Language Representation (NLR) models to improve search results. On Tuesday the Microsoft search engine introduced a feature that serves one-word answers for certain queries, along with a carousel of related excerpts drawn from a variety of sources.

The answers, “yes” or “no,” required large-scale, multilingual NLR models that Microsoft engineers developed to perform two separate but complementary tasks.

The technology assesses the relevance of each passage to the query, and it provides a definitive yes or no answer by reasoning over and summarizing multiple sources.

For the query "can dogs eat chocolate," the NLR technology infers from multiple sources that "chocolate is toxic to dogs," so it returns the answer "no."
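Microsoft has not published the details of how Bing combines evidence across passages, but conceptually the step looks like scoring each passage for "yes" versus "no" and then aggregating. The Python sketch below illustrates that idea under stated assumptions: `score_passage` is a hypothetical stand-in for the large multilingual NLR model, and the simple averaging rule is for illustration only, not Bing's actual method.

```python
# Minimal sketch of multi-source yes/no answering, NOT Bing's pipeline.
# score_passage is a hypothetical stub for a large NLR model.

from typing import List, Tuple


def score_passage(query: str, passage: str) -> Tuple[float, float]:
    """Hypothetical NLR scorer: returns (p_yes, p_no) for one passage."""
    # A real system would run a multilingual transformer here.
    if "toxic" in passage or "poisonous" in passage:
        return 0.05, 0.95
    return 0.5, 0.5


def answer(query: str, passages: List[str], threshold: float = 0.7) -> str:
    """Aggregate per-passage scores into a single yes/no (or no answer)."""
    p_yes = sum(score_passage(query, p)[0] for p in passages) / len(passages)
    p_no = sum(score_passage(query, p)[1] for p in passages) / len(passages)
    if p_yes >= threshold:
        return "yes"
    if p_no >= threshold:
        return "no"
    return "no definitive answer"


passages = [
    "Chocolate is toxic to dogs because it contains theobromine.",
    "Even small amounts of chocolate can be poisonous to a dog.",
]
print(answer("can dogs eat chocolate", passages))  # -> "no"
```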

This new search experience has already launched in the United States and will expand to more markets soon.

To return the best web pages, the search engine must first understand the intent behind a query.

Microsoft has used NLR for other tasks. Last year, the company created an NLR-based model to improve intelligent answers and caption generation in English-speaking markets. It also improved answers to queries by better understanding user intent.

Microsoft did this by creating an NLR-based model tuned to rate potential search results for a given query, using the same scale as human judges. This model understands complex and ambiguous concepts much better than its predecessor.
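As a rough illustration of what rating results on a judge-style scale might look like, here is a short Python sketch. The 0-4 label names, the keyword-overlap scoring heuristic, and the function names are assumptions for illustration; the model Microsoft describes is a large fine-tuned NLR model, not a heuristic like this.

```python
# Sketch of rating query-result pairs on a judge-style scale.
# The label set and scoring heuristic are illustrative assumptions.

LABELS = ["bad", "fair", "good", "excellent", "perfect"]  # hypothetical 0-4 scale


def relevance_score(query: str, title: str, snippet: str) -> float:
    """Stand-in for an NLR relevance model: returns a score in [0, 1]."""
    terms = set(query.lower().split())
    text = (title + " " + snippet).lower()
    hits = sum(1 for t in terms if t in text)
    return hits / max(len(terms), 1)


def judge_label(query: str, title: str, snippet: str) -> str:
    """Map the model score onto the same discrete scale human judges use."""
    score = relevance_score(query, title, snippet)
    index = min(int(score * len(LABELS)), len(LABELS) - 1)
    return LABELS[index]


print(judge_label(
    "brewery germany from year 1080",
    "Weihenstephan Brewery",
    "Bavarian state brewery in Germany, founded in 1040.",
))
```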

The query "brewery germany from year 1080," for example, still returns a useful answer, even though no known German brewery was founded that year. Bing assumes the user is searching for a very old brewery in Germany and may have misremembered or mistyped the year.

While the previous search model returned a generic list of German breweries, the new NLR-based model correctly identified the Weihenstephan Brewery, founded in that era, in 1040 rather than 1080.
