Could search engines harness the power of crowdsourcing to serve up better query results and target more specific ads? Today, search engines return links; this technique would instead place the direct answer in the results themselves, and not just for the most popular queries.
A group of researchers from Microsoft and the Massachusetts Institute of Technology found that crowdsourcing queries could return better results. The work -- titled "Direct Answers for Search Queries in the Long Tail," presented at the Association for Computing Machinery's Conference on Human Factors in Computing Systems, this week in Austin, Texas -- suggests that engines could return more relevant data through data mining and crowdsourcing.
In a survey of 361 participants, the researchers found that search engines could improve users' perception of search quality by answering queries more directly, using crowdsourcing to identify the answers to simple but frequently asked questions. Some might argue that this is already the job of question-and-answer site Ask.com, which is supported not only by its Google search results but also by a live community of humans willing to jump in and answer questions.
The Microsoft and MIT research team created a custom version of the Bing search engine that inserted Tail Answers at the top of the search results whenever the user issued a matching query. Tail Answers are defined as "special results inserted inline in the search interface." A Tail Answer contains edited text from a Web page where other searchers found the answer to the same query. To create a Tail Answer, the platform needs to identify pages that are answer candidates, filter out queries that answers cannot address, and extract the Tail Answer content.
The researchers had data-mining software sift through more than 2 billion log events from more than 75 million Bing search queries issued by about 15 million users, looking for queries that resulted in click-throughs to another site. They then identified the queries that could be answered succinctly in the results. Overall, adding Tail Answers had a positive effect on users' search experience, as reflected in their ratings, according to the research paper.
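The mining-and-filtering pipeline described above can be sketched roughly as follows. The log records, threshold, and function names here are illustrative assumptions, not the paper's actual implementation; in the real system, steps such as filtering unaddressable queries and extracting answer text were done by paid crowd workers rather than in code.

```python
from collections import Counter

# Hypothetical search-log records: (query, clicked_url, was_final_click).
# A "final" click means the searcher stopped there, suggesting the page
# answered the query.
log = [
    ("dissolvable stitches how long", "medhelp.example/stitches", True),
    ("dissolvable stitches how long", "medhelp.example/stitches", True),
    ("dissolvable stitches how long", "forum.example/q123", False),
    ("410 area code", "areacodes.example/410", True),
    ("best laptop", "reviews.example/laptops", False),
]

def candidate_answer_pages(log, min_support=2):
    """Step 1: find (query, page) pairs where many searchers ended their
    search on the same destination page."""
    support = Counter()
    for query, url, final in log:
        if final:
            support[(query, url)] += 1
    return {pair for pair, n in support.items() if n >= min_support}

def filter_addressable(candidates, is_succinct):
    """Step 2: keep only queries that a short, factual answer can address
    (in the paper, crowd workers made this judgment)."""
    return {(q, u) for (q, u) in candidates if is_succinct(q)}

# Step 3, extracting and editing the answer text from the candidate page,
# would likewise be handed to crowd workers rather than automated here.
cands = candidate_answer_pages(log)
answerable = filter_addressable(cands, lambda q: "best" not in q)
```

With this toy log, only the stitches query clears the support threshold, and it survives the (stand-in) succinctness filter; the area-code query has too little evidence and the open-ended laptop query never ends in a final click.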
Since search engines want to return information rather than links, the researchers conclude that crowdsourcing -- that is, Tail Answers -- could provide more relevant information for a small fee. (The research paper details the costs to build the feature and to serve each returned query, using a crowdsourcing product called CrowdFlower.)
The impact of Tail Answers on users' ratings was nearly half as large as that of result ranking, where search engines focus much of their effort. The positive effect more than doubled when participants were asked whether they still needed to click through to a URL, and answers were able to fully compensate for less relevant search results. The findings suggest that a single direct answer can become as important as good search engine ranking.