Pandu Nayak, a former NASA research scientist who joined Google 14 years ago to work on its search engine, told the Guardian that mass shootings present a growing challenge to the search engine's ability to deliver accurate results.
The tragic increase in shootings, especially over the past few years, has fueled a wave of misinformation online. Much of it ends up at the top of search results as more publications pick up and aggregate the news.
The algorithm, he said, now recognizes that a “bad event is taking place and that we should increase our notions of ‘authority,’ increase the weight of ‘authority’ in our ranking so that we surface high quality content rather than misinformation in this critical time here.”
Google has published a multipage document defining what the company means by “authority”: essentially, pages that comply with Google’s search quality guidelines, first published in 2013. Human raters also evaluate pages and their content against those guidelines.
Those raters are responsible for checking tweaks to Google’s algorithm to ensure the best results. They score search results along two dimensions: one rating marks whether the searcher’s needs are met with the correct content; the other marks the page’s quality, which the guidelines define across more than 80 pages, ranging from very high-quality main content and a very high level of expertise, authoritativeness, and trustworthiness to a very positive reputation.
In 2017, per the report, Google gave raters the ability to flag search results as “upsetting-offensive” after the Guardian and Observer published a series of stories showing how the search engine promoted extremist content, highlighting one result that questioned whether the Holocaust actually happened. That story was intended to promote Holocaust denial, according to the Guardian.