Google turned over a new leaf on Friday. In a blog post, the company explained why it describes autocomplete results as search “predictions” rather than suggestions.
The autocomplete feature serves up “predictions” based on how Google’s algorithm expects someone searching on google.com will likely finish typing the query in the search box.
The word “suggestions,” Google says, would imply a bias on its part or imply “new types of searches to be performed.”
The post explains and shows examples of how the predictions feature works with Google search for desktop and mobile.
“The predictions change in response to new characters being entered into the search box,” according to Google. “For example, going from ‘san f’ to ‘san fe’ causes the San Francisco-related predictions shown above to disappear, with those relating to San Fernando then appearing at the top of the list.”
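The narrowing behavior Google describes can be illustrated with a toy sketch. This is not Google’s implementation — the candidate list, function name, and simple prefix matching are all hypothetical stand-ins for an algorithm Google has not published — but it shows how adding one character can make an entire set of predictions drop out:

```python
# Hypothetical candidate pool; Google's real system draws on common and
# trending queries, not a fixed list.
CANDIDATES = [
    "san francisco weather",
    "san francisco giants",
    "san fernando valley",
    "san fernando mission",
]

def predictions(query: str, limit: int = 10) -> list[str]:
    """Return up to `limit` candidates that start with the typed query."""
    q = query.lower()
    return [c for c in CANDIDATES if c.startswith(q)][:limit]

# Typing "san f" still matches both cities; adding one more character
# ("san fe") removes every San Francisco-related prediction.
print(predictions("san f"))
print(predictions("san fe"))
```

The `limit` parameter mirrors the caps Google mentions (up to 10 predictions on desktop, up to five on mobile), though the real ranking is far more involved than simple prefix filtering.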
It's not clear whether an algorithm framed around predictions rather than suggestions will get advertisers any closer to serving perfectly targeted search ads to consumers. One thing is certain: Google wants to eliminate any appearance of bias in query results. Has it become a matter of semantics, predictions vs. suggestions? Google says the change will make predictions more useful.
“When you’re using Google on desktop, you’ll typically see up to 10 predictions,” per Google. “On a mobile device, you’ll typically see up to five, as there’s less screen space.”
On mobile or Chrome on desktop, Google might show information like dates, local weather, and sports information.
Google will remove some predictions when they violate specific guidelines. The predictions served are common and trending queries related to what someone begins to type, but those removed include sexually explicit predictions that are not related to medical, scientific, or sex education topics.
Google also will remove spam; predictions closely associated with piracy; hateful predictions against groups and individuals on the basis of race, religion, or several other demographics; violent predictions; predictions of dangerous and harmful activity; and predictions removed in response to valid legal requests.
Last year the company launched a feedback tool and has since been using the data to make improvements. In the coming weeks, Google will begin applying what it calls “expanded criteria” to hate and violence, broadening the types of predictions it removes. There may be some exceptions, such as predictions for song lyrics or book titles, which might appear in the predictions box only when combined with words like “lyrics” or “book” or other cues indicating the person is searching for a specific work.