Google fields searches daily on topics like suicide, sexual assault, and domestic abuse. The company -- which has integrated its latest machine-learning model, Multitask Unified Model (MUM), into its search engine -- said it has found that artificial intelligence (AI) does a good job of parsing the complexities of language and directing those searchers to the information they need.
According to the company, MUM can "more accurately detect a wider range of personal crisis searches." Google unveiled MUM at its I/O conference last year and has since brought it to products such as Search and Lens, where it helps answer complex, related questions.
Today, when someone searches on Google for information related to the topics of suicide, sexual assault, substance abuse and domestic violence, they will see contact information for national hotlines alongside the most relevant and helpful results. That will change in the coming weeks, as Google integrates MUM into search.
People in personal crises need all types of information, and it’s not always obvious to Google which types of information they need.
“If we can’t accurately recognize that, we can’t code our systems to show the most helpful search results,” wrote Pandu Nayak, Google fellow and vice president of search, in a post. “That's why using machine learning to understand language is so important.”
MUM analyzes the intent behind the words in someone's query and can detect a person in need, he wrote.
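To make the idea concrete, here is a toy sketch (not Google's MUM, whose internals are not public) of why intent detection beats simple keyword matching: a keyword filter misses a crisis query phrased obliquely, while even a crude similarity measure against exemplar crisis phrases can flag it. Every phrase, keyword list, and threshold below is invented for illustration.

```python
import re

# Hypothetical data for demonstration only -- not from any real system.
CRISIS_KEYWORDS = {"suicide", "hotline", "abuse"}
CRISIS_EXEMPLARS = [
    "i want to end my life",
    "how to get help after an assault",
    "i feel hopeless and alone",
]

def tokens(text: str) -> set:
    """Lowercase word tokens with punctuation stripped."""
    return set(re.findall(r"[a-z']+", text.lower()))

def keyword_match(query: str) -> bool:
    """Naive approach: flag only if an explicit crisis keyword appears."""
    return bool(tokens(query) & CRISIS_KEYWORDS)

def exemplar_similarity(query: str) -> float:
    """Crude stand-in for learned semantics: best Jaccard overlap
    between the query and any exemplar crisis phrase."""
    q = tokens(query)
    best = 0.0
    for exemplar in CRISIS_EXEMPLARS:
        e = tokens(exemplar)
        best = max(best, len(q & e) / len(q | e))
    return best

query = "i feel so hopeless, is there any help"
print(keyword_match(query))              # False: no crisis keyword present
print(exemplar_similarity(query) > 0.2)  # True: overlaps a crisis exemplar
```

A real system would use learned embeddings rather than word overlap, but the gap the sketch shows is the same one Nayak describes: the words of a crisis query often don't contain the obvious keywords.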
The technology can transfer what it learns across 75 languages: when Google trains one MUM model to perform a task, the model learns to do that task in every language it knows.
Keeping someone safe also means steering them away from unexpected, shocking results, which can be harder than it sounds. Content creators sometimes use benign terms to label explicit or suggestive content, Nayak wrote, adding that "even if people aren't directly seeking explicit content, it can show up in their results."
SafeSearch mode, which is on by default for users under 18, gives users the option to filter out explicit results. And even when SafeSearch is off, Google's technology still reduces unwanted racy results.
Nayak also explained how the company is using AI technologies like BERT -- short for Bidirectional Encoder Representations from Transformers -- to better understand the type of information people want. "BERT has improved our understanding of whether searches are truly seeking out explicit content, helping us vastly reduce your chances of encountering surprising search results," he wrote.
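The "bidirectional" part of BERT's name is the key to Nayak's point: the model weighs the words on both sides of an ambiguous term. A toy sketch (not BERT, which uses learned transformer representations) shows the idea: the same ambiguous word reads as benign or explicit depending on its surrounding context. The queries, word lists, and scoring rule below are made up for illustration.

```python
# Hypothetical context-word lists for demonstration only.
EXPLICIT_CONTEXT = {"nsfw", "uncensored", "xxx"}
BENIGN_CONTEXT = {"anatomy", "medical", "classes", "renaissance", "art"}

def context_score(query: str, term: str) -> int:
    """Score the words around `term`: positive suggests explicit intent,
    negative suggests benign intent, zero if the term is absent."""
    words = query.lower().split()
    if term not in words:
        return 0
    i = words.index(term)
    # Consider context on BOTH sides of the term (the "bidirectional" idea).
    context = words[:i] + words[i + 1:]
    return (sum(w in EXPLICIT_CONTEXT for w in context)
            - sum(w in BENIGN_CONTEXT for w in context))

print(context_score("figure drawing classes with nude models", "nude"))  # negative: benign
print(context_score("uncensored nude pics nsfw", "nude"))                # positive: explicit
```

A term-only filter would treat both queries identically; scoring the full context is what lets a system serve the art student while still filtering the explicit request.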