Google today announced new applications for its Multitask Unified Model (MUM) technology during its virtual Search On event, including multimodal search with Google Lens, related topics in videos, and new search result features.
These features will provide users with new ways to search and give marketers opportunities to expand on the ways they reach consumers.
Some of the features will support holiday campaigns and shopping, while others will improve on the way consumers use search to find information and local goods.
Pandu Nayak, vice president of search at Google, discussed how the company is integrating MUM technology with Google Lens, an app that lets users take a photo and add a query to find answers.
Some of those answers help solve problems, such as fixing a broken camping stove or a mountain bike. With MUM, a consumer can point a phone camera at an unfamiliar object, such as an unknown bicycle part, take a picture and upload it through Lens. Google returns matching images; the consumer then types in "how to fix it," and the query pulls answers from blogs, forums and websites, making optimization much more important.
Early next year, consumers will be able to take a picture of a pattern on a shirt and ask Google to find the same pattern on another article of clothing, such as socks. The query returns stores where consumers can find the item locally.
The technology also will identify and serve up related topics not explicitly mentioned in a video. The MUM-based experience will launch in the coming weeks.
Topic features supported by MUM will also roll out in search engine results pages in the coming months. They include Things to know, as well as features to refine and broaden searches.
The Things to know feature lists aspects of the topic the user searched for. It can show users the different dimensions other people typically search for, which may help them get to the information they are looking for faster. And a visually browsable results page is geared toward searches where the user needs inspiration or wants to explore information visually.
Just in time for the holidays in the U.S., Google today launched a shoppable experience that aims to make it easier to browse for apparel on mobile from search results. It’s powered by Google’s Shopping Graph, a real-time dataset of products, inventory and merchants with more than 24 billion listings.
There’s also an in-stock filter to see whether nearby stores have specific items on their shelves. It launches today in English in the U.S., the UK, Australia, Austria, Brazil, Canada, Denmark, France, Germany, Japan, the Netherlands, New Zealand, Norway, Sweden and Switzerland.