Google Focuses On Making Search More Visual For The TikTok Era

Google is changing the traditional search experience, allowing users to explore information more naturally.

At its Search On event on Wednesday, the company showed how advances in artificial intelligence (AI) are enabling its developers to transform its products and make search a more visual experience.

The idea is to go beyond the search box to create experiences that work more like human minds, and that are as multidimensional as we are as people, according to Prabhakar Raghavan, Google senior vice president.

“We call this making search more natural and intuitive, and we’re on a long-term path to bring this vision to life for people everywhere,” he said.

Google introduced multisearch earlier this year as a beta in the U.S. and, at Search On, announced it will expand the feature to more than 70 languages in the coming months.

Taking it a step further, multisearch near me will enable users to take a picture of an unfamiliar item, such as a dish or plant, and then find it at a nearby place like a restaurant or gardening shop. Multisearch near me is planned to roll out in English in the U.S. this fall.

The changes will not only make search a more visual experience but also highlight map snippets, imagery and even video in new ways.

Google said it will bring search to Live View, a feature that overlays arrows and directions on the phone’s camera view of the world, allowing people to find essential places such as shops, ATMs and restaurants.

In addition to Search with Live View, Maps is getting several new features in the coming months, including Immersive View and Neighborhood Vibe.

The company says all of the features are part of building a visual-first Maps experience to help users navigate the world.

The upcoming Immersive View is designed to help users plan ahead and gain a deeper understanding of a city before arriving. It leverages computer vision and artificial intelligence to fuse Street View and aerial imagery, layering on information such as weather, traffic and crowds.

Using predictive modeling, Immersive View learns historical trends to determine what an area will be like tomorrow, next week, or next month.

The feature will launch in Los Angeles, New York, San Francisco, and Tokyo in the coming months on Android and iOS. More cities will be added in the future.

Neighborhood Vibe lets users instantly see what a neighborhood is like through photos and information from the Google Maps community, surfacing trendy places to highlight what makes a specific neighborhood interesting.

The new feature is launching globally in the coming months on Android and iOS.