Google is testing the ability to have a conversation with its search app, where it continues to refine the search query as the user asks follow-up questions.
The feature is likely part of Google’s broader push into multimodal technology, which accepts inputs including text, voice, images, video and sound.
In this version of search, Google appears to be testing the ability for users to ask questions in a more conversational tone using multimodal inputs.
A demo of the new conversational search in the Google app shows it continuously listening to the user’s voice, so users can ask follow-up questions while performing a search, wrote AssembleDebug (Shiv), who spends his time uncovering unreleased features like this one in Google apps, in a post on X.
“The conversational search feature in the Google app is a work in progress that needs more work,” he wrote in an email to MediaPost.
He said the feature could become more useful when paired with AI Overviews in Google Search, because AI Overviews surface the important information in summarized form, letting users read it quickly and ask another question with a voice prompt rather than typing.
“I am not using it much myself personally, because I really don't do a lot of searching, but other users may find it more useful,” he wrote.
Conversational search is not new to Google, but the ability to use multiple inputs simultaneously and continuously is new.