
An ecosystem of consumer apps, and what seems like billions of
daily interactions, has given Google an advantage in making its Gemini app the smartest personal assistant.
Google on Wednesday introduced a beta feature called "Personal Intelligence" that lets
its Gemini app share data from a variety of its platforms to personalize responses.
Personal Intelligence can retrieve details from a user’s texts, photos, or videos like their YouTube
watch history to customize Gemini’s responses. It also has access to data from Search, Shopping, News, Maps, and Google Flights and Hotels.
The feature offers tailored answers and reasons across complex
sources. Google rolled it out in the U.S. today for Google AI Pro and AI Ultra subscribers, and in the future the company plans to bring the technology to the free Gemini app and AI Mode
in Search.
Google gave this example, which I can completely relate to.
“We needed new tires for our 2019 Honda minivan two weeks ago,” Josh Woodward, vice president for
Google Labs & Google Gemini, wrote in a blog post. “Standing in line at the shop, I realized I didn't know the tire size.”
So, he asked Gemini, which found the tire specs and
suggested different options depending on daily driving and for all-weather conditions, referencing his family road trips to Oklahoma.
Gemini mentioned Oklahoma based on
photographs it found in Google Photos. It also pulled ratings and prices for each of the tire options it suggested.
What’s even scarier, Woodward needed the license plate number
for his car. Rather than going outside to get the number, he asked Gemini, which pulled the seven-digit number from a photograph he had uploaded to Google Photos.
In this instance, Woodward
was candid about Gemini's limitations, saying it may “struggle with timing or nuance, particularly regarding relationship changes, like divorces, or your various interests.”
For example, seeing
hundreds of photos of a person at a golf course might lead the model to assume that person loves golf. Gemini could miss the nuance that you don’t love golf; your son does, and that’s why
you’re there. Woodward said all you need to do is tell Gemini you don’t love golf, and it will remember.
Google also released a paper that takes a deeper look at its methodology, current limitations, and how
developers are working to fix them.
In the report, the team recognizes a “tendency for the model to rely too heavily on a personalized inference where it’s not appropriate,” a
phenomenon Google calls “tunnel vision.”
For example, someone might love coffee shops, and the model learns that through Personal Intelligence. Asked to
“plan a trip to Australia,” it may then produce an itinerary focused on coffee shops.
The model can also overlook corrections, mix up timelines, and misinterpret
relationships, as the golf course example demonstrates.
The paper acknowledges that the release of these "features is just the first step in a broader research program
designed to deliver a universal assistant that is helpful and truly personal" and "to achieve this vision, we are continuing to do robust research in areas like securely integrating additional
personal data and improving retrieval, long-context usage and model quality for personalization."
I’m not sure I want technology to know me that well. What say you?