
Google on Monday began rolling out real-time search for the Web and mobile Web that automatically updates results on the search page.
At an event at the Computer History Museum in Mountain View, Calif., Google highlighted the
future of its search algorithm, as well as audio- and visually assisted capabilities for mobile.
Marissa Mayer, Google's vice president of search products and user experience, kicked off the
event by describing Google's vision for search as being built on four modalities: modes, media, language, and personalization.
Amit Singhal, a Google fellow, demonstrated the real-time feature, calling it "Google relevance technology meets the real-time Web."
A scroll bar to the right of the results on a PC lets people
move forward or backward through updates they might have missed. The feature integrates news from sources such as The Wall Street Journal, tweets from Twitter, and status updates from Facebook and
MySpace.
Clicking on the "latest results" tab in the Google search engine automatically refreshes Internet content in real time. Singhal demonstrated how a search for "Obama"
would stream tweets, Web pages, and other Internet content as it is generated. Taking the demonstration a step further, Google search guru Matt Cutts sent a tweet from the audience, and it was instantly
indexed and appeared in the results.
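Conceptually, the "latest results" view behaves like a client that repeatedly asks for anything newer than the last item it has shown. The sketch below illustrates that polling pattern with a hypothetical endpoint and response fields chosen for illustration only; it is not Google's actual interface.

```python
import time
import requests  # third-party HTTP client

SEARCH_URL = "https://example.com/realtime-search"  # hypothetical endpoint, not Google's API

def poll_latest(query, interval=5.0):
    """Repeatedly fetch results newer than the last one seen and yield them."""
    newest_seen = 0  # timestamp of the most recent result already shown
    while True:
        resp = requests.get(SEARCH_URL, params={"q": query, "since": newest_seen})
        resp.raise_for_status()
        # Assume the service returns a list of results with timestamps.
        for item in sorted(resp.json().get("results", []), key=lambda r: r["timestamp"]):
            if item["timestamp"] > newest_seen:
                newest_seen = item["timestamp"]
                yield item  # a new tweet, status update, or page to prepend to the results
        time.sleep(interval)  # wait before asking for anything newer

if __name__ == "__main__":
    for result in poll_latest("Obama"):
        print(result["source"], result["title"])
```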
Social network members will be responsible for managing their privacy and the type of information fed into query results. The announcement that MySpace will
participate raised questions about News Corp chief Rupert Murdoch's decision to wall off some content from his publishing sites. Mayer declined to confirm whether an agreement exists to pay News Corp for the
content.
Precautions have been put in place to keep spammers from "gaming the system" and taking advantage of real-time search, Singhal says.
The
feature will initially become available in English-speaking countries, with other languages to follow in the first quarter of 2010.
Google also demonstrated Google Goggles, a Google Labs project
available today that focuses on computer vision for mobile phones. It lets a person take a picture of a product or object with a camera phone and search the Internet for information about it. Images
are sent to Google's cloud, where algorithms search an index for matches. The best matches are ranked and sent back to the mobile device within seconds. The product works well in certain
categories, but not all. Today, users must frame the entire object and snap a picture, but soon only a portion of the image will be needed.
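The flow Google described for Goggles (snap a photo, upload it, match it against an index, get ranked results back) resembles the client-side sketch below. The endpoint, parameters, and response fields here are assumptions made for illustration, not the real Goggles interface.

```python
import requests  # third-party HTTP client

MATCH_URL = "https://example.com/visual-search"  # hypothetical image-matching service

def visual_search(image_path, top_k=5):
    """Send a photo to a (hypothetical) visual-search service and return ranked matches."""
    with open(image_path, "rb") as f:
        resp = requests.post(MATCH_URL, files={"image": f}, data={"top_k": top_k})
    resp.raise_for_status()
    # The service is assumed to return matches already ranked by score.
    return resp.json()["matches"]

if __name__ == "__main__":
    for rank, match in enumerate(visual_search("product_photo.jpg"), start=1):
        print(rank, match["label"], match["score"])
```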
"We are at the beginning of the beginning," says Vic
Gundotra, vice president of Google engineering, explaining that cloud computing will give Google the power to offer these types of applications.
A feature Google hopes to deliver in 2010 will
provide near real-time translation in multiple languages on a mobile phone by speaking the query into a search engine application. "Hi, my name is Vic -- can you show me where the nearest hospital
is?" he says. The spoken request is digitized and processed, and the translated query is returned.
Location has also been a focus for Google. Combining real-time location with
inventory feeds from retail stores will give consumers access to information about merchandise in stock at nearby stores. It will provide inventory data, such as price and size, similar to the information
offered by search engine Milo.com.
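Matching a shopper's location against store inventory feeds is essentially a join between nearby stores and their stock records. The sketch below shows that idea with made-up feed data (price, size, stock status); it is not an actual retailer integration.

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

# Illustrative inventory feed entries: store location plus stocked items.
INVENTORY_FEED = [
    {"store": "Shoe Outlet", "lat": 37.39, "lon": -122.08,
     "items": [{"name": "running shoe", "size": "10", "price": 59.99, "in_stock": True}]},
    {"store": "Sports Depot", "lat": 37.45, "lon": -122.16,
     "items": [{"name": "running shoe", "size": "10", "price": 64.99, "in_stock": False}]},
]

def nearby_in_stock(query, user_lat, user_lon, radius_km=10):
    """Return in-stock items matching the query at stores within the given radius."""
    hits = []
    for store in INVENTORY_FEED:
        if distance_km(user_lat, user_lon, store["lat"], store["lon"]) > radius_km:
            continue
        for item in store["items"]:
            if query in item["name"] and item["in_stock"]:
                hits.append((store["store"], item["size"], item["price"]))
    return hits

if __name__ == "__main__":
    print(nearby_in_stock("running shoe", 37.40, -122.10))
```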