Searching for the Future With Google Labs

Each day, hundreds of millions of people search on Google for information. While more than half of those searches come from outside the United States, many of the innovations behind them begin with Google engineers and Google Labs. In fact, Google spent 12.8% of its revenue on research and development in 2008.

Online Media Daily spoke with R.J. Pittman, director of product management at Google, about Google Labs' strategy and the impact it has on millions of users, publishers and advertisers. Pittman said Google Labs projects provide a sneak preview of the company's future. He also discussed the challenges and opportunities facing the search engine over the next 10 years.

How does Google Labs fit into the innovation cycle for email and search?

Pittman: The relationship between engineering and end users is central to the innovation cycle. The more we can close the gap between the moment an idea pops into an engineer's mind and the moment it's handed off to end users for evaluation, the faster we can innovate.

That feedback is essential. Although Labs has been around for years, we hadn't relied on it as a mechanism for establishing a dialogue between engineers and end users. We want to make it easy for engineers to launch early and often, and to give end users the ability to find features and functions of interest even if they aren't perfect and polished. It's a great feedback loop for what isn't working, what is working and what needs work.

How do engineers submit 20%-time projects to Google Labs?

Pittman: The 20%-time projects let engineers listen to user feedback. There are dozens of Gmail experiments running simultaneously, for example. Each is equally important to the evolution of the product.

Google Labs is organized so it can push out experiments through Gmail Labs, Google Code Labs, Search Experiments and Toolbar Labs, and we will replicate that model across all the product families at Google.

Engineers close to Gmail are able to build, test and launch features in Gmail as experiments. People opt in to use them. A link in Gmail lets you click a button to activate a feature, and it remains a bolt-on for the user for as long as it's left on. That tells engineers which features matter most to users. There are about two dozen add-ons in Gmail today.
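To make the opt-in model he describes concrete, here is a minimal sketch of per-user experimental feature flags. The class and feature names are hypothetical, assumed for illustration only; this is not Gmail's actual implementation.

```python
from collections import defaultdict

class LabsFeatures:
    """Toy per-user opt-in registry for experimental features
    (hypothetical; a stand-in for the real mechanism)."""

    def __init__(self, available):
        self.available = set(available)   # experiments engineers have launched
        self.enabled = defaultdict(set)   # user -> features they switched on

    def enable(self, user, feature):
        if feature not in self.available:
            raise ValueError(f"unknown experiment: {feature}")
        self.enabled[user].add(feature)   # stays a "bolt-on" until turned off

    def disable(self, user, feature):
        self.enabled[user].discard(feature)

    def is_on(self, user, feature):
        return feature in self.enabled[user]

    def adoption(self, feature):
        """Rough signal of which features matter most to users."""
        return sum(feature in feats for feats in self.enabled.values())

labs = LabsFeatures(["offline_mode", "undo_send"])
labs.enable("alice@example.com", "undo_send")
print(labs.is_on("alice@example.com", "undo_send"))  # True
print(labs.adoption("undo_send"))                    # 1
```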

Both News Timeline and Similar Images found their way into Google Labs because they are recent bits of innovation not quite ready for production. In Google Labs, users can immediately get the benefits of using them while we make them ready for distribution.

We're curious about how people use them. We see many people search for Paris, but do they want information on Paris, France, or on Paris Hilton? We're curious to see whether presenting people with options for visual refinement shifts the search from the keyboard to the click. It's complex in theory, but simple in practice. It's nearly impossible to describe a picture or image in one or two words. In Similar Images, the image becomes the query. That's a big shift in the basic thinking about search.

How did Google change the back-end infrastructure or algorithms to make Similar Images possible?

Pittman: It requires heavy use of computer vision technology. It's analogous to ordinary Google searches, where text analysis reads the written words on a Web page to understand how relevant the page is to each query. The big leap forward is the ability to study the images on a page and use software to understand what's in them.

The vision research technology creates a similarity index that gives Google an understanding of the likeness of one image to another. Independent of the substructure of the Web and the link structure of the Internet, we look at one image and visually compare it to another.
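To illustrate the idea of an image similarity index in which "the image becomes the query," here is a minimal sketch. It assumes simple color-histogram features and cosine similarity as stand-ins for Google's unpublished computer-vision descriptors; all function names are illustrative.

```python
import numpy as np

def histogram_features(image, bins=8):
    """Toy feature extractor: a per-channel color histogram
    (a stand-in for real computer-vision descriptors)."""
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(image.shape[-1])]
    vec = np.concatenate(feats).astype(float)
    return vec / (np.linalg.norm(vec) + 1e-9)   # unit-normalize

def build_index(images):
    """Stack one feature vector per image into a matrix: the 'similarity index'."""
    return np.stack([histogram_features(img) for img in images])

def most_similar(index, query_image, k=3):
    """The image is the query: rank indexed images by cosine similarity."""
    q = histogram_features(query_image)
    scores = index @ q                           # cosine similarity of unit vectors
    order = np.argsort(-scores)[:k]
    return order, scores[order]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    library = [rng.integers(0, 256, size=(64, 64, 3)) for _ in range(10)]
    index = build_index(library)
    ids, scores = most_similar(index, library[4])
    print(ids, scores)   # the query image itself should rank first
```

Note that, as Pittman says, nothing here depends on the Web's link structure; ranking comes purely from comparing one image's features to another's.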

Is there a missing search technology?

Pittman: Yes, we have yet to solve user intent. Google's goal is to help you find exactly what you're looking for in as few steps as possible. One way to get there is by understanding what a user means by the words typed into the search box.

I'm not talking about a specific implementation such as semantic search. There are interesting semantic technologies and experiments that can provide some context around search. I liken it to what we do with Google Suggest. As you start to type the first word of a query, we suggest refinements and show a quantity measurement: how many search results there are for the words you type. It also suggests the sentence or phrase you might type to match the intent of the query.
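A minimal sketch of that behavior, assuming a small in-memory query log with made-up result counts (not real Google data), might look like this:

```python
# Hypothetical query log: (query, approximate result count). Illustrative only.
QUERY_LOG = [
    ("paris france hotels", 12_500_000),
    ("paris hilton", 8_200_000),
    ("paris metro map", 3_100_000),
    ("parakeet care", 950_000),
]

def suggest(prefix, log=QUERY_LOG, k=3):
    """Return up to k suggested completions sharing the typed prefix,
    ranked by result count, so the user sees both a refinement
    and a rough measure of how many results each one has."""
    matches = [(q, n) for q, n in log if q.startswith(prefix.lower())]
    return sorted(matches, key=lambda qn: -qn[1])[:k]

print(suggest("paris"))
# [('paris france hotels', 12500000), ('paris hilton', 8200000), ('paris metro map', 3100000)]
```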

Google Suggest is a good first step. It's one of many interesting ways we help identify user intent in the search query. User intent is a wide-open frontier. It will require many techniques to arrive at the perfect solution in anticipating the searcher's intent.

How will this change the future of search?

Pittman: If we can accurately predict and target your search intent, the quality of results improves. You may only need to conduct one search, or a few, to find the information. If we can determine and return -- with a high degree of confidence -- the results you're looking for, we can deliver exactly the information you need. That's what search might look like in the distant future.

As Google continues to perfect search results by reducing the number of times someone needs to query a word or phrase, how does that affect the company's paid search offering?

Pittman: Google's driving principle has been to build solutions in the end users' best interest. If we continue to hold to that principle, revenue models will follow, as they have since the early days of the company.

While the model will evolve, I can't predict today what it may look like tomorrow. You can, however, imagine that if you understand the end user's intent and the value of advertising, it might provide the ability to target advertising with even greater accuracy than we can today.

As we improve search, we also improve advertising. Advertisers, publishers and end users benefit as the rising tide lifts all boats.
