Written by Nitin Karandikar; all photos by Jeremiah Owyang

The highlight of the Searchnomics 2007 conference today was the keynote, at the very end, by Marissa Mayer, Google's Vice President of Search Products & User Experience. Her presentation, titled "The Future of Search," covered eight areas that Google is focusing on now and in the near future.

1. Automated Translation

Mayer began by talking about the vision for automated translation: to break the language barrier by letting users find anything, in any language. She highlighted Google's CLIR (cross-language information retrieval) technology, which translates a search query into another language and then translates the results back. For example, a search for "restaurants in New York" typed in Arabic would be converted into English, matched against the wealth of English-language content about New York restaurants, and the results would be translated back into Arabic. Given that there is likely to be very little content about New York restaurants written directly in Arabic, this greatly expands the information available to someone searching in that language. According to Mayer, Google could someday automatically search content in all languages and present all the translated results to the user on the same page, regardless of language!
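
To make that round trip concrete, here is a minimal sketch of a CLIR-style pipeline in Python; the translate and web_search functions are hypothetical stand-ins for real machine-translation and search backends, not Google APIs.

```python
# Minimal sketch of the CLIR round trip Mayer described.
# translate() and web_search() are hypothetical stand-ins, not real Google APIs.

def translate(text: str, source: str, target: str) -> str:
    """Pretend machine-translation call (assumed interface)."""
    raise NotImplementedError("plug in a real MT system here")

def web_search(query: str) -> list[dict]:
    """Pretend search call returning [{'title': ..., 'snippet': ...}, ...]."""
    raise NotImplementedError("plug in a real search backend here")

def cross_language_search(query: str, user_lang: str, target_lang: str = "en") -> list[dict]:
    # 1. Translate the user's query into the language with the richest content.
    translated_query = translate(query, source=user_lang, target=target_lang)
    # 2. Run the search against content in the target language.
    results = web_search(translated_query)
    # 3. Translate titles and snippets back into the user's language.
    return [
        {
            "title": translate(r["title"], source=target_lang, target=user_lang),
            "snippet": translate(r["snippet"], source=target_lang, target=user_lang),
        }
        for r in results
    ]

# Example: an Arabic query for "restaurants in New York" searched against English content.
# results = cross_language_search("مطاعم في نيويورك", user_lang="ar")
```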

2. Google Book Search

In highlighting Google Book Search, Mayer explained that the larger and more comprehensive the underlying index, the better the search results presented to the user. As part of its library program, Google is working with 16 major libraries (including 6 international libraries) to scan their books, she said, as well as with publishing partners in all four major sectors, to bring all of that content online and make it searchable. More interestingly, Google is adding metadata about each book, so that its algorithms can understand what the book is about, what it references, and whether the content is available.
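
To make the metadata idea concrete, here is a small hypothetical sketch of the kind of structured record such a system might attach to a scanned book; the field names are my own illustration, not Google's schema.

```python
# Hypothetical sketch of a book-metadata record of the kind Mayer described;
# the field names are illustrative only, not Google's actual schema.
book_record = {
    "title": "An Example History of Information Retrieval",
    "authors": ["A. Author"],
    "subjects": ["search engines", "information retrieval"],  # what the book is about
    "cited_works": ["Another Example Title"],                  # relevant references
    "availability": {
        "full_view": False,      # e.g. still under copyright
        "snippet_view": True,
        "partner_libraries": 2,
    },
}

def is_fully_viewable(record: dict) -> bool:
    """A search feature can key off availability, e.g. to offer a full-text link."""
    return record["availability"]["full_view"]

print(is_fully_viewable(book_record))  # False
```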

I found this metadata functionality particularly interesting - are we seeing the start of the Semantic Web?

[During the Q&A session at the end, Chris Boggs asked how Google was planning to handle the copyright violation issues that are sure to come up with Google Books and with Gadgets. Mayer responded calmly with a straightforward (and beautifully scripted) answer: Google is committed to working with copyright holders to meet their needs and is reaching out to them to help them achieve their goals.]

3. Images and Video

On this topic, Mayer noted that one of their recent changes is to include all web videos in Google search; it is no longer limited to content within Google Video. [During the Q&A session, she clarified that blogs would almost certainly be included in universal search this year, but podcasts would probably have to wait until voice-to-text technology matures further.]

4. 1-800-GOOG-411

Mayer highlighted a recent Google product, 1-800-GOOG-411: a free phone service that you can call to perform a voice search. As usage of the system rises, the growing number of voice samples will be used to improve voice-to-text technology; users are, in effect, training the system to recognize voice commands. She believes these advances can in turn make video search better, by indexing video transcripts to provide search results.
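
As a rough sketch of the transcript-indexing idea, the snippet below builds a tiny inverted index over time-stamped transcript segments so that a text query can land on a moment inside a video; this is purely illustrative, not Google's implementation.

```python
# Sketch of indexing time-stamped video transcripts for text search.
# Purely illustrative; not Google's implementation.
from collections import defaultdict

def build_transcript_index(transcripts):
    """transcripts: {video_id: [(start_seconds, text), ...]}"""
    index = defaultdict(list)  # word -> [(video_id, start_seconds), ...]
    for video_id, segments in transcripts.items():
        for start, text in segments:
            for word in text.lower().split():
                index[word].append((video_id, start))
    return index

def search_videos(index, query):
    """Return (video_id, timestamp) hits for every query word."""
    hits = []
    for word in query.lower().split():
        hits.extend(index.get(word, []))
    return hits

transcripts = {
    "vid42": [(0, "welcome to the cooking show"), (95, "now preheat the oven")],
}
index = build_transcript_index(transcripts)
print(search_videos(index, "cooking"))  # [('vid42', 0)]
```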

5. Universal Search

Mayer referred to Google's recent Universal Search offering - the blending of different types of content, such as images and news, into the main search results. She pointed out that it also works on mobile (where traffic rises in the summer as we all get out more!), and that more types of content will be integrated in the future.

Incidentally, this feature is not brand new; it was rolled out a couple of months ago, and can also be found, to varying degrees, in other search engines.
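
In the abstract, blending amounts to merging ranked lists from several verticals into one; the sketch below interleaves them by a normalized score, which is an assumption of mine for illustration rather than how Google actually ranks blended results.

```python
# Minimal sketch of blending results from several verticals into one ranked list.
# The scoring scheme is an assumption for illustration, not Google's actual ranking.

def blend_results(vertical_results, limit=10):
    """vertical_results: {'web': [(score, item), ...], 'images': [...], 'news': [...]}"""
    blended = []
    for vertical, results in vertical_results.items():
        if not results:
            continue
        top_score = max(score for score, _ in results)
        for score, item in results:
            # Normalize within each vertical so scores are roughly comparable.
            blended.append((score / top_score, vertical, item))
    blended.sort(reverse=True)
    return blended[:limit]

example = {
    "web":    [(0.92, "nytimes.com/dining"), (0.80, "yelp.com/nyc")],
    "images": [(0.70, "img: skyline at dusk")],
    "news":   [(0.88, "news: new restaurant openings")],
}
for score, vertical, item in blend_results(example):
    print(f"{score:.2f}  [{vertical}]  {item}")
```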

6. Maps and Local Search

There are some interesting new advances in this area - for example, Google Maps now supports traffic display, based on data licensed from third parties. These traffic maps are also available on mobile. The relatively new Street View feature provides actual photographs of the street, so users can recognize buildings directly from the pictures.

7. Client Software

Google is making advances in Client Software in two areas: Google Gears and Google Gadgets.

Google Gears provides a browser plug-in that, in Mayer's words, takes Ajax applications and makes them better! The Google Reader application has already been ported to Gears, which not only makes it faster (by eliminating the 300K download otherwise incurred every time you start it), but also enables it to work offline. She added that Gmail may also be ported to Gears in the near future. Gears can also be used by third parties; for example, Remember The Milk, a task-management application, has already incorporated Gears so that you can add items to your list offline and then sync up when you reconnect.
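
The offline-and-sync pattern that Gears enables can be sketched conceptually as follows; this is not the Gears JavaScript API, just a Python illustration of queueing changes locally and replaying them when a connection returns.

```python
# Conceptual sketch of the offline/sync pattern enabled by Gears: queue changes in
# local storage while offline, replay them against the server when back online.
# This is NOT the Gears API (which is JavaScript); it just illustrates the idea.
import json
import sqlite3

class OfflineTaskList:
    def __init__(self, path="tasks.db"):
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS pending (action TEXT)")

    def add_task(self, title):
        # Always write locally first, so the app keeps working with no connection.
        self.db.execute("INSERT INTO pending (action) VALUES (?)",
                        (json.dumps({"op": "add", "title": title}),))
        self.db.commit()

    def sync(self, send_to_server):
        # When connectivity returns, replay queued actions and clear the queue.
        for (action,) in self.db.execute("SELECT action FROM pending"):
            send_to_server(json.loads(action))
        self.db.execute("DELETE FROM pending")
        self.db.commit()

tasks = OfflineTaskList(":memory:")
tasks.add_task("buy milk")           # works offline
tasks.sync(send_to_server=print)     # later, push queued changes upstream
```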

Google Gadgets enables third-party developers to create tiny applications that live on the desktop and connect to the web in the background to pull in information. Google provides a Gadget Maker wizard that makes it easy for anyone to create new gadgets; developers can also continue to use the Gadgets API.
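
In spirit, a gadget is a small self-contained unit that periodically pulls data and renders a sliver of UI; the sketch below is a generic illustration of that pattern, not the actual Gadgets API, and the URL is a hypothetical data source.

```python
# Generic sketch of the gadget pattern: a tiny unit that periodically pulls data
# from the web and renders a small piece of UI. Not the actual Gadgets API.
import time
import urllib.request

def fetch_headline(url):
    # Assume the URL returns a short plain-text headline (illustrative only).
    with urllib.request.urlopen(url, timeout=5) as response:
        return response.read().decode("utf-8").strip()

def run_gadget(url, refresh_seconds=300, cycles=3):
    for _ in range(cycles):
        try:
            print(f"[gadget] {fetch_headline(url)}")
        except OSError:
            print("[gadget] offline - showing cached content")
        time.sleep(refresh_seconds)

# run_gadget("https://example.com/headline.txt")  # hypothetical data source
```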

8. iGoogle

There is a lot of excitement about iGoogle. When users start their day, they're greeted with a home page that displays whatever is important to them: the weather, a to-do list, various modules, customized skins. The custom functionality, of course, is provided by Google Gadgets.

According to Mayer, gadget growth is following a trajectory similar to that of AdSense, which is certainly exciting. The big opportunity with gadgets is the ease and scope of distribution; several hundred gadget developers already draw more than 250K page views per week. Gadgets allow service providers to participate in the user's home page. Could this be a new form of advertising in the future?

As an example, Mayer said that although she's a big fan of Netflix, she probably would not make it her home page; with a gadget, however, Netflix could still establish a presence within her home page. 

In the context of Gadgets, Mayer announced a new pilot program called Google Gadget Ventures, which will provide incentives for developers to create richer, more useful Google Gadgets.

Conclusion

Overall, I was quite impressed! Mayer spoke without any notes, and was able to handle a variety of questions on a multitude of Google-related topics without any hesitation or waffling.

At the end of the session, I had the opportunity to meet her briefly [certainly one of the high points of the conference for me!] and ask her a question about Saul Hansell's recent article in the NY Times, which I've blogged about before: if Google really does constantly make changes to the search engine to tweak the results of individual queries such as "teak patio Palo Alto", how on earth do they maintain the integrity of their architecture? After all, that's only one query among the hundreds of millions they serve every day.

In response, Mayer confirmed that the search quality team at Google does indeed make changes to the Google algorithm very frequently, up to several times a week. She explained that the changes then go through extensive testing by automated programs, as well as validation by a subset of users, before they are released into the mainline. On the "teak patio" example, she clarified that the actual changes are general algorithmic changes, designed to provide more accurate results not just for the specific query in question, but for that general class of queries - which makes sense to me.
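
The release process she described (automated evaluation plus validation on a subset of users before a change reaches the mainline) can be sketched as a simple gate; the metric names and thresholds below are my own assumptions, purely for illustration.

```python
# Sketch of gating a ranking change behind automated checks and a small user trial
# before it reaches the mainline. Metric names and thresholds are illustrative only.

def passes_automated_checks(candidate_metrics, baseline_metrics, tolerance=0.01):
    """Require the candidate not to regress any tracked quality metric."""
    return all(candidate_metrics[m] >= baseline_metrics[m] - tolerance
               for m in baseline_metrics)

def passes_user_trial(ctr_candidate, ctr_baseline, min_lift=0.0):
    """Require the live trial on a subset of users to at least match the baseline."""
    return ctr_candidate - ctr_baseline >= min_lift

def release_decision(candidate_metrics, baseline_metrics, ctr_candidate, ctr_baseline):
    if not passes_automated_checks(candidate_metrics, baseline_metrics):
        return "rejected by automated evaluation"
    if not passes_user_trial(ctr_candidate, ctr_baseline):
        return "rejected after user trial"
    return "released to mainline"

print(release_decision(
    candidate_metrics={"relevance": 0.83, "coverage": 0.91},
    baseline_metrics={"relevance": 0.82, "coverage": 0.92},
    ctr_candidate=0.412, ctr_baseline=0.409,
))  # -> "released to mainline"
```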

This session was the perfect ending to a day spent immersed in Search technology, web analytics and SEO!