We at Google… Searchology

Google is holding one of our Searchology events this morning, to discuss “an insider’s perspective on Search including recent innovations.” Speakers will include Vice President of Search Products and User Experience Marissa Mayer. You can actually watch the event yourself on the Google site, which makes the act of liveblogging seem a little redundant. But hey, for folks who don’t want to watch an entire 90-minute presentation, I’ll try to relay the highlights.

10:05am: Gabriel Stricker, director of search communications, takes the stage. Housekeeping: You can send your questions in via the webcast, linked to above. This will be a “State of the Union” on search at Google. The web and its users are becoming more sophisticated. Users’ demands and expectations are increasing. Fundamental task: To deliver the complexity of the web to users in a way that is “elegantly simple and straightforward.”

10:07am: Udi Manber, VP of Core Search, takes the stage. Humanity now in a position to shift attention from controlling nature to understanding people. “It’s more important.” So he calls search “a new rocket science … but it’s a quiet kind of rocket science.”

10:09am: Effective web search is very hard, but you take it for granted. “And that’s the way we like it.” The real goal of search is to solve the users’ problems. “If users can’t spell, it’s our problem. If they don’t know how to form the query, it’s our problem.”

10:11am: Some examples of how you take search for granted, but it is rocket science. Visiting a university website: There’s a link in the search results just to skip the site intro. Also, a search for IRS shows a table of contents for the site. A search for an address or for plumbers will bring locally-relevant information.

10:15am: Manber says search traditionally was limited by technology (storage, etc.); now it’s more focused on understanding people. And, um, now he’s juggling eggs “to highlight beginnings.” It only lasted about a second, though: “I had to do it quickly, otherwise the PR people start breathing.”

10:18am: Patrick Riley, engineer in search quality, says he’s going to give us a “ground level view” of the “Did you mean?” feature. As an example, he describes a search result for “labor” that could mean different things: Department of Labor vs. giving birth. The second, alternate meaning is highlighted in a different set of search results further down the page.

10:20am: Now looking at how to use that idea with alternate spellings. Google developed the “spellmeleon” project to return alternate spelling results, but it was a huge tax on Google’s computing infrastructure. An engineer in the Tokyo office decided it made sense to put those alternate results at the top of the page. It was a controversial decision, because it involved using “prime real estate” in the search results.

10:25am: Search results for “Macy Ray.” Did users want results for someone named Macy Ray, or for the singer Macy Gray? Lots of time went into developing the user interface. On the main results page: “We really care about every pixel on that page.” Settled on an interface that reminded the user both at the beginning and end of the “Did you mean?” section that these aren’t normal results.
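(Aside: Google’s actual spelling system is far more sophisticated than anything Riley could show in ten minutes — it leans on query logs and context, not just string similarity. But the classic starting point for “Did you mean?” is edit distance. A minimal sketch, with a made-up vocabulary list standing in for real query data:)

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution
        prev = curr
    return prev[-1]

def did_you_mean(query: str, vocabulary: list, max_dist: int = 2):
    """Suggest the closest known term, or None if nothing is close."""
    best = min(vocabulary, key=lambda w: edit_distance(query, w))
    return best if edit_distance(query, best) <= max_dist else None

print(did_you_mean("macy ray", ["macy gray", "macy ray fan club"]))  # -> macy gray
```

The interesting part, per Riley, isn’t the distance metric — it’s deciding when a correction is confident enough to earn that prime real estate at the top of the page.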

10:29am: Scott Huffman, Engineering Director of Search, takes the stage to talk about mobile search. “What’s so interesting about mobile, besides that it’s got a mobile screen?” Well, for one thing, mobile search is growing faster than search from PCs. Also, Google needs to support hundreds of mobile devices with radically different capabilities. Also, it’s harder to type searches on a mobile keyboard. Lastly, mobile search is local.

10:31am: Google’s goal is to make mobile search a daily activity. “We’re not quite there today.” Three things need to happen: Mobile search needs to be complete, it needs to be easy, and it needs to be local.

10:32am: Highlighting a few examples of how Google changes the search experience for mobile. Creating a button allowing users to call businesses with just one touch. Allowing users to swipe through image search results on a touchscreen. Summarizing product descriptions, then allowing users to drill in deeper.

10:35am: In other countries, there are more sites that are either primarily accessed via mobile devices, or are mobile-only.

10:37am: Easy: Upcoming feature involves sharing queries between desktop and mobile environment, so that for example if you search for a flight on your computer, you can check it again on your mobile phone when you’re heading to the airport.

10:40am: Marissa Mayer on-stage, going to make some announcements, but first looking at what’s been accomplished. At a dinner, man told her about searching for “how to tie a bowtie” and getting helpful videos and diagrams in the results, rather than incomprehensible text descriptions. Illustrates the benefits of universal search, which was launched in 2007.

10:43am: In the past two years, the amount of rich media included in the results has proliferated. Universal search runs in 175 countries and triggers in one in four search results.

10:44am: Using “the bento box” as an illustration of delivering lots of media and information in a compact form.

10:45am: Discussing SearchWiki, allowing users to edit and annotate the search results. “It’s been a really big success for us.” Search improvements today are built around more interactivity and more rich media.

10:47am: Problems: Finding the most recent information. Finding just a particular type of search result. Knowing which results are best. Knowing what users are looking for. Moving beyond keyword-driven search.

10:49am: Launching a product called Google Search Options. Going live today. Example: Search for Hubble Space Telescope. You can bring up a Search Options panel that allows you to “slice and dice” results. You can view results by genre, most recent, images, timeline, and wonder wheel.

10:51am: You can also combine different options — limiting search to results from the past week, then drilling down further by seeing images in those results.

10:53am: Another example: Searching for solar ovens. If you just want to see videos, you can click on videos without switching sites or contexts. Then you could switch to just seeing forum posts.

10:54am: In the reviews section, Google uses “sentiment analysis” to determine whether a review is positive or negative, then tries to show a snippet that captures that sentiment. It’s a different snippet than what you’d see in a normal search result.
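(Mayer didn’t go into how the sentiment analysis works. The simplest version of the idea — a toy, not Google’s method — is a lexicon of positive and negative words, with the snippet chosen as the sentence that carries the strongest sentiment either way:)

```python
# Toy lexicon; a real system would learn weights from labeled review data.
POSITIVE = {"great", "delicious", "friendly", "excellent", "love"}
NEGATIVE = {"terrible", "slow", "rude", "bland", "awful"}

def sentence_sentiment(sentence: str) -> int:
    """Positive-word count minus negative-word count."""
    words = [w.strip(".,!?") for w in sentence.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def pick_snippet(review: str) -> str:
    """Return the sentence with the strongest sentiment, positive or negative."""
    sentences = [s.strip() for s in review.split(".") if s.strip()]
    return max(sentences, key=lambda s: abs(sentence_sentiment(s)))

review = ("The decor is ordinary. The food was delicious and the staff friendly. "
          "Parking was hard.")
print(pick_snippet(review))  # -> The food was delicious and the staff friendly
```

That matches what was shown on stage: the snippet is picked for sentiment, not for keyword overlap with the query, which is why it differs from a normal search snippet.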

10:56am: So what is the wonder wheel? It’s a visualization for exploring your search results. In the center, you see your query, then there are related topics clustered around it. The search results, meanwhile, are shown in a column on the right side of the page. As you click on related topics, you can jump between different wonder wheels.

10:58am: Search Options is also a convenient way to introduce new features to search, because they can be added into the existing interface, rather than creating a whole new section.

10:59am: Next project is called Google Squared, which is coming later this month in Google Labs. Example query: Small dogs. Automatically builds a “square” of information — basically a spreadsheet drawn from Google search results. Finds meaningful facts around names, pictures, descriptions, etc.

11:01am: You can add new rows to get information about a specific type, say adding a specific breed of small dog, or new columns to get new categories of information. You can also choose alternate values from the search results, if the given information doesn’t seem correct.

11:04am: What are some of the challenges in Google Squared? For example, in a search for “vegetables,” it returns information about squash the sport, rather than squash the vegetable. (Incidentally, I noticed that Mayer is keeping her famous laugh under control during this speech, but a few giggles escaped here.) You can also save a square for future use.

11:05am: Next product: Rich snippets. The rich snippet shows extra metadata beyond just a text excerpt, such as the number of user reviews in restaurant search results.

11:06am: Another example: Searching for an electronic device, the snippet also shows when the review was conducted. A third example: Searching for a person, you can get metadata like their location, which helps you figure out which person is the one you were looking for.

11:10am: How does this work? Google will now support two different open standards for annotating a page to show meaning: RDFa and microformats. “This is a step towards making the whole Internet smarter.” For example, you could use these tags to render information differently on a mobile phone.
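(For the curious: microformats are just conventional class names in ordinary HTML, so a crawler can pull structured fields out of a page. Here’s a rough sketch of the consuming side, using Python’s built-in HTML parser and class names in the style of the hReview microformat — the restaurant name and values are my own invented example:)

```python
from html.parser import HTMLParser

# Markup in the style of the hReview microformat: plain HTML classes
# carry machine-readable meaning alongside the human-readable text.
SNIPPET = """
<div class="hreview">
  <span class="item">Drooling Dog BBQ</span>
  <span class="rating">4.5</span>
  <span class="reviewer">A. Diner</span>
</div>
"""

class MicroformatParser(HTMLParser):
    """Collect the text inside elements whose class is a known property."""
    PROPS = {"item", "rating", "reviewer"}

    def __init__(self):
        super().__init__()
        self.current = None  # property we are currently inside, if any
        self.data = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls in self.PROPS:
            self.current = cls

    def handle_data(self, data):
        if self.current and data.strip():
            self.data[self.current] = data.strip()
            self.current = None

parser = MicroformatParser()
parser.feed(SNIPPET)
print(parser.data)  # {'item': 'Drooling Dog BBQ', 'rating': '4.5', 'reviewer': 'A. Diner'}
```

The point Mayer was making: publishers mark up pages once, and any consumer — Google’s rich snippets, a mobile renderer, anything — can reuse the same annotations.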

11:11am: Last demo, “which is about the stars.” Huh?

11:14am: Showing off an Android app for viewing the stars. Pan, zoom in, zoom out. But why is this better than a paper star map? It uses GPS to produce a star map specific to your location on Earth, showing the stars you would actually see.

11:17am: Android also knows which direction you’re looking, and as you turn, the map changes with the phone. “Can you do that with a piece of paper?” You can also search for a specific star, which delivers an arrow, pointing to where your star is in the sky.

11:20am: Mayer: Google has long joked about locating physical objects. Now with Android, it’s starting to do that (although searching for stars isn’t the most practically useful thing to find).

11:21am: Question and answer session. Q: As Google becomes more semantic, will Google start selling semantic keywords? Mayer: No plans to change how Google sells keywords, and she also resists the idea that Google is becoming more semantic.

11:23am: Q: What can you say about Google’s international support? And are search engines only useful on the web? Manber: Google is committed to international support, though that doesn’t mean it can release all products in all countries. Mayer: Google has focused on online search, and is now trying to bring offline information online.

11:25am: Q: Can you talk about meaning extraction for some of the new features? Manber: No. Mayer: “I think we can open the kimono a little bit.” It’s “totally amazing.” Google Squared “really, really blew me away.” Basically, it looks for structures on the web that seem to imply facts, then corroborates those facts by seeing if those structures repeat across pages.
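(Mayer’s description — find candidate facts, keep the ones that repeat across independent pages — amounts to voting. A bare-bones sketch of the corroboration step, with entirely hypothetical extractions standing in for whatever Squared actually pulls from tables and lists:)

```python
from collections import Counter

# Hypothetical per-page extractions: (entity, attribute, value) triples.
# Real extraction would parse page structures; here we only corroborate.
observations = [
    ("chihuahua", "origin", "Mexico"),
    ("chihuahua", "origin", "Mexico"),
    ("chihuahua", "origin", "Texas"),   # an outlier page
    ("pug", "origin", "China"),
]

def corroborate(obs, min_support=2):
    """Keep a fact only if enough independent pages agree on its value."""
    votes = {}
    for entity, attr, value in obs:
        votes.setdefault((entity, attr), Counter())[value] += 1
    facts = {}
    for key, counter in votes.items():
        value, count = counter.most_common(1)[0]
        if count >= min_support:
            facts[key] = value
    return facts

print(corroborate(observations))  # {('chihuahua', 'origin'): 'Mexico'}
```

This also explains the “squash the sport vs. squash the vegetable” failure Mayer joked about earlier: when the wrong structures repeat consistently, the voting happily corroborates the wrong fact.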

11:28am: Q: Stephen Wolfram has expressed a lot of dissatisfaction with the quality of the database information used to build Wolfram Alpha. Does Google feel there is enough satisfactory information in databases? Manber: You have to corroborate the information you find. Sergey Brin and Manber did see an early demo of Wolfram Alpha, by the way. Mayer: Google is optimistic about information on the web, and about how quickly information gets corrected.

11:31am: Q: Are there risks of copyright infringement with Squared, since it doesn’t point to the web pages where it’s pulling information? Mayer: Well, Squared may provide information in a more useful way than the original website, but Google will still cite its sources.

11:32am: Q: What is exposed via APIs? Especially rich snippets. Mayer: Yes, it’s an open API.

11:33am: Q: How important is it to search within videos? Mayer: Voice is much further along than video. “But that’s just my personal opinion.” Video search is very important, though. Manber: We don’t have to make a distinction between specific areas in terms of importance. “They’re all important.”

11:36am: Q: When will Google Squared be available? Mayer: Later this month.

11:37am: Q: Include all sites with rich snippets? A: Limited amount of sites initially, and sites can sign up to be indexed as well. Q: Is there any way to opt out? A: Since it’s information that’s added to the HTML, any site can choose not to provide that information. But no guarantees, since Google will be using its own algorithms too.

11:39am: Q: How much closer do today’s announcements bring Google to becoming a perfect search engine? Mayer: Search is still in its infancy, and today’s announcements just reinforced that for her. Search is a “90-10” problem (a variation on the more common “80-20” description), so the last 10 percent that’s unsolved will require 90 percent of the work. Huffman: Mobile search is even further away from completion. Manber: Every five years, what seemed like science fiction becomes reality. The head of the US Patent Office in 1860 said: “Anything that can be invented has been invented.” Which was, um, wrong.

11:43am: Q: How have law enforcement agencies changed their attitudes towards Google’s data? Mayer: We’re very sensitive to users’ needs in this area. We want to provide the best possible service, but want users to offer information in areas where they think it’s worth it. Can’t specifically comment on legal cases.

11:44am: Q: How will desktop queries be linked to a mobile phone? A: The linkage is only there if you’re signed in to your Google account. When you delete things from your desktop browser, they will be deleted in your mobile browser, and vice versa.

~ by Travis on 05/12/2009.
