Google Says ...

An unofficial, unaffiliated source of comment and opinion on statements from Google, Google employees, and Google representatives. In no way is this site owned by, operated by, or representative of Google, Google's point of view, policies, or statements.


Friday, August 18, 2006

How big is your sitemap?

Although some people continue to report losing rankings immediately after uploading a sitemap to Google, it appears that Google Sitemaps is picking up steam and rolling along.

Googler Amanda points out that if you have a lot of content that needs to be crawled, you can include up to 50,000 URLs in a single sitemap, as long as the file is no larger than 10 megabytes. If you have more than 50,000 URLs, don't panic: she mentions that you can upload multiple sitemaps and create an index file that points to them.

Cool.
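For illustration only, here is a rough Python sketch of how you might split a large URL list into multiple sitemap files and generate an index that points at them. The file names, example.com URLs, and schema namespace are my own assumptions, not anything Google has spelled out in that post.

```python
# Minimal sketch: split a big URL list into sitemap files of at most 50,000
# URLs each (Google's stated per-file limit), then write a sitemap index
# that lists them. Names and URLs here are made up for illustration.
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50000

def write_sitemaps(urls, base="https://www.example.com/"):
    chunks = [urls[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)]
    names = []
    for n, chunk in enumerate(chunks, start=1):
        name = f"sitemap{n}.xml"
        names.append(name)
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    # The index file simply lists the public URL of each sitemap file.
    with open("sitemap_index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in names:
            f.write(f"  <sitemap><loc>{escape(base + name)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```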

It would be interesting to see Google respond to the various reports of lost rankings somewhere outside the favored SEO forums. They need to make an authoritative, definitive statement about the issue (even if they end up correcting themselves later on), because many people claim to have yanked their sitemaps from the Google system over the past few months in order to restore their rankings.

Technically, one incident proves nothing. That is, if you need to prove cause-and-effect, you should:

1) document your rankings (multiple queries is good) with screen captures

2) upload a sitemap

3) document your lost rankings with screen captures

4) remove the sitemap

5) document your restored rankings with screen captures

6) upload the sitemap again

7) document your once again lost rankings with screen captures

8) remove the sitemap

9) document your once again restored rankings with screen captures

If you can present that kind of evidence, you will have gone a long way toward showing cause-and-effect. But you're not done yet.

It is possible there is an error in your sitemap. What sort of error? I don't know. It could be an undocumented aspect that Google's engineers haven't taken into consideration. In software, it usually takes only one misplaced or unexpected character -- a single byte of data -- to reduce six months or two years' worth of work to stuttering gibberish. Such anomalies are common and usually easy to fix, but they only get fixed when you get defensive programmers to stop being defensive and look at your clear-cut evidence.

Another possible reason might have something to do with your robots.txt file. Google does, after all, use several different crawler programs. It may be that one of them chokes on certain things in robots.txt.
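If you want a quick sanity check along those lines, here is a small sketch using Python's standard robots.txt parser to ask whether several Google crawler user-agents would be blocked from a few URLs. The user-agent strings and example URLs are my assumptions for illustration; this is not a Google diagnostic tool.

```python
# Parse your own robots.txt and check whether various Google crawler
# user-agents would be allowed to fetch a couple of URLs on the site.
import urllib.robotparser

SITE = "https://www.example.com"
CRAWLERS = ["Googlebot", "Googlebot-Image", "Mediapartners-Google"]

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

for agent in CRAWLERS:
    for path in ["/", "/sitemap.xml"]:
        allowed = rp.can_fetch(agent, SITE + path)
        print(f"{agent} -> {path}: {'allowed' if allowed else 'blocked'}")
```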

I recently learned that, because I was blocking certain Google IP addresses (for their Web Accelerator tool), I was unable to validate in Webmaster Central (what used to be Sitemaps). If you are cloaking, you may have a similar issue. Uploading a sitemap for a cloaked site may not be the smartest thing to do. It's just a suggestion, and one I am not in a position to test.

But these are a few things people with lost rankings (rankings they feel they lost because they submitted to Google Sitemaps) can look at: the size of the map, the number of URLs, the construction/syntax of the map, the construction/syntax of the robots.txt, and whether the site is cloaked.

Tuesday, August 15, 2006

Google Base solves a massive, common problem in two blog posts

One of the many common questions I see asked in SEO forums and online marketing groups boils down to: "How do I get Google to index thousands of product listings?"

The Google Base Blog has just solved that problem in two fell swoops.

On August 9, Sundar Subbarayan described in "Taking care of custom needs" how they took two feeds from a retailer with thousands of products and multiple store locations and generated a massive online catalogue for that retailer. The product feed contained lists of store IDs indicating where each product is available. Essentially, Google Base facilitated a classic table join (I'm talking about database tables, not HTML tables) to produce a virtual table.
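To make the "table join" idea concrete, here is a toy Python sketch: one feed of products (each carrying the store IDs that stock it) and one feed of store locations, combined into a flat per-store listing. The field names and values are hypothetical, not the actual Google Base feed schema.

```python
# Toy illustration of joining a product feed to a store-location feed
# on store ID, producing one row per (product, store) pair.
products = [
    {"id": "sku-1001", "title": "Garden hose", "store_ids": ["s1", "s3"]},
    {"id": "sku-1002", "title": "Lawn mower", "store_ids": ["s2"]},
]
stores = {
    "s1": {"city": "Sacramento", "zip": "95814"},
    "s2": {"city": "Fresno", "zip": "93721"},
    "s3": {"city": "San Jose", "zip": "95113"},
}

# The "virtual table": a flat row for every place each product is available.
joined = [
    {"product_id": p["id"], "title": p["title"], **stores[sid]}
    for p in products
    for sid in p["store_ids"]
]

for row in joined:
    print(row)
```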

Yesterday (August 14), Clint Guerrero mentioned in "Behind the Scenes: Life on the Google Base Support Team" that the "more attributes we have, the better we can match search queries to your content. Attributes also enable us to include your items in refinement searches."

Jennifer Hyman explained in greater detail how attributes can help in her August 8 post, "Putting your attributes to work".

As with all great ideas and services, the potential for abuse exists, and I certainly hope people don't misuse Google Base's attributes. But clearly, if you need to manage a large online inventory listing, Google has provided some very useful technology. Google not only provides an extensive list of attributes to use in Google Base; you can also define your own.
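Purely as a hedged sketch of the concept, here is what a single listing with standard attributes plus a self-defined one might look like before being flattened into a feed. These attribute names are illustrative guesses on my part, not Google Base's published schema.

```python
# One listing with common attributes plus a custom, self-defined attribute
# that could support refinement searches. Names/values are hypothetical.
item = {
    "title": "Stainless garden trowel",
    "price": "12.99 USD",
    "condition": "new",
    "product_type": "Garden Tools",
    # A custom attribute defined by the merchant:
    "blade_material": "stainless steel",
}

# One way to flatten it into a tab-separated feed row (again, only a sketch).
print("\t".join(item.keys()))
print("\t".join(item.values()))
```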

By branding your own queries through advertising, publicity, and other channels, you can direct targeted traffic to your Google Base-listed products and services. These tips directly address a very common need, and they should help cut through many misconceptions about what it takes to get indexed and become visible to Google searchers.