Google Says ...

An unofficial, unaffiliated source of comment and opinion on statements from Google, Google employees, and Google representatives. In no way is this site owned by, operated by, or representative of Google, Google's point of view, policies, or statements.


Friday, August 18, 2006

How big is your sitemap?

Although some people continue to report losing rankings immediately after uploading a sitemap to Google, it appears that Google Sitemaps is picking up steam and rolling along.

Googler Amanda points out that if you have a lot of content that needs to be crawled, you can include up to 50,000 URLs in a single sitemap, as long as the file is no larger than 10 megabytes. If you have more than 50,000 URLs, don't panic: she mentions that you can upload multiple sitemaps and tie them together with a sitemap index file.
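Out of curiosity, I sketched what that looks like in practice. This is just an illustration in Python, with a made-up example.com site and file names; the 0.84 schema namespace is the one Google's Sitemaps documentation mentions as of this writing, so check their docs for whatever they currently expect.

    # Minimal sketch (hypothetical file names and site) of splitting a
    # long URL list into sitemaps of at most 50,000 URLs each, plus the
    # index file that ties them together.
    import xml.sax.saxutils

    MAX_URLS = 50000                 # per-sitemap URL limit
    BASE = "http://www.example.com"  # hypothetical site root

    def write_sitemaps(urls):
        names = []
        for i in range(0, len(urls), MAX_URLS):
            name = "sitemap-%d.xml" % (len(names) + 1)
            with open(name, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
                for url in urls[i:i + MAX_URLS]:
                    # Escape &, <, > so one stray character can't break the XML.
                    f.write("  <url><loc>%s</loc></url>\n"
                            % xml.sax.saxutils.escape(url))
                f.write("</urlset>\n")
            names.append(name)

        # The index lists every sitemap; you submit the index instead
        # of each individual map.
        with open("sitemap-index.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
            for name in names:
                f.write("  <sitemap><loc>%s/%s</loc></sitemap>\n" % (BASE, name))
            f.write("</sitemapindex>\n")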

Cool.

It would be interesting to see Google respond to the various reports of lost rankings somewhere outside the favored SEO forums. They need to make an authoritative, definitive statement about the issue (even if they end up correcting themselves later on), because many people claim to have yanked their sitemaps from the Google system over the past few months in order to restore their rankings.

Technically, one incident proves nothing. That is, if you need to prove cause-and-effect, you should (see the logging sketch after the list):

1) document your rankings (multiple queries are better than one) with screen captures

2) upload a sitemap

3) document your lost rankings with screen captures

4) remove the sitemap

5) document your restored rankings with screen captures

6) upload the sitemap again

7) document your once again lost rankings with screen captures

8) remove the sitemap

9) document your once again restored rankings with screen captures
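If you want the record-keeping to be airtight, a timestamped log next to the screen captures helps. Here's a minimal sketch in Python; the file names and the record() helper are made up, just one way of keeping a machine-readable trail of what you observed and when.

    # Log each rank observation (taken by hand, alongside its screen
    # capture) to a CSV file. All names here are hypothetical.
    import csv
    import datetime

    LOG = "rank-log.csv"

    def record(query, rank, phase, screenshot):
        """phase is e.g. 'baseline', 'sitemap-up', 'sitemap-removed'."""
        with open(LOG, "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.datetime.now().isoformat(),
                query, rank, phase, screenshot,
            ])

    # Example: position 3 for "blue widgets" before uploading the sitemap.
    record("blue widgets", 3, "baseline", "shots/baseline-blue-widgets.png")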

If you can present that kind of evidence, you will have gone a long way toward showing cause-and-effect. But you're not done yet.

It is possible there is an error in your sitemap. What sort of error? I don't know. It could be an undocumented edge case that Google's engineers haven't taken into consideration. In software, it usually takes only one misplaced or unexpected character -- a single byte of data -- to reduce six months or two years' worth of work to stuttering gibberish. Such anomalies are common and usually easy to fix, but they only get fixed when you get defensive programmers to stop being defensive and look at your clear-cut evidence.
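You can at least rule out the crude version of that yourself. Here's a minimal well-formedness check in Python (the file name is hypothetical); passing it is a necessary condition, not proof that Google's crawler will accept the file.

    # Catch the "one misplaced character" class of error before Google
    # does: verify the file parses as XML and stays inside the limits.
    import os
    import xml.etree.ElementTree as ET

    SITEMAP = "sitemap.xml"        # hypothetical file name
    MAX_BYTES = 10 * 1024 * 1024   # 10 MB limit
    MAX_URLS = 50000

    size = os.path.getsize(SITEMAP)
    if size > MAX_BYTES:
        print("too big: %d bytes" % size)

    try:
        tree = ET.parse(SITEMAP)   # raises ParseError on a single bad byte
    except ET.ParseError as e:
        print("not well-formed XML: %s" % e)
    else:
        # Count <loc> entries regardless of namespace.
        locs = [el for el in tree.iter()
                if el.tag == "loc" or el.tag.endswith("}loc")]
        if len(locs) > MAX_URLS:
            print("too many URLs: %d" % len(locs))
        else:
            print("looks sane: %d URLs, %d bytes" % (len(locs), size))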

Another possible culprit is your robots.txt file. Google does, after all, use several different crawler programs, and it may be that one of them chokes on something in your robots.txt.
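One cheap sanity check: run your robots.txt through a standard parser against several of Google's crawler names and see whether the answers agree. This sketch uses Python's standard-library parser with a hypothetical site. A clean result doesn't mean Google's own parsers read the file the same way -- that's exactly the worry above -- but a disagreement between agents is a red flag.

    # Ask the standard-library robots.txt parser whether various Google
    # user agents may fetch a given URL. Site URL is hypothetical.
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()

    # A few user-agent names Google has used; check their docs for the
    # current list.
    for agent in ("Googlebot", "Googlebot-Image", "Mediapartners-Google"):
        print(agent, rp.can_fetch(agent, "http://www.example.com/sitemap.xml"))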

I recently learned that because I was blocking certain Google IP addresses (ones used by their Web Accelerator tool), I was unable to validate my site in Webmaster Central (what used to be just Sitemaps). If you are cloaking, you may run into a similar issue; uploading a sitemap for a cloaked site may not be the smartest thing to do. That's just a suggestion, and one I am not in a position to test.
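If you suspect user-agent cloaking somewhere on your site (yours, or something your host does on your behalf), here is a rough self-check: fetch the same page as a browser and as Googlebot, then compare. The URL is hypothetical, and this catches only cloaking keyed on the User-Agent header; IP-based behavior, like my own IP blocking, won't show up this way.

    # Fetch one page twice with different User-Agent headers and
    # compare the responses. URL is hypothetical.
    import urllib.request

    URL = "http://www.example.com/"

    def fetch(user_agent):
        req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    browser = fetch("Mozilla/5.0")
    bot = fetch("Googlebot/2.1 (+http://www.google.com/bot.html)")
    print("identical" if browser == bot else
          "responses differ: %d vs %d bytes" % (len(browser), len(bot)))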

But these are a few things that people with lost rankings (rankings they believe they lost because they submitted to Google Sitemaps) can look at: the size of the map, the number of URLs, the construction/syntax of the map, the construction/syntax of robots.txt, and whether the site is cloaked.
