Google sends Googlebot to obedience school...
What should I do if Googlebot is crawling my site too much?
You can contact us -- we'll work with you to make sure we don't overwhelm your server's bandwidth. We're experimenting with a feature in our webmaster tools for you to provide input on your crawl rate, and have gotten great feedback so far, so we hope to offer it to everyone soon.
While many of you may wonder who would complain about Googlebot coming around too often, anyone with a very active forum or other large dynamic site has probably felt the pain of being deep-crawled by Google and Inktomi.
Deep crawls bring servers to their knees. Your user experience degrades in a matter of minutes or hours. Getting the robots to back off is a painstaking process that may require hours or days of patience. Once you've gone through the experience, you generally don't want to go through it again. But if you've been deep-crawled once, chances are pretty good you'll be deep-crawled again.
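For crawlers other than Googlebot, one partial mitigation is the non-standard Crawl-delay directive in robots.txt. It's honored by some bots (Yahoo's Slurp, the successor to the Inktomi crawler, among them) but historically ignored by Googlebot, which is exactly why the crawl-rate control in Google's webmaster tools matters. A sketch, with a hypothetical bot name for the blocking example:

```text
# robots.txt
# Crawl-delay is non-standard: some crawlers (e.g. Yahoo's Slurp)
# honor it; Googlebot does not.
User-agent: Slurp
Crawl-delay: 10

# Shut out an aggressive crawler entirely ("HungryBot" is a
# hypothetical name -- substitute the offender's User-agent).
User-agent: HungryBot
Disallow: /
```

Since compliance is voluntary, this only helps with well-behaved robots; anything that ignores robots.txt has to be blocked at the server or firewall level.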
Google gets my applause for acknowledging the tribulations Webmasters endure when they're crawled. It's great to see that Google may make a more responsive automated tool available to help slow deep crawls. But that still doesn't go quite far enough in my book.
Google really needs to let Webmasters opt in to bandwidth-hogging activity rather than force us to opt out. While Googlebot is not always a nuisance, Google's Web Accelerator remains on my network's banned IP list (a very short list) because I simply cannot afford to let a crowd of gung-ho Google users draw down my server's resources.
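For those who'd rather block at the web server than maintain a firewall IP list, the Web Accelerator's prefetch requests carried an "X-moz: prefetch" request header, which a mod_rewrite rule can refuse. A sketch, assuming Apache with mod_rewrite enabled:

```text
# .htaccess -- return 403 Forbidden for prefetch requests.
# (The Web Accelerator marked its prefetches with an
# "X-moz: prefetch" header; ordinary user clicks pass through.)
RewriteEngine On
RewriteCond %{HTTP:X-moz} =prefetch
RewriteRule .* - [F,L]
```

The advantage over IP banning is that real visitors behind the same proxy can still browse normally; only the speculative prefetch traffic is turned away.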
You're making progress, Google. Please don't stop here. Bring the process home by giving Webmasters more control over how you use their resources.