Google Says ...

An unofficial, unaffiliated source of comment and opinion on statements from Google, Google employees, and Google representatives. In no way is this site owned by, operated by, or representative of Google, Google's point of view, policies, or statements.


Thursday, August 24, 2006

Matt Cutts to SEOs: It's the content, stupid!

Matt is too nice a guy (or too professional) to be insulting or derisive toward the SEO community in his blog or other public online comments. But he has dealt the self-deluding SEO community a long-needed intellectual bloody nose by showing that one doesn't need to rely upon links to rank well for search expressions in Google.

In his August 21, 2006 blog post offering SEO advice, Matt demonstrated how one can easily rank for a popular expression ("SEO") while explaining how he used a previous blog entry to target a longer expression.

Delivering a one-two punch to the generally ignorant SEO community on the power of content hasn't gone unnoticed. In comments posted as follow-ups to his blog, and on numerous blogs elsewhere, SEOs have been rationalizing how Matt is bending his own rules and suddenly revealing that "content ranks again" (tip: content always ranked, except where it was overwhelmed by massive numbers of links).

The fact that SEOs have trained themselves to rank only by endlessly (and generally needlessly) building links upon links has never meant that Google was scoring only by links, mainly by links, or wantonly by links. In fact, the overemphasis on linkage has long been a self-defeating strategy for many SEOs, who have spent much of the past three years complaining about how difficult it has become to garner high rankings quickly through linkage.

This is the power of momentum in ideology: once nonsense takes on the authority of fact, the fiction outweighs all sensibility and reason. It's highly doubtful that Matt's SEO 101 lesson (develop content that is relevant to a query first and worry about linkage later) will part the waters, but it should at least make a splash that is heard around the world for a brief moment in time.

The enormity of the task is made clearer by the naive comments posted by Greg Boser (aka "Webguerilla") in his Amish Gokarts and Mini Bike Furniture post on August 23, 2006. Within a day, dozens if not hundreds of SEO blogs around the world picked up Boser's inaccurate analysis and hailed it as another marvel of great SEO reverse engineering. Boser's conclusion that a host serving two domains from the same IP address is responsible for a Go Kart site ranking well for "Amish furniture" fails to take into consideration two links that Boser clearly didn't know how to find.

When one of the A-listers of SEOdom fails to pinpoint two or three obvious, easy-to-find links that clearly demonstrate why Google's algorithm would mistake a site for being relevant to "Amish furniture", it's time to ask whether these people are really worth the money good businesses pay them for advice and consulting services.
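For the record, checking a page's inbound links requires no reverse engineering at all. A couple of ordinary queries will surface most of them (example.com is a placeholder here, not the actual site in question):

    link:www.example.com        (Google; shows only a sample of known backlinks)
    linkdomain:example.com      (Yahoo! Search; a far more complete report)

Neither report is exhaustive, but together they turn up far more than guesswork about shared IP addresses ever will.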

Search engine optimization is still very much a smudgy art. Hopefully, search engineers like Matt Cutts will continue to emphasize the importance of looking at all the factors that have been openly documented for at least 8 years.

The SEO world has blathered on about inbound links and PageRank, completely blinding itself to the importance of outbound links and on-page content. Frankly, this lesson has come about five years too late.

Let's hope it doesn't take five more years for the next lesson to rattle the cages and foundations of the myths that SEOs have built their reputations on.

Wednesday, August 23, 2006

Magic healing powders

Wrong blog. If you followed a link here about "magic healing powders", it should have led to this magic healing powders entry, not the page you're now looking at (which I created only after I found the mixed-up URL).

Sorry about that, Chief.

Check out Google Says... for comments on Google and Googler announcements, discussions, and whatever else, and check out The Semi-Official Blog of Michael Martinez for my more diverse thoughts and bloggings.

Google sends Googlebot to obedience school...

Vanessa Fox gives a detailed tutorial on how to work with Googlebot. One of the little gems buried in her list of do's is the following:
What should I do if Googlebot is crawling my site too much?
You can contact us -- we'll work with you to make sure we don't overwhelm your server's bandwidth. We're experimenting with a feature in our webmaster tools for you to provide input on your crawl rate, and have gotten great feedback so far, so we hope to offer it to everyone soon.

While many of you may wonder who would complain about Googlebot coming around too often, anyone with a very active forum or other large dynamic site has probably felt the pain of being deep-crawled by Google and Inktomi.

Deep crawls bring servers to their knees. Your user experience degrades in a matter of minutes or hours. Getting the robots to back off is a painstaking process that may require hours' or days' worth of patience. Once you've gone through the experience, you generally don't want to go through it again. But if you get deep-crawled once, chances are pretty good you'll be deep-crawled again.
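For anyone fighting a deep crawl today, the first line of defense is still robots.txt. Here is a minimal sketch, with the caveat that the Crawl-delay directive is non-standard: Inktomi/Yahoo's Slurp honors it, while Googlebot historically has not (which is exactly why the webmaster tools setting Vanessa describes matters). The /forum/search/ path is a placeholder for whatever your most expensive dynamic URLs happen to be.

    # Ask Inktomi/Yahoo's crawler to pause 10 seconds between fetches
    User-agent: Slurp
    Crawl-delay: 10

    # Googlebot ignores Crawl-delay, so fence off the costliest
    # dynamic URLs instead
    User-agent: Googlebot
    Disallow: /forum/search/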

Google gets my applause for acknowledging the tribulations Webmasters endure from being crawled. It's great to see that they may make a more responsive automated tool available to help slow the pace of deep crawls. But that still doesn't go far enough in my book.

Google really needs to let Webmasters opt in to bandwidth-hogging activity, rather than force us to opt out. While Googlebot is not always a nuisance, Google's Web Accelerator remains on my network's banned IP list (a very short list) because I just cannot afford to let a lot of gung ho Google users draw down my server's resources.
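For what it's worth, banning a service at the Web server level takes only a few lines in Apache. A sketch, using a documentation placeholder range (203.0.113.0/24 is not Google's range; look up the Web Accelerator's actual proxy addresses before borrowing this):

    # .htaccess or httpd.conf (Apache)
    Order Allow,Deny
    Allow from all
    # Placeholder range -- substitute the real proxy addresses
    Deny from 203.0.113.0/24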

You're making progress, Google. Please don't stop here. Bring the process home by giving Webmasters more control over how you utilize their resources.

Google can't find the big SF conventions?

Google will be partying at WorldCon, a small convention in California. If they were seriously interested in hitting the major SF Geek spots, they'd have headed east toward Atlanta for Dragon*Con.

Well, either the Google recruiting budget is a little sparse or else they are just embarking upon the SF fan experience and have a lot to learn about where to find the major Geekfests. Googlers, here's a tip: just Google for...er, search for largest sf convention in America.

Works like a charm, dudes.

Tuesday, August 22, 2006

Google closes off free, public access to scholarly literature

The official Google Blog today makes exploring the scholarly neighborhood sound like a walk in the park. They have improved the service by providing links to related documents.

What they don't tell you, however, is that the search service now only links to front-end citation pages for article archives that charge membership fees or download fees.

Many of these scholarly articles have been freely indexed on the Web as .PDF files and .HTML pages. If they are still there, Google has apparently now helped bury them deep in the "invisible web", that segment of the World Wide Web that surfers mostly cannot reach through search engines.
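The irony is that a plain Web search often still finds the free copies that Google Scholar now hides behind citation pages. For example (the title below is invented for illustration):

    "a survey of hypothetical widget taxonomies" filetype:pdf

Quoting the article's exact title and restricting the results to PDF files will usually pull up any freely indexed copy.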

Since the interface for Google Scholar has just been redesigned, one can suppose that an entirely new crawl is required to repopulate the index. However, relying on fee-based article archives is hardly the best way to bring humanity's knowledge closer to people's fingertips.