Google Says ...

An unofficial, unaffiliated source of comment and opinion on statements from Google, Google employees, and Google representatives. In no way is this site owned by, operated by, or representative of Google, Google's point of view, policies, or statements.

Location: California, United States

Use your imagination. It's more entertaining.

Friday, December 08, 2006

SEO Discussion Search Tool and Google Woes

I have been creating custom search engines with Google's little Custom Search Engine tool. All that prevents me from turning them out en masse is the amount of time I have to spend evaluating Web sites. I look at hundreds of sites whenever I create one of these search engines.

The latest one is mostly for my own personal benefit, because I'm sick and tired of sifting through listings whose real content I cannot see without registering for memberships. That bugs me. I don't have time to buy my way into every crowded venue that may occasionally generate some useful information.

In fact, I treasure SE Roundtable because they often save me the trouble of having to find interesting forum discussions -- and then they encapsulate the important points pretty well, too.

In that vein, it would be nice if Threadwatch were more like the original site Nick Wilsdon created. Nick didn't always appreciate having me around (or maybe he did in a link baity way) because I tend to disagree with most SEOs' conventional wisdom (usually for good reason), but he had a great resource. Nowadays, Threadwatch is mostly a rant-center, albeit an important one as long as its signal-to-noise ratio remains relatively high.

So, I set up an SEO Discussion Search engine at my SEO Web site (which, for all intents and purposes, is in an imposed stage of dormancy since I am now the Director of Search Strategies for an Internet Marketing firm -- those pesky non-compete contracts keep me a little idle in the evenings).

Anyway, I scoured the Web for interesting SEO blogs and search engines. I wanted SEO communities that were open to public scrutiny, relatively active, and/or extremely useful. By that I mean I deliberately included a few sites that don't get much traffic (or at least not much comment) but which still produce a lot of worthwhile information. Bill Slawski's SEO By the Sea blog is a must-read for anyone who likes to prognosticate about where search technology may take us -- and that is my full-time job, now.

I left out some of the more popular SEO blogs, both because I know someone out there has already indexed them in an SEO blog CSE and because I didn't feel those blogs really contribute much useful information about search engine optimization. For example, Danny Sullivan is extremely popular, but Daggle.com is not loaded with deep insights into search. I did include Searchengineland because it will soon be loaded with search news.

But the purpose of this CSE is not really to help people find the latest search news. Rather, I am constantly searching forums and blogs for specific things I know I have read somewhere at some time; I just cannot remember where or when. By narrowing the index to a handful of sites, I reasoned, I may have a pretty good chance of finding what I am looking for.

And since I am sure other people share that occasional frustration, I decided to expand the list of included blogs and forums to make sure the search engine has pretty solid coverage. I tried not to exclude anyone's site on the basis of personal bias, but frankly some sites are so in-your-face with ads or nonsense or vitriolic ramblings that I just don't see any value in them. So a few very well-known, very popular sites joined the other exclusions solely because I just don't see any value in them.

I don't ever search those sites for anything useful. Take that however you want to.

Nonetheless, in setting up this new search engine, I have noticed some more issues I'd like to see resolved, or at least clarified. For example, if I specify a sub-directory on a large content domain, how much of that domain will actually be included? Google made it easier to include sub-domains in an index, but I'm not sure what is actually being indexed if I just specify part of a domain.
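For what it's worth, the CSE control panel ultimately stores these inclusion choices as URL patterns in an annotations file, which you can also edit and upload by hand. A minimal sketch of including just one sub-directory of a larger domain might look like the following -- the engine label and the URL pattern here are placeholders, not my actual engine:

```xml
<!-- CSE annotations file: restrict the engine to one forum sub-directory.
     The trailing /* pattern is what scopes inclusion to that path;
     "_cse_exampleid" stands in for the engine's real label. -->
<Annotations>
  <Annotation about="www.example-seo-site.com/forums/*">
    <Label name="_cse_exampleid"/>
  </Annotation>
</Annotations>
```

The open question the paragraph above raises is exactly how Google interprets that `/*` pattern against what it has actually crawled on the rest of the domain.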

And while Google also made it possible for people to include subscribed links to their custom search engines, what I would like is the ability to override Google's filtering for the CSE.

You see, one of the very best SEO forums for years has been Spider-Food, launched by J.K. Bowman about six years ago. J.K. is no longer as active in SEO as he once was, but he stays in touch with a core group of old buds (including me). A few years ago, Spider-Food was penalized by Google for using hidden divs -- J.K. was pretty good at both the White Hat and Black Hat stuff before anyone used such nicely worded cliches to distinguish between people who followed the search engine guidelines and people who exploited the algorithmic holes.

So because Spider-Food had thousands of inbound links, J.K. pretty much ignored the Google ban. But when he stopped actively participating in the forum himself, people began dropping off. J.K. always said I had a lot to do with keeping the forums active. I usually publish my most original research there, for example. But in my opinion, J.K. is and will always be the life and heart of Spider-Food. He had a very no-nonsense approach to SEO and he was especially good at the under-the-hood SEO that few people today really appreciate.

Most of J.K.'s advice was sound and ethical, and much of it would still apply today. And for over a year J.K. has been promising to remodel Spider-Food, clean it up, and ask for reinclusion. He's just a bit too much of a perfectionist, I think, as well as dragged down by other demands on his time.

Well, Google has for years indexed the Spider-Food forums, and I've been able to find the threads I needed through Google. Not any more. I don't know if the delisting is permanent or temporary. I'm not sure why it happened, although I know that spammers were hitting the forums pretty hard the past few weeks. J.K. finally took steps to prevent the robots from dropping any more links. But maybe he acted too late.

I don't know.

All I know is that a great resource has been delisted. Since Google just rolled out some sort of update a couple of weeks ago, and since they usually recrawl the Web after they finish an update, I'm hoping to see Spider-Food come back into the index. I know it's penalized, but as long as I can do site searches, that's fine by me.

But people should sit up and take notice, because Matt Cutts did warn the SEO community a few months ago that some very serious changes were on the way. This past week at SES Chicago, Rand Fishkin of SEOmoz was quietly told to clean up the outbound links in his profiles or suffer some consequences. I have no doubt the Googler who conveyed that warning did so out of a legitimate concern to keep a widely valued resource from losing search engine value.

Nonetheless, it comes across as an act of bullying. While I have always maintained that Google has every right to do whatever it pleases with its search engine, Webmasters have a right to do what they please with their Web sites. Before there were search engines there were Web sites and even without search engines there are still Web sites. But as the lines of communication open up between Webmasters and search engines, pre-emptive warnings about impending algorithmic doom only confirm what conspiracy theorists have said for years: Google acts like it owns the Web.

In fact, without the Web there is no Google, and Google cannot honestly reach into every Web property and contact the right person. For example, if they were to try to send me a warning email as they send such emails to some Webmasters, it would never reach me. I've disabled all the traditional admin accounts because of email spam. Many other Webmasters have, too.

I'll know if Xenite.Org hits the skids only after the fact. Shame on me for linking to sites I think are valuable but which Google may not.

Now, in Rand's case, he was apparently told that some really undesirable sites were being linked to through SEOmoz. I don't link to sites like that, so I suppose I'm not in any danger from that kind of algorithmic assessment. And over the past year I've been gradually closing off directories where I don't feel Google needs to be going anyway. Old URLs that no longer exist, except to redirect people following old links, don't really need to be indexed by the search engines. Nonetheless, my redirects have always led me to wonder when or if the axe will fall.
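Closing off retired directories and redirecting old URLs, as described above, typically comes down to a robots.txt rule plus a permanent server-side redirect. A minimal sketch on an Apache server -- the paths and hostname here are hypothetical, not my actual site layout:

```apache
# robots.txt -- ask crawlers to stay out of a retired directory
# (path is hypothetical)
User-agent: *
Disallow: /old-archive/

# .htaccess -- send visitors following old links to the page's new home
# with a permanent (301) redirect, so search engines know the move is final
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The 301 status is the important design choice: it tells search engines the old URL is gone for good, rather than temporarily moved, which is exactly the situation with orphaned pages that exist only to forward old inbound links.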

Frankly, if a Googler were to say to me, "Dude, fix your site or get axed," I'd say, "It's your search engine but my site."

Google only sends me a fraction of the traffic I receive. I, on the other hand, provide Google with a lot of great content. It's more their loss than mine. Besides, I could easily enough build up content on other domains that would tell people where to find my orphaned domain. I've been promoting Web sites far longer than Google has been around.

But what is the happy medium? After all, I see the Google warning to SEOmoz as a friendly intervention intended to benefit the entire Web community. The problem is that not everyone will get such friendly warnings, and the ominous clanging of shuttered windows and doors in various blogs and forums reflects the essentially suspicious attitude of many people toward Google.

Which leads me back to Spider-Food.

I want to include their threads in my custom search engine. I'm not asking Google to lift the ban completely. I just want access to good content -- content that I believe does not violate Google's guidelines. The forums are hosted on a sub-domain.

After years of my openly complaining about how Google has treated sub-domains as if they are independent domains, has Google changed the status quo? Or did the spam robots that hammered an innocent forum get it delisted?

Or am I simply jumping the gun and the forums will be recrawled and reindexed soon?

Well, I have to get some sleep. Thanks for listening to my rant.

And I hope that the CSE I set up does, actually, help other people in the SEO community. Even the folks who didn't make the cut. I was just trying to create a tool with a different value from the one I had already read about.

4 Comments:

Blogger Brad said...

Have you tried doing a side-by-side comparison of Yahoo's custom search engine and Google's? It would be interesting to see which gives the best results given the same seed URLs, particularly if Google will ignore good content like Spider-Food.

I tend to take the directory approach and link to individual threads that I think have merit. Unfortunately, one must have time to do that, but it can be much more effective than a crawler-reliant approach.

4:45 AM  
Blogger Michael Martinez said...

I have been debating whether I should compare all the custom search tools, but that would be time-consuming. For now, Google has the best interface for my preferences (although I don't like having to tie everything into one login).

9:18 AM  
Blogger P. Bench said...

Michael,

Whether you ever get this or not is pretty much a coin-flip I guess, but I have to ask: How did the story end...and what became of your CSE? Does it exist still, and is it available to the public?

Best to you and yours...

Jon Heller

10:08 AM  
Blogger Michael Martinez said...

Jon, here it is 2010 and I just found (and approved) your comment from 2008. Sorry about that. I seldom check the old blogs for comments any more.

As it turned out, Google's CSE technology pretty much sucks and J.K. ended up letting Spider-Food go for lack of time and interest on his part. He is no longer involved in search engine optimization.

I suppose most if not all of the content on Spider-Food would today be fairly useless information, but it still provided a valuable historical context (in my opinion).

I was sorry to lose it.

4:58 PM  
