Getting down to Google Base icks...
So, in Why is the location under my ad?, they explain that if you target your ads by location, then users who are identified by the Google system as coming from that area will be told the ad is relevant to their community. I like that. Can't imagine why anyone would complain, but sometimes business people have a very different set of expectations from mine.
In their Get your products into our search results with Google Base post, they share the following advice:
Your site may already be included in our crawl index, but we want to ensure that you also know how you can supplement these results with Google Base - you can submit the products or services that you offer directly to Google Base making them eligible to show on Google.com when a user searches on a relevant query.
What the heck? Seeing as I'm now promoting my SEO Consulting services on a full-time basis, I thought I'd give the system a try.
Unfortunately, the user interface burped. I created a few attributes for my ad, picking from their list of suggested attributes. When I clicked on PUBLISH, the system came back and said there was a problem: it lost the labels for two of the attributes and combined their data into one unnamed field.
I didn't feel like trying again, so I edited the surviving data, put in a new label, and clicked on PUBLISH again. This time the ad went through safely and I'm good to go for 30 days.
I appreciate the Googlers' giving me advice on how to promote my consulting services, but as a programmer with many years' experience I couldn't help but cringe when I saw the bug. I hate it when I find bugs in my own software after it's been deployed. That's just one of the risks programmers face, but it's still annoying.
Browsing further through the blog, I noticed their Printable coupons for local businesses post. OH..MY..GOD.
Why hasn't anyone in the SEO community made a big fuss over this feature? I know some people who need to take advantage of this service.
Hm. I wonder if I can do that....
Anyway, the AdWords blog has turned out to be very useful and interesting to me in just a few minutes' time. That ain't bad for clicking on a previously unvisited link.
Well, in other useful Google blogging, Vanessa Fox shares details about when Googlebot last visited a page. She says that the Google cache will now reflect when Googlebot last sought information about a page, rather than when the page was actually downloaded.
Um...that's not very helpful to me. I can see how some Webmasters may be pleased with knowing that Googlebot stopped by on September 1, but it won't explain to them that they are looking at a page copy from April 26. Let me explain why this can be a problem.
Googlebot comes by on April 25, fetches my page, and then I update it on June 12. Googlebot dutifully grabs the page on June 12 and then my server crashes. I restore from a backup made on June 11 and my server will think the page hasn't changed when Googlebot comes back on June 15.
Now, ideally, I want my June 12 version of the page. But for reasons beyond my control I cannot reproduce that page until, say, August 15. If Google dutifully indexes and caches the page in a matter of days, they are out of sync with my restored Web page.
Does this scenario happen? Well, server crashes happen all the time. It's anyone's guess as to how backups are restored and how the servers figure out whether to send a code 304 (Not Modified) or not. But it's a hole in the methodology and the blog doesn't address it to allay my fears and concerns.
I can also tweak my server and break its ability to send a code 304 at the right time. What if I accidentally configure my server to send a code 304 every time? Now Google's cache is telling me it visited the page on August 18, but I'm still seeing the restored June 11 backup. What's up with that? After August 15, I should be seeing my August 15 update, but because of the misconfiguration, Google reports an August 18 visit while its cache still holds the pre-August copy (in truth, Googlebot grabbed nothing new, but the date gives me no way to know that).
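Both scenarios come down to how the server answers a conditional GET. As a minimal sketch (the function and dates are illustrative, not Google's actual logic), a typical server compares the crawler's If-Modified-Since header against the file's Last-Modified timestamp — and a restored backup carries an old timestamp, so the server keeps answering 304 even though the live content has silently reverted:

```python
from email.utils import parsedate_to_datetime

def respond(if_modified_since: str, last_modified: str) -> int:
    """Hypothetical conditional-GET check: answer 304 (Not Modified)
    if the file's Last-Modified date is not newer than the crawler's
    If-Modified-Since header, otherwise 200 (send the full page)."""
    ims = parsedate_to_datetime(if_modified_since)
    lm = parsedate_to_datetime(last_modified)
    return 304 if lm <= ims else 200

# Googlebot last downloaded the page on June 12.
crawler_header = "Mon, 12 Jun 2006 00:00:00 GMT"

# After the crash, the restored file carries the June 11 timestamp,
# so the server says "nothing changed" and the crawler never sees
# that the content actually reverted.
print(respond(crawler_header, "Sun, 11 Jun 2006 00:00:00 GMT"))  # 304

# Only a file stamped after June 12 triggers a fresh download.
print(respond(crawler_header, "Tue, 15 Aug 2006 00:00:00 GMT"))  # 200
```

A misconfigured server that returns 304 unconditionally is the same failure in a worse form: the timestamps never even get compared, so no update, however recent, ever reaches the crawler.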
My point is that most Webmasters don't read the Google blogs and they are not going to understand what they are seeing with these dates.
In my opinion, the reported date needs to be the date the file was pulled. If Google really feels anyone needs to see when Googlebot last dropped by, the ideal thing to do is show both dates.
Sorry, Vanessa, but this latest improvement is an "ick" in my book.