Google Blocking Other Search Engine Spiders

from the how-incredibly-not-interesting dept

I’m wondering if this is going to be made into a bigger deal than it really is. Google has put up a robots.txt file (just like many, many other sites out there) telling other search engine spiders to get lost. The article here makes it out to be a big deal, which I don’t think it is. Google says they’re simply trying to protect their server resources from being wasted on spiders, while some think they’re doing it to keep their intellectual property from being harvested. The article gets more interesting when it talks about what other sites have in their robots.txt files. If you don’t know, putting a robots.txt file on your site lets you tell search engine spiders which directories not to look in – which, of course, also tells less-than-honest people exactly which directories they should probably poke around in to find the interesting stuff. Anyway, it turns out that eBay’s robots.txt file begins with “Go Away” and CNN’s says “Robots, scram”.
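For the curious, here’s a quick sketch of how those directives actually behave. The robots.txt lines below are a made-up example (not Google’s actual file), and Python’s standard-library `urllib.robotparser` shows how a well-behaved spider would interpret them:

```python
from urllib import robotparser

# A hypothetical robots.txt that tells every spider to get lost,
# in the spirit of the files discussed in the article.
robots_lines = [
    "User-agent: *",   # applies to all spiders
    "Disallow: /",     # ...and forbids crawling everything
]

rp = robotparser.RobotFileParser()
rp.parse(robots_lines)

# A polite crawler checks before fetching anything:
print(rp.can_fetch("SomeBot", "http://example.com/secret-stuff/"))  # False
```

Note that robots.txt is purely advisory: `can_fetch` only tells a crawler what the site *asks* it not to fetch, which is exactly why listing your sensitive directories in it can backfire.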
