by Mike Masnick
Fri, Nov 30th 2007 8:05am
Following on the speech given earlier this month by the head of the Associated Press, in which he made clear that the AP and other news organizations still think they can be the gatekeepers of news, a bunch of publishers, along with the AP, are now trying to revise robots.txt so that they can hide content more selectively. Now, it is true that robots.txt can be rather broad in its sweep. But it's rather telling that the publishers banded together and are telling search engines what changes are needed, rather than working with the search engines to come up with a reasonable solution. In the meantime, there really are some simple solutions if you don't want content indexed by search engines -- but we've yet to fully understand why publishers are so upset that Google, Yahoo and others are sending them so much traffic in the first place.
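For what it's worth, the existing robots.txt convention already supports more selectivity than the publishers let on. As a rough sketch (the paths here are made up for illustration), a site can exclude specific crawlers from specific directories:

```
# Keep all crawlers out of a hypothetical subscriber-only section
User-agent: *
Disallow: /premium/

# Or single out one crawler and exclude it entirely
User-agent: Googlebot
Disallow: /
```

And for page-level control, a `<meta name="robots" content="noindex">` tag in a page's HTML head tells compliant search engines not to index that individual page -- no new standard required.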