by Mike Masnick
Fri, Nov 30th 2007 8:05am
Following on the speech given earlier this month by the head of the Associated Press, where it was made clear that the AP and other news organizations still think they can be gatekeepers of news, a group of publishers along with the AP are now trying to revise robots.txt so they can control which content gets indexed at a more granular level. It's true that robots.txt can be rather broad in its sweep. But it's telling that the publishers banded together and are dictating to search engines what changes are needed, rather than working with the search engines to come up with a reasonable solution. In the meantime, there are already some simple solutions if you don't want content indexed by search engines -- though we've yet to fully understand why publishers are so upset that Google, Yahoo and others are sending them so much traffic in the first place.
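For what it's worth, the existing robots.txt standard already supports per-crawler, per-path exclusion. A minimal sketch (the directory names here are hypothetical, just for illustration):

```
# Keep all crawlers out of a premium-content directory
User-agent: *
Disallow: /premium/

# Block only Googlebot from the news archives
User-agent: Googlebot
Disallow: /archives/
```

And for page-by-page control, a publisher can drop a `<meta name="robots" content="noindex">` tag into any individual article -- no new standard required.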