Stupidity

by Mike Masnick




Publishers Of Certain Belgian Newspapers Continue Effort To Not Be Found Online

from the why-not-stop-publishing-online dept

Here's an idea: why don't the various French- and German-language Belgian newspapers stop publishing their newspapers online? After their ridiculous win against Google for sending traffic their way without first paying them, the group of newspaper publishers is now going after Yahoo for the same thing. There really is an easy solution: Yahoo, Google, and everyone else should simply refrain from linking to these newspapers. If they really want to be left alone, to lose all that traffic, and to lose all that relevance, that's their own decision. In the meantime, how long will it be before someone else comes along, figures this is a cash cow, and starts suing? At this point, perhaps everyone should just sue Google, Yahoo, Microsoft, Ask, and the others for daring to link to them.

Reader Comments

  • bendodge, 19 Jan 2007 @ 9:32pm

    yeah

    If they aren't in the indexes, they might as well not exist.

  • Mousky, 19 Jan 2007 @ 9:57pm

    Have the French- and German-language Belgian newspapers also sued libraries for making copies of their newspapers available? How about cafes that have newspapers lying around? Or how about those workplaces that buy one subscription for an office with 20 or so workers? When will companies learn to embrace, use and exploit technology to their benefit and not their detriment?

    • Stephane, 20 Jan 2007 @ 4:21am

      Re:

      Allowing Google to steal your content works to the company's detriment and to Google's benefit, not the opposite, as you might think.

      A lot of people are getting tired of those little Google ads everywhere, or of search results showing eBay at the top... PageRank algorithm, yeah, sure, but with some "adjustments", of course ;-)

  • Josh, 19 Jan 2007 @ 10:13pm

    it could be worse!!!

    Think of it this way: anyone with a website could sue any of the search sites they want...
    Why, you ask?

    Because they store cached pages, "copies" of the site.

    It's technically illegal.

    Think about that!

  • Josh, 19 Jan 2007 @ 10:25pm

    in fact

    If you want to get technical, you're breaking the law just by viewing a website, because a copy is stored on your computer.

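    On the caching point above: the major engines already offer publishers a per-page opt-out. As a rough sketch (noarchive is a meta-tag value Google and Yahoo both honor for suppressing cached copies; check each engine's documentation before relying on it), a page would include in its <head>:

        <!-- ask crawlers not to keep or display a cached copy of this page -->
        <meta name="robots" content="noarchive">

    The page can still be indexed and linked; the engines just won't serve their stored copy of it.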

  • ipanema, 20 Jan 2007 @ 5:46am

    I think that's the most sensible thing to do. NOT TO PUBLISH ONLINE! Too touchy. Are they carrying the best news anyway? Suing for linking has become a habit. Sad.

  • Starky, 20 Jan 2007 @ 9:55am

    robots.txt

    Is robots.txt not good enough for people anymore???

    • Jess, 21 Jan 2007 @ 6:02am

      Re: robots.txt

      I agree. If a company doesn't want its site, or select pages of it, indexed by a search engine, it's as simple as creating a file (robots.txt) to exclude them. There are a lot of BS lawsuits flying around right now. There should be a law to protect companies and people against frivolous lawsuits!

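      To make that concrete, here is a rough sketch of a robots.txt that excludes only particular crawlers or paths (the /archives/ path is a made-up example; Googlebot and Slurp are the crawler names Google and Yahoo use):

          # keep Google's crawler out of the archives only
          User-agent: Googlebot
          Disallow: /archives/

          # keep Yahoo's crawler out entirely
          User-agent: Slurp
          Disallow: /

      Anything not matched by a rule stays crawlable, so a site can be as selective as it likes.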

    • Dam, 21 Jan 2007 @ 2:43pm

      Re: robots.txt

      Problem is simple: the French and the Germans.

      Need I say more?

  • Howard Bowen, 21 Jan 2007 @ 12:36am

    At this time in history, when lascivious promiscuity is a basis around which to fabricate entertainment formats and truthful fact is waning in journalism, perhaps the courts should clog their dockets with a class-action suit against George Bush as the villain who masterminded Hurricane Katrina.

  • Jeff, 21 Jan 2007 @ 10:39am

    .

    robots.txt

    WTF

    • Anonymous Coward, 21 Jan 2007 @ 11:56pm

      Re: .

      The problem with robots.txt is that, if it does not exist, bots assume they have the right to index and redistribute the copyrighted material.

      Instead, every site that wants to be indexed should have a robots.txt file that grants access to the bots, not the other way around.

      • Chris Maresca, 22 Jan 2007 @ 10:33am

        Re: Re: .

        Er, the web is a PUBLIC forum. By default, everything is accessible to everyone. If you want to stop an indexer like Google or Yahoo, you only need to put one file, with TWO lines in it, in your web root:

        User-agent: *
        Disallow: /

        That's it. If you can't do that, perhaps you should reconsider whether publishing anything on the web is something you should really be engaging in. It's sort of like expecting the inventory in your shop to be safe if you don't put a door on the storefront...

        As for server load, those two lines will stop ANY indexer from looking at ANY of your files.

        Chris

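        For comparison, the opt-in default proposed above can be approximated already: a rough sketch of a "whitelist" robots.txt that admits one named crawler and shuts out everything else (note that Allow is a widely honored extension, not part of the original robots.txt standard):

            # let Google's crawler in...
            User-agent: Googlebot
            Allow: /

            # ...and turn every other bot away
            User-agent: *
            Disallow: /

        Crawlers obey the most specific User-agent group that matches them, so Googlebot follows its own rule and ignores the blanket Disallow.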

  • shimon, 21 Jan 2007 @ 5:52pm

    The problem in Belgium is simple

    Well, Belgium wants to give you good beer and great chocolates, and just wants to be left to do things its own way :)

    Well, if I make a website and I choose for it not to be indexed, because I want it free of the 35-60% of server traffic that search engines generate, and I want it free for personal access, what's Google's problem?

    Do you guys have any idea how much work it takes to keep the damn search robots off your server's traffic?

    It seems the search engines' great service is not free, because someone is paying for the traffic the search bots generate.

    So have a beer and sit back relaxed; if it's good, it must be from Belgium :)

  • Jurgonaut, 22 Jan 2007 @ 5:38am

    German? Or Dutch?

    Mike, shouldn't it be "French- and Dutch-language"?

    I think there are roughly 10 native Belgians who have German as their mother tongue...

  • Chris Maresca, 22 Jan 2007 @ 10:34am

    Replying to individual comments broken

    BTW, in the last Techdirt update, not only did something change so that articles are only 200 px wide, but replying to individual comments also appears to be broken.

    Chris.
