by Mike Masnick

Filed Under:
france, google suggest, liability


French Courts Once Again Confused About Google Suggest; Blame Google For Suggested Searches

from the gotta-understand-the-algorithm dept

Not again. Earlier this year, we wrote about a couple of lawsuits in France involving companies that wanted to blame Google for the results that pop up via Google Suggest (the feature that simply looks at what you're typing, and suggests the most popular searches based on what you've typed). Tragically, we noted that the courts wanted to find Google liable for these suggestions, and it looks like yet another French court has made that same mistake. This is, quite clearly, blaming Google for what its users are doing. The algorithm for Google Suggest is just taking the aggregate results of what people are actually searching on, and using that to make suggestions. There's no editorial control by Google, and yet this court not only found Google liable for those suggestions, but ordered the removal of all the "offending" query suggestions. In this case, it involved a guy who had been convicted for "corruption of a minor," who got upset "because the plaintiff's name was returned in response to queries on search terms such as rape, satan worshipper, and other things." At some point, you hope that courts and politicians will understand the basics of how technology works, but it seems like we may be waiting a long, long time.
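As a rough sketch of what that aggregation looks like (a hypothetical query log with illustrative names, not Google's actual system), a popularity-based suggester just counts queries and returns the most common ones matching what you've typed so far:

```python
from collections import Counter

def build_suggester(query_log, k=3):
    """Aggregate a log of raw user queries and return a suggester that,
    for a typed prefix, proposes the k most-searched matching queries."""
    counts = Counter(query_log)

    def suggest(prefix):
        matches = [q for q in counts if q.startswith(prefix)]
        # Rank purely by popularity: no editorial decision involved.
        return sorted(matches, key=lambda q: -counts[q])[:k]

    return suggest

# Hypothetical log: the suggestions simply mirror what users searched for.
log = ["weather paris", "weather paris tomorrow", "weather paris",
       "web design", "weather london"]
suggest = build_suggester(log)
print(suggest("weather"))  # "weather paris" ranks first: it was searched most
```

The point of the sketch is that the output is determined entirely by what users fed in; change the log and the suggestions change with it.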

Reader Comments


  1. PRMan, 24 Sep 2010 @ 4:03pm

    Funny...

    They're so inaccurate that there's actually a site collecting the funny ones, called AutocompleteMe (from the Fail Blog people).


  2. Anonymous Coward, 24 Sep 2010 @ 4:07pm

    Even though there is a risk that Google could manipulate those suggestions, I think the courts should have to prove cause first. But then, we are talking about the French here.


  3. Steve Gaucher, 24 Sep 2010 @ 4:08pm

    Re: Funny...

    They're not inaccurate. That's what people are typing.


  4. ChurchHatesTucker (profile), 24 Sep 2010 @ 4:26pm


    It's tricky, because any change to the algorithm is 'manipulating' things. (e.g., Google is known to try to counter SEO manipulation.) At some point, presumably, that becomes an 'editorial' decision, rather than a purely technical one.

    The next question is, so what? We read newspapers (or used to) for their editorial decisions.


  5. Anonymous Coward, 24 Sep 2010 @ 4:27pm

    In Fascist France, liable is you!


  6. anon, 24 Sep 2010 @ 4:29pm

    No editorial control, maybe, but don't they censor out some explicit results?


  7. Anonymous Coward, 24 Sep 2010 @ 4:55pm

    Re: Re:

    Difficult, yes; impossible, no. Even a newspaper, if caught manipulating the news, can be punished under the law in some rare circumstances, and I believe Google could be too if it were ever caught with its hand in the cookie jar; hence the need for probable cause. Although I have no reason to doubt Google's honesty right now, I do understand that good people die, retire, or go missing, and get replaced by others who may not be as honest. All those big companies were once good to people. Stop laughing, it's serious: they really were good once, a long time ago. Remember when Bill Gates was cool?


  8. The Groove Tiger (profile), 24 Sep 2010 @ 5:27pm

    That's like finding Melvil Dewey liable because his Decimal Classification system puts books in a library in an order that you don't like, or because the first letter of the title of some books put together spell ASS.


  9. Ryan Diederich, 24 Sep 2010 @ 11:28pm

    my favs

    My favorite one is "I am afraid of..."
    -black people
    -chinese people

    I think we should sue Google because they are racist and sexist. But wait, what's that you say? You're telling me the only reason those come up is because they are the most popular?

    More people are afraid of black people than sharks and other stuff like that?

    Sounds like we need to sue society, not Google.


  10. Mr. Oizo, 25 Sep 2010 @ 4:16am

    Re: Re: Funny...

    That is not what people are typing, as such. 'Why jeezus kicked the camel's ass' is not what people type. What is happening is that after each word, the most likely next step is taken. This is a bit of a 'graph' walk. The first word is 'what'. After 'what' very likely comes 'is'. Then after 'is' comes 'happening', but without taking the 'what' too much into account anymore. So it is a combination of popular sentence postfixes based on common prefixes; the entire sentence is certainly not what people literally type in.
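    A rough sketch of that word-by-word walk, assuming a made-up query log and simple bigram counts (everything here is illustrative, not Google's actual algorithm):

```python
from collections import Counter, defaultdict

def build_bigrams(queries):
    """For each word in the logged queries, count which words follow it."""
    nxt = defaultdict(Counter)
    for q in queries:
        words = q.split()
        for a, b in zip(words, words[1:]):
            nxt[a][b] += 1
    return nxt

def complete(nxt, start, steps=2):
    """Greedily extend a query one word at a time, each step looking only
    at the previous word, so the result can be a sentence nobody typed."""
    out = [start]
    for _ in range(steps):
        followers = nxt.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

# Made-up log: "what is" and "is happening" are popular word pairs,
# even though "what is happening" was never typed as a whole query.
log = ["what is love", "what is it", "is happening now", "is happening where"]
bigrams = build_bigrams(log)
print(complete(bigrams, "what"))  # stitches the pairs into "what is happening"
```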


  11. PaulT (profile), 25 Sep 2010 @ 8:38am

    Re: Re: Funny...

    Yeah, the suggestions are 100% accurate as they just display what people have been typing.

    That's right: people really *have* been typing things like "how do I get my sister pregnant"...


  12. Anonymous Coward, 26 Sep 2010 @ 6:14am

    Why are you surprised? I have been watching the idiots running our governments for 40 years, and I continue to marvel that people educated by our finest institutions of learning are so out of touch with basic humanity. They live in their cocoon worlds and rarely venture out to see what's really going on around them. These are our judges! It seems that in Europe they have a hard time even reading the law, and I am appalled at the human rights abuses there that stem from local bigotry. The law in Europe seems to depend on where you are located, not on any common sense of justice.


  13. Remi, 7 Oct 2010 @ 6:02pm

    Actually, Google's auto-suggest is somewhat censored.

    People in France complained about a query equivalent to "Blacks are", which shows no suggested results.


  14. Zorglub, 15 Nov 2010 @ 3:04pm

    I don't think the French courts are "Confused About Google Suggest" at all. How Google suggests words, or lets people suggest words, is not the courts' business.

    Suppose I put a gun on my garden gate that automatically fires a bullet somewhere into the street each time my neighbour's dog barks. Does that mean I cannot be sued if somebody gets hurt?


  15. Glen Woodfin, 10 Dec 2010 @ 10:45pm

    Controlling Google Auto Suggest

    It's so unfair of Google. They increase the relevance of the word "scam" out of proportion to other keyword phrases that are getting 30 times more searches.

    If anyone has been able to game Google Suggest, I'm all ears.

