Learning Good Privacy Rules Requires Experimentation

from the 20-20-hindsight dept

Ed Felten has an interesting post analyzing the fallout from Facebook's Beacon gaffe. It's now widely agreed that the company screwed up, which raises the question of why it wasn't obvious ahead of time that Beacon would prove unpopular with users. Felten offers a few ideas: delegating privacy to a single isolated department, treating privacy as a legal or PR problem, or underestimating the importance of people's emotional reactions.

I think there's also something more fundamental going on: often no one really knows what the privacy rules for new technologies will be. We're now doing things with information that were literally impossible a couple of decades ago, and social conventions haven't been keeping up. As a result, no one, not even users themselves, knows what will be considered a privacy violation until after it's been tried.

Sometimes, as with last year's Facebook news feed announcement or Google's introduction of GMail, an initial negative reaction blows over once users learn more. In other cases, opposition snowballs to the point where a company has no choice but to change course. It's often difficult to predict ahead of time which category a given product will fall into. People's initial concerns are often based on very superficial impressions (such as the idea that GMail is "reading your email") that can turn out to be unfounded once users become more familiar with the product. Other features, such as Facebook's news feed, can turn out to be useful enough that users consider them worth a bit of foregone privacy. And in some cases, a new feature just turns out to be a plain old bad idea. We won't know until users have a chance to try a feature and provide feedback.

That's why I think it's a mistake to judge a company too harshly for introducing a new product that turns out to be a bad idea from a privacy perspective. We'll only learn good principles for privacy by experimentation, and experimentation inevitably leads to some missteps. As long as a company clearly discloses how user information will be used and is responsive to user concerns, I don't think people should hold the occasional misstep against it.


Reader Comments

  •  
    Todd Howe, Dec 12th, 2007 @ 2:31pm

    Experimentation?

    More like, learning how far you can push people requires experimentation.

     


    •  
      Anonymous Coward, Dec 12th, 2007 @ 2:51pm

      Re: Experimentation?

      More like, how can you best explain stuff so people don't actually understand what you are really doing, but legally it's OK...

       


    •  
      Anonymous Coward, Dec 12th, 2007 @ 5:18pm

      Re: Experimentation?

      "More like, learning how far you can push people requires experimentation."

      Exactly. I can't believe that they couldn't see that this was an invasion of privacy. (I could have told them that. Should they hire me as their CEO?) Their miscalculation was just in how much they could get away with. When greed is driving the desire to push boundaries, experimentation is often required to find out just where that boundary is on any particular day.

       


  •  
    sonofdot, Dec 12th, 2007 @ 2:50pm

    "As long as a company clearly discloses how user information will be used and is responsive to user concerns, I don't think people should hold the occasional misstep against it."
    That's the problem. Facebook didn't disclose, and wasn't responsive. On the other hand, a blind pig could have seen that tracking people's behavior when they're not logged in would cause a backlash, even amongst those who only marginally value their privacy (i.e., most Facebook users).

     


    •  
      Tim Lee, Dec 13th, 2007 @ 6:02am

      Re:

      If they didn't disclose and weren't responsive, then how did those outraged users find out about it, and what do you call the changes Facebook made in response?

       


  •  
    Erik, Dec 12th, 2007 @ 3:08pm

    Facebook needs money, they figured this was a way to get some, they didn't consider that people would be outraged by the "value added feature". Facebook doesn't care in the least about user privacy which is a main reason why I don't and won't use the site.

     


  •  
    Anonymous Coward, Dec 12th, 2007 @ 5:27pm

    GMail

    "Sometimes, as with last year's Facebook news feed announcement or Google's introduction of GMail, an initial negative reaction blows over once users learn more."

    I still think Google's mail scanning is an invasion of privacy. However, they seem to have managed to PR spin their way with it to the point that it isn't mentioned much anymore. That doesn't mean that somehow everyone is OK with it now.

     


    •  
      Tim Lee, Dec 13th, 2007 @ 6:00am

      Re: GMail

      And you don't have to use it. Plenty of other users seem to feel differently.

       


      •  
        Anonymous Coward, Dec 13th, 2007 @ 7:02pm

        Re: Re: GMail

        "And you don't have to use it."

        I didn't say that I did, and I find your insinuation that I did to be intellectually dishonest.

        "Plenty of other users seem to feel differently."

        How many is "plenty" (which is a weasel word)? And I would like to point out to you that just because someone uses GMail doesn't mean that they don't feel their privacy is nonetheless compromised.

         


  •  
    Alfed E. Neuman, Dec 12th, 2007 @ 5:54pm

    emotional reaction?

    "underestimating the importance of peoples' emotional reactions"

    More like commonsense reaction.

     


  •  
    Cynic, Dec 12th, 2007 @ 6:02pm

    I guess it's analogous to "learning whether readers of TechDirt will swallow a very thin line of reasoning about privacy policies can only be discovered by publishing the article". As I read the responses above mine, there seems to be general agreement that it's not that hard a thing to figure out... research, risk analysis, and transparency would, I think, catch the lion's share.

     


  •  
    Anonymous Coward, Dec 12th, 2007 @ 7:23pm

    Now if somebody would check out the cozy relationship that DoubleClick has with PayPal - what do you suppose happens when Google owns DoubleClick? All bets are off with the Justice Department.

     


  •  
    Milan, Dec 12th, 2007 @ 7:58pm

    While there could be truth to the general argument, one thing that stood out is that both of the success stories - GMail and the activity feed - were providing some direct benefit to the user: high-quality mail service and targeted ads in the former case. Beacon seemed to be about, well, sacrificing user privacy for Facebook revenue. Perhaps that's what made it unacceptable?

     


    •  
      Anonymous Coward, Dec 13th, 2007 @ 5:41am

      Re:

      "...direct benefit to the user: high-quality mail service and targeted ads in the former case."

      Oh yeah, ads are a real "benefit" to me. Hey, where can I go sign up for some spam? I love all those free "benefits" that come flooding into my mailbox.

       


  •  
    Anonymous Coward, Dec 12th, 2007 @ 11:46pm

    There's never a problem if the companies that do this simply make every program "Opt-in". Is this so hard to understand?

    The problem here is that, as usual, it was "Opt-out". Compounding the problem was that they made it VERY DIFFICULT to opt out (very short screen time) and you had to do it EVERY TIME.

    Idiots.

     


  •  
    Ferin, Dec 13th, 2007 @ 5:00am

    But one would think they could have asked a small subset of users: "Hey we were thinking about this new feature, what do ya think?"

     


