News You Could Do Without

by Timothy Lee


Filed Under:
beacon, privacy

Companies:
facebook



Learning Good Privacy Rules Requires Experimentation

from the 20-20-hindsight dept

Ed Felten has an interesting post analyzing the fallout from Facebook's Beacon gaffe. It's now widely agreed that the company screwed up, which raises the question of why it wasn't obvious ahead of time that Beacon would prove unpopular with users. Felten offers a few possible explanations: delegating privacy to a single isolated department, treating privacy as a legal or PR problem, or underestimating the importance of people's emotional reactions.

I think there's also something more fundamental going on: often no one really knows what the privacy rules for a new technology will be. We're now doing things with information that were literally impossible a couple of decades ago, and social conventions haven't kept up. As a result, no one, not even users themselves, knows what will be considered a privacy violation until after it's been tried. Sometimes, as with last year's Facebook news feed announcement or Google's introduction of Gmail, an initial negative reaction blows over once users learn more. In other cases, opposition snowballs to the point where a company has no choice but to change course. It's often difficult to predict ahead of time which category a given product will fall into. People's initial concerns are often based on superficial impressions (such as the idea that Gmail is "reading your email") that turn out to be unfounded once users become more familiar with the product. Other features, such as Facebook's news feed, turn out to be useful enough that users consider them worth a bit of foregone privacy. And in some cases, a new feature just turns out to be a plain old bad idea. We won't know which is which until users have a chance to try a feature and provide feedback.

That's why I think it's a mistake to judge a company too harshly for introducing a new product that turns out to be a bad idea from a privacy perspective. We'll only learn good privacy principles through experimentation, and experimentation inevitably leads to some missteps. As long as a company clearly discloses how user information will be used and is responsive to user concerns, I don't think people should hold the occasional misstep against it.

Reader Comments



  1. Anonymous Coward, 12 Dec 2007 @ 11:46pm

    There's never a problem if the companies that do this simply make every program "Opt-in". Is this so hard to understand?

    The problem here is that, as usual, it was "Opt-out". Compounding the problem was that they made it VERY DIFFICULT to opt out (very short screen time) and you had to do it EVERY TIME.

    Idiots.
