Learning Good Privacy Rules Requires Experimentation

from the 20-20-hindsight dept

Ed Felten has an interesting post analyzing the fallout from Facebook's Beacon gaffe. It's now widely agreed that the company screwed up, which raises the question of why it wasn't obvious ahead of time that Beacon would prove unpopular with users. Felten offers a few ideas: delegating privacy to a single isolated department, treating privacy as a legal or PR problem, or underestimating the importance of people's emotional reactions.

I think there's also something more fundamental going on: often no one really knows what the privacy rules for new technologies will be. We're now doing things with information that were literally impossible a couple of decades ago. Social conventions haven't kept up. As a result, no one, not even users themselves, knows what will be considered a privacy violation until after it's been tried.

Sometimes, as with last year's Facebook news feed announcement or Google's introduction of GMail, an initial negative reaction blows over once users learn more. In other cases, opposition snowballs to the point where a company has no choice but to change course. It's often difficult to predict ahead of time which category a given product will fall into. People's initial concerns are often based on superficial impressions (such as the idea that GMail is "reading your email") that can turn out to be unfounded once users become more familiar with the product. Other features, such as Facebook's news feed, can turn out to be useful enough that users consider them worth a bit of foregone privacy. And in some cases, a new feature just turns out to be a plain old bad idea. We won't know until users have a chance to try a feature and provide feedback.

That's why I think it's a mistake to judge a company too harshly for introducing a new product that turns out to be a bad idea from a privacy perspective. We'll only learn good principles for privacy through experimentation, and experimentation inevitably leads to some missteps.
As long as a company clearly discloses how user information will be used and is responsive to user concerns, I don't think people should hold the occasional misstep against it.

Companies: facebook


Comments on “Learning Good Privacy Rules Requires Experimentation”

Anonymous Coward says:

Re: Experimentation?

More like, learning how far you can push people requires experimentation.

Exactly. I can’t believe that they couldn’t see that this was an invasion of privacy. (I could have told them that. Should they hire me as their CEO?) Their miscalculation was simply over how much they could get away with. When greed is driving the desire to push boundaries, experimentation is often required to find out just where that boundary is on any particular day.

sonofdot says:

As long as a company clearly discloses how user information will be used and is responsive to user concerns, I don’t think people should hold the occasional misstep against it.

That’s the problem. Facebook didn’t disclose, and wasn’t responsive. On the other hand, a blind pig could have seen that tracking people’s behavior when they’re not logged in would cause a backlash, even amongst those who only marginally value their privacy (i.e., most Facebook users).

Anonymous Coward says:


Sometimes, as with last year’s Facebook news feed announcement or Google’s introduction of GMail, an initial negative reaction blows over once users learn more.

I still think Google’s mail scanning is an invasion of privacy. However, they seem to have managed to spin their way past it to the point that it isn’t mentioned much anymore. That doesn’t mean that everyone is somehow OK with it now.

Anonymous Coward says:

Re: Re: GMail

And you don’t have to use it.

I didn’t say that I did, and I find your insinuation that I did to be intellectually dishonest.

Plenty of other users seem to feel differently. How many is “plenty” (which is a weasel word)? And I would like to point out that just because someone uses GMail doesn’t mean they don’t feel their privacy is compromised nonetheless.

Cynic says:

I guess it’s analogous to saying “whether readers of Techdirt will swallow a very thin line of reasoning about privacy policies can only be discovered by publishing the article.” Reading the responses above mine, there seems to be general agreement that it’s not that hard a thing to figure out: research, risk analysis, and transparency would catch the lion’s share.

Milan says:

While there could be truth to the general argument, one thing that stood out is that both of the success stories, GMail and the news feed, provided some direct benefit to the user: a high-quality mail service (paid for by targeted ads) in the former case. Beacon seemed to be about, well, sacrificing user privacy for Facebook revenue. Perhaps that’s what made it unacceptable?
