by Timothy Lee
Wed, Dec 12th 2007 2:12pm
Ed Felten has an interesting post analyzing the fallout from Facebook's Beacon gaffe. It's now widely agreed that the company screwed up, which raises the question of why it wasn't obvious ahead of time that Beacon would prove unpopular with users. Felten offers a few possible explanations: delegating privacy to a single isolated department, treating privacy as a legal or PR problem, or underestimating the importance of people's emotional reactions.

I think there's also something more fundamental going on: often no one really knows what the privacy rules for new technologies will be. We're now doing things with information that were literally impossible a couple of decades ago, and social conventions haven't kept up. As a result, no one, not even users themselves, knows what will be considered a privacy violation until after it's been tried. Sometimes, as with last year's Facebook news feed announcement or Google's introduction of GMail, an initial negative reaction blows over once users learn more. In other cases, opposition snowballs to the point where a company has no choice but to change course. It's often difficult to predict ahead of time which category a given product will fall into.

People's initial concerns are often based on superficial impressions (such as the idea that GMail is "reading your email") that turn out to be unfounded once users become more familiar with the product. Other features, such as Facebook's news feed, prove useful enough that users consider them worth a bit of foregone privacy. And in some cases, a new feature simply turns out to be a plain old bad idea. We won't know which outcome applies until users have a chance to try a feature and provide feedback.

That's why I think it's a mistake to judge a company too harshly for introducing a product that turns out to be a bad idea from a privacy perspective. We'll only learn good principles for privacy through experimentation, and experimentation inevitably leads to some missteps. As long as a company clearly discloses how user information will be used and is responsive to user concerns, I don't think people should hold the occasional misstep against it.