A Dystopian Future Of Ads That Won't Stop Until You Say 'McDonald's' Could Be Avoided With More Transparency
from the and-control dept
I’ve discussed in the past how many people mistake privacy for some sort of absolute “thing” rather than a spectrum of trade-offs. Leaving your home to go to the store involves giving up a small amount of privacy, but it’s a trade-off most people feel is worth it (not so much for some uber-celebrities, and then they choose other options). Sharing information with a website is often seen as a reasonable trade-off for the services/information that website provides. The real problem is often just that the true trade-offs aren’t clear. What you’re giving up and what you’re getting back aren’t always spelled out transparently, and that’s where people feel their privacy is being violated. When they make the decision consciously and the trade-off seems worth it, almost no one feels that their privacy is violated. Yet, when they don’t fully understand the deal, or when the deal they made is unilaterally changed, that’s when privacy feels violated, because the deal someone thought they were striking is not what actually happened.
The amount of data this thing collects is staggering. It logs where, when, how, and for how long you use the TV. It sets tracking cookies and beacons designed to detect “when you have viewed particular content or a particular email message.” It records “the apps you use, the websites you visit, and how you interact with content.” It ignores “do-not-track” requests as a considered matter of policy.
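For context, the “do-not-track” request mentioned above refers to the DNT HTTP header, which browsers could send as `DNT: 1` to signal that the user opts out of tracking. Honoring it is trivial on the server side, which is what makes ignoring it “as a considered matter of policy” notable. Here is a minimal sketch of what respecting that signal looks like; the function name is illustrative, and a plain dict stands in for a real request object:

```python
# Minimal sketch of honoring the Do Not Track (DNT) HTTP header.
# Browsers that support it send "DNT: 1" when the user has opted out
# of tracking; any other value (or no header) expresses no opt-out.

def tracking_allowed(headers):
    """Return False when the client has asked not to be tracked."""
    return headers.get("DNT") != "1"

print(tracking_allowed({"DNT": "1"}))  # user opted out -> False
print(tracking_allowed({}))            # no preference -> True
```

A service that wanted to respect user choice would simply skip setting its tracking cookies and beacons whenever this check returns `False`.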
To some extent, that’s not really all that different from a regular computer. But then it begins to get creepier:
It also has a built-in camera — with facial recognition. The purpose is to provide “gesture control” for the TV and enable you to log in to a personalized account using your face. On the upside, the images are saved on the TV instead of uploaded to a corporate server. On the downside, the Internet connection makes the whole TV vulnerable to hackers who have demonstrated the ability to take complete control of the machine.
More troubling is the microphone. The TV boasts a “voice recognition” feature that allows viewers to control the screen with voice commands. But the service comes with a rather ominous warning: “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party.” Got that? Don’t say personal or sensitive stuff in front of the TV.
You may not be watching, but the telescreen is listening.
Now, yes, some of that certainly can be useful in creating interesting features and services. And, frankly, almost all of the same things can be said about the smartphone in your pocket, with Siri or Google Now ready to listen to anything you say at a moment’s notice. But at the very least, with those smartphone systems people tend to see and understand the immediate benefits: they use those tools to get information, and they’re fairly easy to turn off without creating other problems. With the TV, it seems to be more of a promise of potentially providing some future service — but it’s still willing and ready to listen in the meantime.
And, of course, just as I was finishing up with that article, I came across a report on a patent Sony filed a few years ago. It actually got some attention back in 2012 for describing a system in which your TV may ask you to say the advertiser’s name to end a commercial. This figure in the patent is the one that quite reasonably got plenty of attention.
Either way, it’s become quite clear that as the world becomes more connected — between our computers, our phones, our TVs and much more — people are increasingly going to run into challenges around privacy. And, while some are going to jump to the conclusion that any information gathering and sharing is automatically bad and dangerous (or just crazy), it’s going to be important to recognize the trade-offs inherent in these new devices and services. If companies don’t want the public to totally freak out, they’d do well to make these processes much more transparent, clear and controllable by the users themselves. Unfortunately, we’re not quite there yet. The focus is still on hiding these things out of a fear that no one would use them if they knew what they were giving up. That seems like a recipe doomed to create privacy panics, rather than one that lets innovation advance while keeping the public comfortable with the choices they’re making.