from the what-could-possibly-go-wrong dept
“Smart” televisions have long been the poster child for the abysmal privacy and security standards inherent in the “internet of things” space. Such televisions have been routinely found to have the security and privacy standards of damp cardboard, making the data they collect delicious targets for hackers and intelligence agencies alike.
At the same time these companies have failed repeatedly to secure (or sometimes even encrypt) consumer data, their data collection revenue is positively exploding. Vizio, for example, recently noted that it made $38.4 million in one quarter just from tracking and monetizing consumer viewing and usage data. It made $48.2 million on hardware (TVs, soundbars, and other products) in that same period, and that gap is closing quickly:
“Its device business (the part that sells TVs, sound bars and the like) had a gross profit of $48.2 million in the same period, up from $32.5 million last year. While the hardware business has significantly more revenue, profits from data and advertising spiked 152 percent from last year, and are quickly catching up.”
The problem researchers keep pointing out is that not enough of that revenue is being put back into device security and privacy, which is why Vizio, like most “smart” TV manufacturers, has been repeatedly caught up in privacy scandals. Like the time it had to shell out $2.2 million to the FTC and the New Jersey AG for failing to inform consumers this data was even being collected. By the time consumers got their share of that settlement, it amounted to about $20 per person. And it’s not clear anything would have happened at all if not for a 2015 ProPublica investigation into Vizio’s lack of transparency.
The problem, of course, is that regulators, when they bother to act at all, act half a decade after the fact, and only if a journalist exposes the problem first. Consumers then get a tiny pittance. And it shouldn’t be hard to understand why a $2.2 million fine (for a company pulling down $38.4 million every three months from consumer data alone) isn’t going to be an effective deterrent against future privacy abuses. It’s viewed as a gnat on the nose; just the cost of doing business.
Consumers do have a bit of control. They can disable a set’s WiFi features entirely, though in many instances doing so disables core functionality in obnoxious and unforeseen ways. Ideally I’d love to be able to buy a “dumb” TV that’s just a great display with HDMI ports and no “smart” internals, but because consumer data is now so profitable, most TV vendors no longer sell such an option.
It’s also worth remembering that your smart TV is just one in a long line of systems collecting and monetizing your data, including the streaming hardware you’re using (Roku, etc.), your ISP, any additional internet of things devices you’ve connected to your network, and even your energy company. While folks intent on downplaying modern privacy abuses often like to pretend this is the age of consumer empowerment, it’s not really possible for consumers to “opt out” of data collection and monetization at the scale at which it’s now occurring, even with a lot of elbow grease, technical innovation, and external help.
Organizations like Consumer Reports have been pushing hard to warn consumers about potential privacy abuses at the point of sale, including such warnings in product reviews and even on product packaging. And while their “open source” efforts on this front are genuinely interesting, we’re a long way from this kind of transparency being the norm.