from the privacy-theater-incorporated dept
Earlier this year, Apple received ample coverage for making privacy easier for its customers by introducing a new, simple tracking opt-out button as part of its iOS 14.5 update. Early press reports heavily hyped the feature, which purportedly gave consumers control over which apps could collect and monetize their data or track their behavior across the internet. Advertisers (most notably Facebook) cried like disappointed toddlers at Christmas, given the obvious fact that giving users more control over data collection and monetization means less money for them.
By September, researchers had begun to notice that Apple’s opt-out system was somewhat performative anyway. The underlying system only really blocked app makers from accessing one piece of data: your phone’s Identifier for Advertisers, or IDFA. There were numerous other ways for app makers to track users, so they quickly got to work doing exactly that, collecting information on everything from your IP address, battery charge, and volume levels to remaining device storage: metrics that can be helpful in building a personalized profile of each and every Apple user.
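To see why blocking one identifier doesn’t stop tracking, consider how a handful of mundane device signals can be combined into a stable identifier of their own. The sketch below is a toy illustration of this fingerprinting idea, not any company’s actual pipeline; the signal names and values are invented for the example.

```python
import hashlib

# Hypothetical device signals of the kind described above; none of these
# require the IDFA, yet together they can distinguish one device from another.
signals = {
    "ip_address": "203.0.113.42",
    "battery_level": 0.83,
    "volume_level": 0.55,
    "free_storage_gb": 21.4,
    "os_version": "14.5",
    "device_model": "iPhone12,1",
}

def fingerprint(signals: dict) -> str:
    """Collapse the signals into a single stable identifier by hashing
    a canonical (sorted) serialization of the key/value pairs."""
    canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# The same device produces the same fingerprint on every app launch,
# so it works as a tracking ID even with the IDFA withheld.
print(fingerprint(signals))
```

Individually, each signal is nearly useless; combined, they narrow the field fast, which is exactly why researchers found apps hoovering them up.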
Privacy advocates and the press noted how this was all giving Apple users a false sense of security without really fixing much. Privacy experts and press outlets also repeatedly informed Apple this was happening, but nothing changed. In fact, the Financial Times notes that seven months after the feature was introduced, Apple has further softened its stance on the whole effort:
“But seven months later, companies including Snap and Facebook have been allowed to keep sharing user-level signals from iPhones, as long as that data is anonymised and aggregated rather than tied to specific user profiles.”
Here’s the thing. There’s been just an absolute torrent of studies showing how “anonymizing” data is a gibberish term. It only takes a few additional snippets of data to identify “anonymized” users, yet the term is still thrown around by companies as a sort of “get out of jail free” card when it comes to not respecting user privacy. There’s an absolute ocean of data floating around the data broker space that comes from apps, OS makers, hardware vendors, and telecoms, and “anonymizing” data doesn’t really stop any of them from building detailed profiles on you.
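The re-identification problem is easy to demonstrate. Below is a toy sketch (all records invented for illustration) of the classic attack: a dataset has names stripped out, but an attacker who knows just a few mundane facts about a target can still pick out their record.

```python
# Toy "anonymized" dataset: names removed, but quasi-identifiers kept.
# Every value here is invented for the example.
records = [
    {"zip": "02139", "birth_year": 1984, "sex": "F", "condition": "asthma"},
    {"zip": "02139", "birth_year": 1972, "sex": "M", "condition": "diabetes"},
    {"zip": "90210", "birth_year": 1984, "sex": "F", "condition": "hypertension"},
]

# The attacker knows only three facts about the target, obtainable from
# public sources like a voter roll.
known = {"zip": "02139", "birth_year": 1984, "sex": "F"}

matches = [r for r in records if all(r[k] == v for k, v in known.items())]

print(len(matches))             # prints 1: a single match, re-identified
print(matches[0]["condition"])  # prints "asthma": the "anonymous" secret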
Apple’s opt-out button is largely decorative, helping the company brand itself as hyper privacy conscious without actually doing the heavy lifting required of such a shift:
“Lockdown Privacy, an app that blocks ad trackers, has called Apple’s policy ‘functionally useless in stopping third-party tracking’. It performed a variety of tests on top apps and observed that personal data and device information is still ‘being sent to trackers in almost all cases.’”
A lot of folks spend a lot of time trying to tap dance around a fundamental truth: any effort to give consumers more control over, and clearer insight into, what’s being collected or sold reduces revenues by billions of dollars annually, even if only a fraction of existing users take advantage. And nobody with their face buried deep in the data monetization trough wants that. It’s why it’s 2021 and the U.S. still doesn’t have even a basic, clear privacy law for the internet era. Not even a super clean one mandating basic transparency requirements and working opt-out tools.
So what we get instead is a lot of gibberish and privacy theater by a lot of folks who don’t want to take even a tiny hit in revenues in exchange for healthier markets and happier users.
We also get just an endless parade of semantics, like ISPs claiming they “don’t sell access to your data” (no, they just give massive “anonymized” datasets away for free as part of a nebulous, broader arrangement they do get paid for). We get tracking opt-out tools that don’t actually opt you out of tracking, or that opt you back in any time changes are made. And we get endless proclamations about how everybody supports codifying federal privacy laws from companies that immediately turn around and spend millions of dollars lobbying to ensure even a basic privacy law never sees the light of day.
At some point this combination of feckless oversight, rampant overcollection of data, minimal transparency, and repeated failure to adhere to the basics on data security will result in a privacy scandal that makes the last five years’ worth of scandals look like a grade school picnic. When that happens, we might finally see some traction on at least a basic law that mandates transparency, opt-out tools that actually work, and penalties for lax security. Until that momentum shift happens, the majority of “privacy reform” efforts are going to have a high ratio of meaningless song and dance.