ODNI Report Shows Uncle Sam Buys Huge Troves Of Consumer Data From Brokers To Avoid Warrants, Trample The Law
from the it's-the-corruption,-stupid dept
Story after story after story has showcased how the intentionally convoluted adtech and data broker market sloppily traffics in all manner of sensitive consumer data, whether it’s your daily physical movements (say, the last time you visited an abortion clinic), your granular browsing habits, your medical history, your household energy use patterns, or even your mental health data.
This massive trove of data is then used to categorize and classify Americans along an increasingly complicated array of criteria in a bid to sling ads and sell products online. Companies collect far more consumer data than they need, and rarely secure it in any competent way. And while adtech brokers will happily claim they “anonymize” this data to protect consumer privacy, study after study has shown that the word is effectively meaningless, providing flimsy cover as the sector sells access to datasets relatively cheaply, often without competently screening the purchaser.
Obviously that’s a problem for a long list of reasons. Vigilantes can obtain abortion clinic visitation data. Foreign governments can obtain detailed profiles of Americans. Sexual preferences can be weaponized. And, of course, the U.S. government can easily abuse this unaccountable free-for-all to obtain U.S. consumer data without pesky warrants or oversight.
The latest case in point: a new report from the Office of the Director of National Intelligence (ODNI) once again confirms the obvious: that the U.S. government is exploiting the largely unregulated data broker market to obtain vast, cheap troves of sensitive U.S. consumer data, even when Congress and the Supreme Court have expressly forbidden it:
In the shadow of years of inaction by the US Congress on comprehensive privacy reform, a surveillance state has been quietly growing in the legal system’s cracks. Little deference is paid by prosecutors to the purpose or intent behind limits traditionally imposed on domestic surveillance activities. More craven interpretations of aging laws are widely used to ignore them. As the framework guarding what privacy Americans do have grows increasingly frail, opportunities abound to split hairs in court over whether such rights are even enjoyed by our digital counterparts.
The report is also quick to note what everybody has known for a long time: claims that industry “anonymizes” this data to protect consumer identities are generally bullshit, since it’s relatively trivial to identify users with just a modicum of additional data. “Anonymization” is tossed around casually as some kind of ethical and privacy get-out-of-jail-free card, when it’s simply gibberish:
It is no secret, the report adds, that it is often trivial “to deanonymize and identify individuals” from data that was packaged as ethically fine for commercial use because it had been “anonymized” first. Such data may be useful, it says, to “identify every person who attended a protest or rally based on their smartphone location or ad-tracking records.” Such civil liberties concerns are prime examples of how “large quantities of nominally ‘public’ information can result in sensitive aggregations.” What’s more, information collected for one purpose “may be reused for other purposes,” which may “raise risks beyond those originally calculated,” an effect called “mission creep.”
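The report’s point about trivial deanonymization is easy to demonstrate. Here’s a toy sketch (every name, device ID, and coordinate below is invented): “anonymized” location pings carry no name, but a pseudonymous device’s most frequent location is usually somebody’s home, and matching it against any address list, say voter rolls or property records, collapses the pseudonym entirely.

```python
# Toy deanonymization sketch -- all data here is fabricated for illustration.
from collections import Counter

# "Anonymized" broker data: pseudonymous device ID -> (lat, lon) pings.
pings = {
    "device_7f3a": [(40.7128, -74.0060), (40.7130, -74.0061), (40.7129, -74.0059)],
    "device_9c1b": [(34.0522, -118.2437), (34.0521, -118.2436)],
}

# Publicly known addresses (think voter rolls or property records),
# coarsened to three decimal places (~100 meters).
known_residents = {
    (40.713, -74.006): "Alice Example",
    (34.052, -118.244): "Bob Example",
}

def round_coord(pt, places=3):
    """Coarsen a coordinate so nearby pings land in the same bucket."""
    return (round(pt[0], places), round(pt[1], places))

def deanonymize(pings, known_residents):
    """Link each pseudonymous device to a name via its most frequent location."""
    identified = {}
    for device, locations in pings.items():
        # The most common coarsened location is a strong home-address proxy.
        home = Counter(round_coord(p) for p in locations).most_common(1)[0][0]
        if home in known_residents:
            identified[device] = known_residents[home]
    return identified

print(deanonymize(pings, known_residents))
```

No machine learning, no special access, just a dictionary lookup: that’s the entire gap between “anonymized” and identified.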
Sure, wholesale corruption and greed are a major reason why it’s 2023 and we still haven’t passed even a baseline privacy law for the internet era or competently regulated data brokers. But it’s also because holding the data broker and adtech space accountable for lax privacy and security practices runs in stark contrast to the interests of those keen on mindlessly expanding our domestic surveillance apparatus:
Perhaps most controversially, the report states that the government believes it can “persistently” track the phones of “millions of Americans” without a warrant, so long as it pays for the information. Were the government to simply demand access to a device’s location instead, it would be considered a Fourth Amendment “search” and would require a judge’s sign-off. But because companies are willing to sell the information—not only to the US government but to other companies as well—the government considers it “publicly available” and therefore asserts that it “can purchase it.”
That’s why the often performative policy fixation on TikTok — and the pretense that banning TikTok actually fixes the broader problem — are naïve baby talk. It’s also a giant (often intentional) distraction from the real problem: our corrupt inability to pass even basic privacy legislation or regulate a data broker market that’s been running amok for the better part of two decades.
You’ll see endless hyperventilation in DC about China and TikTok, yet those same folks will curiously avoid discussing how domestic intelligence is also able to obtain this same data on the cheap. And they’ll avoid it because they don’t actually care about privacy, but they do care about making money and mindlessly expanding U.S. government surveillance power as it tramples accountability underfoot.