The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Can You Protect Privacy If There's No Real Enforcement Mechanism?

from the enforcement-matters dept

Privacy laws can have a lot of moving pieces, from notices and disclosures and opt-in and opt-out consent requirements to privacy defaults and user controls. Over the past few years, there has been significant progress on these issues because privacy advocates, consumer groups, industry voices, and even lawmakers have been willing to dive into definitional weeds, put options on the table, and find middle ground. But this sort of thoughtful debate has not happened when it comes to how privacy laws should be enforced and what should happen when companies screw up, families are hurt, and individuals’ privacy is invaded.

Instead, when it comes to discussing private rights of action and agency enforcement, rigid red lines have been drawn. Consumer groups and privacy advocates say let individuals sue in court — and call it a day. Business interests, when they talk about “strong enforcement,” often mean letting an underfunded Federal Trade Commission and equally taxed state Attorneys General handle everything. Unfortunately, this binary, absolutist dispute over policing privacy rights threatens to sink any progress on privacy legislation.

It happened in Washington state, which failed to enact a comprehensive privacy framework in March because of a single sentence that could have let some consumers sue to enforce their rights under the state’s general Consumer Protection Act. Private rights of action have stymied state privacy task forces, and the issue is consuming efforts by the Uniform Law Commission to craft a model privacy bill. This is but a microcosm of what we’ve seen at the federal level, where lawmakers are at “loggerheads” over private rights of action.

This impasse is ridiculous. Advocacy groups share some blame here, but industry voices have failed to put any creativity into charting an alternative path forward. Company after company and trade association after trade association have come out in favor of privacy rules, but the response to any concern about how to ensure those rules are followed has been crickets. Few seem to have given much thought to what enforcement could look like beyond driving a Brinks truck full of money up to the FTC. That is not good enough. If industry is serious about working toward clear privacy rules, business interests have two obligations: (1) they should offer up some new ideas to boost enforcement and address legitimate concerns about regulatory limitations and capture; and (2) they need to explain why private rights of action should be a non-starter in areas where businesses already are misbehaving.

First, while we can acknowledge the good work that the FTC (and state Attorneys General) has done, we should also concede that agencies cannot address every privacy problem and have competing consumer protection priorities. Commentators laud the FTC’s privacy work but have not explained how an FTC with more resources would do anything other than more of what it’s already doing. These outstanding concerns are animating efforts to create an entirely new federal privacy agency (and that’s on top of a proposal in California to set up its own entirely new “Privacy Protection Agency”). Improving the FTC’s privacy posture will require more than additional money and personnel.

Part of this will be creating mechanisms that ensure individuals can get redress. One idea would be to require the FTC to help facilitate complaint resolutions. The Consumer Financial Protection Bureau already does this to some extent with respect to financial products and services. The CFPB welcomes consumer complaints — and then works with financial companies to get consumers a direct response about problems. These complaints also help the CFPB identify problems and prioritize its work, and the CFPB then publishes (privacy-friendly) complaint data. This stands in contrast to the FTC’s Consumer Sentinel Network, which is a black box to the public.

Indeed, the FTC’s complaint system is opaque even to complainants themselves. The black-box nature of the FTC is, fairly or not, a constant criticism by privacy advocates. At the start of the Trump administration, a group of advocates called for more transparency from the Commission about how it handles complaints and responds to public input. I can speak to this issue, having submitted my own complaint to the FTC about the privacy and security practices of VPNs in 2017. Months later, the FTC put out a brief blog post on the issue, which I took to be the end of the matter on their end. Some sort of dual-track informal and formal complaint process, like the Federal Communications Commission’s, could be one way to ensure the FTC better communicates with outsiders raising privacy concerns.

These are mostly tweaks to FTC process, however, and while they address some specific complaints about privacy enforcement, they don’t address concerns that regulators have been missing — or avoiding — some of the biggest privacy problems we face. This is where the rigid opposition to private rights of action and failure to acknowledge the larger concern is so frustrating.

Sensitive data types present a good example. Unrestrained collection and use of biometrics and geolocation data have become two of the biggest privacy fights of the moment. There has been a shocking lack of transparency or corporate accountability around how companies collect and use this information. Their use could be the key to combating the ongoing pandemic; their misuse, a tool for discrimination, embarrassment, and surveillance. If ever there were data practices where more oversight is needed, these would be it.

Yet, the rapid creep of facial recognition gives us a real-world test case for how agency enforcement can be lacking. Companies have been calling for discussions about responsible deployment of facial recognition even as they pitch this technology to every school, hospital, and retailer in the world, yet Clearview AI just up and ignored existing FTC guidance and state law. Washington state has an existing biometric privacy law, which the state Attorney General admitted has never been the basis of an enforcement action. To my knowledge, the Texas Attorney General also has never brought a case under that state’s law. Meanwhile, the Illinois Biometric Information Privacy Act (BIPA) may be the one legal tool that can be used to go after companies like Clearview.

BIPA’s private right of action has been a recurring thorn in the side of major social media companies and theme parks rolling out biometric technologies, but no one has cogently argued that companies aren’t flagrantly violating the law. Let’s not forget that facial recognition settings were an underappreciated part of the FTC’s most recent settlement with Facebook, too. However, no one can actually discuss how to tweak or modernize BIPA because industry groups have had a single-minded focus on stripping the law of all its private enforcement components.

Industry has acted in lockstep to insist it is unfair for companies to be subject to limitless liability from the omnipresent plaintiffs’ bar for every minor or technical violation of the law. And that’s the rub!

There is no rule that says a private right of action must encompass the entirety of a privacy law. One of the compromises that led to the California Consumer Privacy Act was the inclusion of a private right of action for certain unreasonable data breaches. Lawmakers can take heed and go provision by provision, specifying exactly what sorts of activities could be subject to private litigation, what the costs of that litigation might be, and what remedies can ultimately be obtained.

The U.S. Chamber of Commerce has been at the forefront of insisting that private rights of action are poor tools for addressing privacy issues, because they can “undermine appropriate agency enforcement” and hamper the ability of “expert regulators to shape and balance policy and protections.” But what’s the objection then in areas where that’s not true?

The sharing and selling of geolocation information has become especially pernicious, letting companies infer sensitive health conditions and facilitating stalking. Can any industry voice argue that companies have been well-behaved when it comes to how they use location information? The FTC clearly stated in 2012 that precise geolocation data was sensitive information warranting extra protections. Flash forward to 2018 and 2019, when The New York Times ran annual exposés on the wild west of apps and services buying and selling “anonymous” location data. Meanwhile, the Communications Act requires carriers to protect geolocation data, and yet in February of this year the FCC fined all four major wireless carriers a combined $200 million for sharing their subscribers’ geolocation data with bounty hunters and stalkers.

Businesses do not need regulatory clarity when it comes to location data — companies need to be put in a penalty box for an extended timeout. Giving individuals the ability to seek private injunctive relief seems hardly objectionable given this track record. Permitting class actions for intentional violations of individuals’ geolocation privacy should be on the table as well.

There should be more to discuss than a universe where trial attorneys sue every company for every privacy violation or a world where lawmakers hand the FTC a blank check. Unfortunately, no one has yet put forward a vision for what the optimum level of privacy enforcement should be. Privacy researchers, advocates, and vulnerable communities have forcefully said the status quo is not sufficient. If industry claims it understands the importance of protecting privacy but just needs more clarity about what the rules are, companies should begin by putting forward some plans for how they will help individuals, families, and communities when they fall short.

Joseph Jerome, CIPP/US, is a privacy and cybersecurity attorney based in Washington, D.C. He currently is the Director of Multistate Policy for Common Sense Media.



Comments on “Can You Protect Privacy If There's No Real Enforcement Mechanism?”

Koby (profile) says:

The Deception Option

One nefarious possibility might be to make it legal to provide false identity information to some corporations. People might be able to pay (in cash!) to get a fake ID card made, and buy a cell phone. The carrier is creating a profile, but for whom? Continue to pay the subscription with one of those prepaid debit cards. And then burn that phone after about a year, and get a new one. Of course, law enforcement won’t like this, because of course they are also taking advantage of the lack of privacy. But without enforcement, we might have an option to fight back, if some minor tweaks to existing law are made.


Upstream (profile) says:

companies need to be put in a penalty box for an extended timeout.

No, the CEOs, COOs, presidents, vice presidents, etc. need to be put in a penalty box for an extended timeout.

It is unfortunate, but many people don’t care about their company, stockholders or whatever, as long as they can count on their golden parachutes. Prison or poverty are the only things that mean anything to these types of people. And by poverty, I mean "living in public housing and bagging groceries for a living" type poverty. Fining companies is usually meaningless, and giving victims the "opportunity" to waste time, and possibly spend lots of money, over many years of civil litigation is equally nonsensical.

That Anonymous Coward (profile) says:

"Your privacy is important to us"
The check is in the mail, the government is your friend and no baby I won’t something something in your mouth…

Little lies we all seem to accept, & refuse to demand change.

Data Broker collects your entire life & leaks it all over…
oooh credit monitoring, faux apology, & you get the bill to fix your life.
Fine to the company… less than $5 a person who got screwed.

Cory Doctorow has compared our personal data to toxic waste, something that should be locked away because allowing it out will cause huge harm. The downside is we keep accepting the "Superfund" solution where we let it leak, slap the leaker on the wrist, & all the costs for cleanup are paid by the public.

These companies are making huge amounts from collecting it all, but it seems the law wants to protect their profits over us. We need to rethink how this works & put people first, profits maybe third.


Anonymous Coward says:

Then, on top of that, I think you’d really need to have an anti-SLAPP like mechanism to ward off frivolous/vexatious/nuisance suits. But I haven’t seen anyone talk about such a feature within the privacy context.

One of the problems here is the requirement to bring a "valid" privacy lawsuit.

Scenario A is where someone has managed to catch a company red-handed, through whistleblowing, testimony in another lawsuit, etc. Evidence is already in hand, so the fact that a SLAPP-like mechanism cuts short discovery would not affect this scenario.

Scenario B is where the plaintiff only has indirect evidence of privacy problems, e.g. "I have only ever supplied that particular (name+address, or email, or phone, or whatever) to (defendant)." While it can be clear that the defendant is involved, there is no evidence that the defendant acted improperly. The plaintiff has a proper complaint, but without discovery, they have no hope of proceeding.

…Unless the defendant is held to a fiduciary-equivalent standard. Is THAT happening?

Anonymous Coward says:

The U.S. Chamber of Commerce has been at the forefront of insisting that private rights of action are poor tools for addressing privacy issues, because they can “undermine appropriate agency enforcement” and hamper the ability of “expert regulators to shape and balance policy and protections.”

I’m just going to straight-up go for an argumentum ad hominem-style thought and say, "As if anyone should listen to a Chamber of Commerce, ever".

Since I listened anyway… If there were agency enforcement, people probably would not be suing, particularly if said putative agency were to extract proper compensation as needed and deliver it to harmed parties. This generally doesn’t happen anyway, even when violators are somehow punished and forced to comply, so private rights of action make sense. On the second note: what expert regulators? And assuming there are any who aren’t simply coreligionists or puppets of the corporations, why have they not done anything up to now?

Thanks, U.S. Chamber of Commerce, you’re as useful as ever.

That said, I think Mike is correct in his comment that "I think you’d really need to have an anti-SLAPP like mechanism to ward off frivolous/vexatious/nuisance suits." But then, I think there should be a mechanism like that for all lawsuits.

ECA (profile) says:

In all of this...

Over the years there have been a few problems..
Few, if any, know who to contact/complain to if there is a problem.
It’s like the old complaint, ‘It’s made in China, so it’s crap,’ and I suggest to them that the US company over there asked China to make it to THEIR spec, then brought it to you and sold it. Who is at fault?

How many Gov. consumer protection agencies are there, and what are their numbers? Oh! I forgot, we have the internet.. And the Gov.-made sites KINDA suck.
