Yes, Privacy Is Important, But California's New Privacy Bill Is An Unmitigated Disaster In The Making
from the not-how-to-do-it dept
We’ve talked a little about the rush job to pass a California privacy bill — the California Consumer Privacy Act of 2018 (CCPA) — and a little about how California’s silly ballot initiatives effort forced this mad dash. But a few people have asked us about the law itself and whether or not it’s any good. Indeed, some people have assumed that so many lobbyists freaking out about the bill is actually a good sign. But, that is not the case. The bill is a disaster, and it’s unclear if the fixes that are expected over the next year and a half will be able to do much to improve it.
First, let’s state the obvious: protecting our privacy is important. But that does not mean that any random “privacy regulation” will be good. In a future post, I’ll discuss why “regulating privacy” is a difficult task to tackle without massive negative consequences. Hell, over in the EU, they spent years debating the GDPR, and it’s still been a disaster that will have a huge negative impact for years to come. But in California they rushed through a massive bill in seven days. A big part of the problem is that people don’t really know what “privacy” is. What exactly do we need to keep private? Some stuff may be obvious, but much of it actually depends quite heavily on context.
But the CCPA takes an insanely broad view of what “personal info” is covered. Section 1798.140(o)(1) defines “personal information” to mean… almost anything:
“Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following:
(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier Internet Protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.
(B) Any categories of personal information described in subdivision (e) of Section 1798.80.
(C) Characteristics of protected classifications under California or federal law.
(D) Commercial information, including records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies.
(E) Biometric information.
(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet Web site, application, or advertisement.
(G) Geolocation data.
(H) Audio, electronic, visual, thermal, olfactory, or similar information.
(I) Professional or employment-related information.
(J) Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99).
(K) Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, preferences, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
So, first off, note that it’s not just information associated with any individual, but also with a “household.” And, again, in the list above, there are lots of items and situations where it totally makes sense that that information should be considered “private.” But… there are other cases where that might not be so obvious. Let’s take “information regarding a consumer’s interaction with an internet web site.” Okay. Yes, you can see that there are reasonable privacy concerns around a company tracking everything you do on a website. But… that’s also generally useful information for any website to have just to improve the user experience — and basically every website has tended to do some form of user tracking. It’s not privacy-violating — it’s just understanding how people use your website. So if I’m tracking how many people flow from the front page to an article page… now suddenly that’s information impacted by this law. Perhaps the law is intended to mean tracking people on other websites through beacons and such… but the law appears to make no distinction between tracking people on your own website (a feature that’s built into basically every webserver) and tracking people elsewhere.
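To make that concrete, here’s a minimal sketch (the log format and names are hypothetical, purely for illustration) of the kind of routine first-party analytics described above: counting how many visitors went from the front page to an article page, using nothing but the server’s own access log. Note that the log necessarily contains IP addresses, which are explicitly listed as “personal information” in Section 1798.140(o)(1)(A):

```python
# Hypothetical access-log lines: "<ip> <path> <referrer>".
# IP addresses are named as "personal information" in the CCPA's
# definition, so even this mundane count touches regulated data.
LOG_LINES = [
    "203.0.113.5 /article/privacy-bill /",
    "203.0.113.5 /about /",
    "198.51.100.7 /article/privacy-bill /",
    "198.51.100.7 /article/privacy-bill /article/other",
]

def front_page_to_article(lines):
    """Count article-page visits whose referrer was the front page."""
    count = 0
    for line in lines:
        ip, path, referrer = line.split()
        if referrer == "/" and path.startswith("/article/"):
            count += 1
    return count

print(front_page_to_article(LOG_LINES))  # 2
```

The point isn’t that this code is sinister; it’s that the statute’s definition doesn’t distinguish this from cross-site surveillance.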
Similarly, “preferences” is private information? Sure, on some sites and for some reasons that makes sense. But, I mean, we do things like let users set in their preferences whether or not they want to see ads on our site at all (in case you don’t know, you can turn off ads on this site in the preferences, no questions asked). But… in order to make sure we don’t show you ads, we kinda have to keep track of those preferences. Now, I think many of you will recognize that in removing ads, we’re actually helping you protect your privacy. But under this law, we’re now incentivized not to keep such preferences because doing so is now its own legal liability.
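Here’s a minimal sketch of that ad-preference scenario (the store and function names are hypothetical). Honoring a “no ads” choice requires retaining a record of it, keyed to an account name, and under the definition above both the account identifier and the stored preference are “personal information”:

```python
# Hypothetical in-memory preference store, keyed by account name.
# Under Section 1798.140(o)(1), the account name is an "identifier"
# and the stored preference itself falls under "preferences."
preferences = {}

def set_show_ads(account, show_ads):
    """Remember whether this user wants to see ads."""
    preferences[account] = {"show_ads": show_ads}

def should_show_ads(account):
    # To keep honoring an ad-free choice, the record must persist:
    # forgetting the preference means showing ads again.
    return preferences.get(account, {"show_ads": True})["show_ads"]

set_show_ads("alice", False)
print(should_show_ads("alice"))  # False: ads stay off
print(should_show_ads("bob"))    # True: no stored preference
```

So the privacy-protective feature (hiding ads) only works by retaining exactly the kind of data the law now turns into a liability.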
And that leads into the “costs” vs. the “benefits” of such a law. Again, let’s be clear: many internet companies have been ridiculously lax in how they treat user data. That’s a real problem that we don’t, in any way, mean to diminish. But the costs of this law seem very, very, very likely to outweigh a fairly minimal set of benefits. On the benefits side: yes, a few companies who have abused your data will face some pretty hefty fines for continuing to do so. That’s potentially good. But the costs are going to be massive. For this part, I’ll borrow from Eric Goldman’s analysis of the bill, which is well worth reading. It’s long, but he outlines just some of the likely “costs”:
How Much Will This Cost? (part 1) Regulated companies — i.e., virtually every business in California — will need to spend money on compliance, including building new processes to deal with the various consumer requests/demands. Adding up all of the expenditures across the California economy, how much will this cost our society? It’s not like these expenditures come from some magic pot of money; the costs will be indirectly passed to consumers. Are consumers getting a good deal for these required expenditures?
How Much Will This Cost? (part 2) Lengthy statutes seem like they are detailed enough to eliminate ambiguity, but it actually works in reverse. The longer the statutes, the more words for litigators to fight over. This law would give us 10,000 different bases for lawsuits. One of the current tussles between the initiative and the bill is whether there is a private right of action. Right now, the bill attempts to limit the private causes of action to certain data breaches. If the private right of action expands beyond that, SEND YOUR KIDS TO LAW SCHOOL.
How Much Will This Cost? (part 3) The bill would create a new “Consumer Privacy Fund,” funded by a 20% take on data breach enforcement awards, to offset enforcement costs and judiciary costs. Yay for the bill drafters recognizing the government administration costs of a major new law like this. Usually, bill drafters assume a new law’s enforcement costs can be buried in existing budgets, but here, the bill drafters are (likely correctly) gearing up for litigation fiestas. But exactly how much will these administration costs be, and will this new fund be sufficient or have we written a blank check from the government coffers to fund enforcement? Most likely, I expect the Consumer Privacy Fund will spur enforcement, i.e., enforcement actions will be brought to replenish the fund to ensure it has enough money to pay the enforcers’ salaries — a perpetual motion machine.
Let’s dig into that first part, because it’s important. It’s important to remind people that this bill is not an “internet” privacy bill. It’s an everyone privacy bill. More or less any business that does business in California is impacted (there are some limitations, but for a variety of reasons — including vague terms in drafting — those limitations may be effectively non-existent). So now, in order to comply, any company, including (for example!) a small blog like ours, will have to go through a fairly onerous process to even attempt to be in compliance (though, as point two in Eric’s list above shows, even then we’ll likely have no idea if we really are).
An analysis by Lothar Determann breaks out some of what we’d need to do to comply, including setting up entirely new processes to handle data access requests, including a system to verify the identity and authorization of individuals, and ways of tracking requests and ensuring that anyone who opts out of certain practices is never offered a chance to opt back in. So… to use the example above, if someone sets a preference on our site not to see ads, but then makes a data privacy request that we not track that data, we would likely need to first verify that person’s identity and that they are the one making the request, and then we’d need to set up a system to (get this!) make sure we somehow track them well enough so that they… um… can’t “opt in” to request that we no longer show them ads again.
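That compliance flow can be sketched roughly as follows (all names here are hypothetical stand-ins; real identity verification would be far more involved). The irony is in the last step: to comply with the rule against re-offering the opt-out, the site has to keep a permanent record of the very person who asked not to be tracked:

```python
# Hypothetical sketch of the compliance flow described above.
verified_identities = {"alice": "secret-token"}  # stand-in for real ID checks
privacy_requests = set()  # users whose preference data must not be kept

def handle_privacy_request(account, token):
    """Verify the requester, then suppress future opt-out offers."""
    if verified_identities.get(account) != token:
        return "identity verification failed"
    # Honoring "stop tracking my preference" requires remembering,
    # indefinitely, exactly who made the request...
    privacy_requests.add(account)
    return "preference data deleted"

def may_offer_ad_free_option(account):
    # ...and consulting that record before ever showing the
    # ad-free option to them again.
    return account not in privacy_requests

print(handle_privacy_request("alice", "secret-token"))  # preference data deleted
print(may_offer_ad_free_option("alice"))                # False
```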
Think about that. In order to let someone opt out of ads on our site “for privacy purposes,” we’d have to set up a system to track them, just to make sure that they aren’t ever offered the possibility of opting back into ads again. It’s mind-boggling. Also, this:
Consider alternative business models and web/mobile presences, including California-only sites and offerings, as suggested in Cal. Civ. Code §1798.135(b) and charges for formerly free services to address the complex and seemingly self-contradictory restrictions set forth in Cal. Civ. Code §1798.125 on a company’s ability to impose service charges on California residents who object to alternate forms of data monetization.
Okay, so I’m all for businesses exploring alternative business models. It’s what we talk about here all the time. But… requiring that by law? And, even requiring that we offer a special “California-only” site that we charge for?
I’m having difficulty seeing how that helps anyone’s privacy. Instead, it seems like it’s going to cost a ton. And… for limited to negative benefit in many cases. Just trying to figure out what this would cost us would mean we’d probably have to let go of multiple writers and spend that money on lawyers instead.
And that leaves out the cost to innovation in general. Again, this is not to slight the fact that abusive data practices are a real problem. But, under this law, it looks like internet sites that want to do any customization for users at all — especially data-driven customization — are going to be in trouble. And sure, some customization is annoying or creepy. But an awful lot of it is actually pretty damn useful.
An even larger fear: this could completely cut off more interesting services and business models coming down the road that actually would serve to give end users more control over their own data.
Also, as with the GDPR, there are serious First Amendment questions related to the CCPA. A number of people have pointed out that the Supreme Court’s ruling in Sorrell v. IMS Health certainly suggests some pretty serious constitutional defects with the CCPA. In Sorrell, the Supreme Court struck down a Vermont law that banned reporting on certain prescription practices of doctors as violating the First Amendment. It’s possible that the CCPA faces very similar problems. In his analysis, Professor Jeff Kosseff explains how the CCPA may run afoul of the Sorrell ruling:
CCPA is more expansive than the Vermont law in Sorrell, covering personal information across industries. Among its many requirements, CCPA requires companies to notify consumers of the sale of their personal information to third parties, and to opt out of the sale. However, CCPA exempts “third parties” from coverage if they agree in a contract to process the personal information only for the purposes specified by the company and do not sell the information. Although CCPA restricts a wider range of third-party activities than the Vermont statute, it still leaves the door open for some third parties to be excluded from the disclosure restrictions, provided that their contracts with companies are broadly written and they do not sell the data. For instance, imagine a contract that allows a data recipient to conduct a wide range of “analytics.” Because the recipient is not selling the data, the company might be able to disclose personal information to that recipient without honoring an opt-out request.
Under Sorrell, such distinctions might lead a court to conclude that CCPA imposes a content-based restriction on speech. Moreover, the findings and declarations section of the bill cites the revelations about Cambridge Analytica’s use of Facebook user data, and states “[a]s a result, our desire for privacy controls and transparency in data practices is heightened.” This could cause a court to conclude that the legislature was targeting a particular type of business arrangement when it passed CCPA.
I highly recommend reading the rest of Kosseff’s analysis as well. He notes that he’s generally in favor of many internet regulations — he has been advocating for cybersecurity regulations and didn’t think amending CDA 230 would be that bad (I think he’s wrong on that… but…). And yet even he found the CCPA so flawed that he couldn’t imagine it becoming law:
[M]y initial reaction was that nothing this unclear, burdensome, and constitutionally problematic could ever become law.
He then goes on to detail 10 separate serious issues under the law — and notes that those are just his top 10 concerns.
While it’s nice to see lots of people suddenly interested in privacy issues these days, the mad dash to “deal” with privacy through poorly thought out legislation where the target and intent is unclear other than “OHMYGOD INTERNET COMPANIES ARE DOING BAD STUFF!!!” isn’t going to do any good at all. There is little in here that suggests real protection of privacy — but plenty of costs that could significantly change the internet many of you know and love, in large part by making it nearly impossible for small businesses to operate online. And, frankly, ceding the internet to the largest providers who can deal with this doesn’t exactly seem like a way to encourage smaller operations who actually are concerned about your privacy.