This is hardly a fleshed-out response, but I find the constant argument that we cannot do X, whether it's privacy regulation or tweaks to Section 230, because it will assuredly further entrench Facebook and Google, to be tired and played out.
It also assumes that their underlying business model -- online behavioral advertising (OBA) -- is legitimate, and that we must do what we can to prop up competition that can deliver targeted ads. I think there's mounting evidence that targeted advertising is fundamentally problematic. I'd rather we dramatically increase the costs to Google and Facebook of their business model than prop up equally problematic competition.
Very much appreciate the response! I put some thoughts into a Twitter thread (https://twitter.com/joejerome/status/1273341100154130432), but you may find me generally far more sympathetic to your position than Ernesto is. In general, I think your suggestion to permit some sort of privacy-nonprofit-led class action is a good one, and it might work to formalize and expand the capacity of privacy groups to push for stronger changes within companies. I also think the idea of individualized recourse is a good one -- and really we should be speaking more about "recourse" than "enforcement" generally.

The issue is that industry voices have been largely silent. For the better part of the past year, I've asked at every salon, panel, and working-group meeting with industry reps what sort of non-PRA recourse they'd be open to, and this has largely been met with confused silence. I'd like to see more industry voices signal that they'd be open to discussing these sorts of ideas. Perhaps this is a form of negotiating chicken, but I've seen no evidence that Republican lawmakers are getting any message that there's room for discussing recourse mechanisms beyond the FTC and state AGs -- and I haven't seen anything on paper from anyone.

I appreciate that you were responding to my two provocations, but I think it's also worth addressing Cam Kerry's proposal around private rights of action (https://www.brookings.edu/research/bridging-the-gaps-a-path-forward-to-federal-privacy-legislation/). I'm not sure he's addressed your concerns in full, but it is an attempt to cabin the most trolling forms of privacy litigation.
Very much appreciate the comments! Mike, I agree that any private enforcement right needs to be scoped so that it isn't overly abused. My solution is to go provision-by-provision through the privacy bill of your choice and hash out (1) who should have standing to sue; (2) which party should bear the cost of litigation; (3) what relief should be available; and (4) what the liability rules and burden of proof should be. Cam Kerry and John Morris at Brookings have done yeoman's work on private enforcement in a recent report: https://www.brookings.edu/research/bridging-the-gaps-a-path-forward-to-federal-privacy-legislation/

Jim, first, I'm entirely to blame for not giving Techdirt better options for a title. I'm not sure "enforcement" is the best term for what I want more discussion about; the better term might be user recourse. So much of the privacy debate has turned to corporate accountability, but I'd rather know what companies will do when I get harmed. As you note, that leads to a discussion of what the "real harm" is here, and we'll likely need to agree to disagree. I think real dignity interests have been violated by data-hungry companies, but I'd love to go back to Helen Nissenbaum's notion of contextual integrity. Before that concept was reduced to a sort of disclosure regime, actually having a public discussion about what sorts of data practices should (or should not) be in bounds would have been very useful.

I think the whole point of a good privacy law is that lawmakers have to make some value judgments about where the harms lie. The big problem has been that privacy bills focused on additional transparency requirements avoid that tough conversation. I would start with secondary uses of sensitive information -- most of the biggest privacy snafus are there. To the extent we're arguing over ad-tracking disclosures, we're missing the mark.
With all the respect in the world to Kate, my major critique of pieces like this is that they never describe the sort of privacy regulation that would actually be suitable. The piece ends with a call for Congress to pass "one set of strong, sensible, and straightforward privacy protections," but it has already crapped on both the GDPR and the CCPA as places to start. So what then?

Both of Mike's questions get at the fact that industry groups, even as they call for strong federal privacy laws, don't really want anything that would change the status quo by placing limits on secondary uses of information and constraining monetization avenues. Companies cannot abide that, which is why they cannot get behind a national proposal that would seriously move the ball forward. (Compare, for example, ITIF's proposal for a "grand privacy bargain" with Brookings's recent paper on bridging privacy divides. On the surface, both papers address the same issues and compromise looks possible, but they're a million miles apart on the stuff that would actually improve privacy.)

While it would be unfair to shower the start-up community with huge privacy compliance costs, data-driven start-ups are often the biggest privacy problem children. Clearview says what?
Re: Re: A Response to Mike
Fair point that John doesn't say behavioral advertising, but to my understanding, the entire online advertising ecosystem is premised on automating "targeting" and "reach." The opacity of the entire stack invites mischief, so for purposes of John's proposal, I don't think it matters what we call the functionality being provided by Google/Facebook and the major ad networks and exchanges.