The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Many Think Internet Privacy Is Lost, But That's Because You Can't Sue Anyone Who Violates It

from the trust,-privacy,-and-liability dept

Over 90% of Americans feel like they have no control over their online privacy. It is not hard to understand why so many of us feel so powerless when it comes to using the Internet, nor is the solution to such a pervasive feeling all that complicated.

It boils down to rules and liability—in other words, making sure that if a company violates your privacy under the law, there is an inescapable penalty. The clearer and more direct the path to holding a company accountable for violating your privacy—much as with your physical health, your property rights, your emotional wellbeing, or other legally protected interests—the more confidence will return to the Internet marketplace.

But for the vast majority of Internet-privacy-related activity, today's American consumer privacy legal system provides no such clear, enforceable rights. In fact, when the next Google or Facebook privacy scandal rolls around, think back to the last one—likely just a few months old—and ask how much the company paid in damages and whether it had to compensate individual people for the violation.

In many cases, the answer is that it paid nothing at all, which then feeds users' sense of powerlessness. But the fact that companies often pay no penalty, and the fact that we do not have laws in place to remedy these privacy harms, is a choice we have made. It is not the natural order of things, and it is not inevitable.

We have, as a society, carved out areas of our intellectual property laws where no liability attaches at all, in order to promote another, non-economic value: our freedom of expression. For example, the practice of criticizing a film on YouTube while playing portions of it in the background is considered fair use. This means that, despite copyright holders having exclusive rights over the public performance of their work, we have decided to extinguish liability when the use involves the expression of criticism.

In the absence of fair use, the critic using the film, as well as YouTube, would face substantial direct liability for playing portions of it. Instead, through fair use, we counterbalance and limit the economic rights of the filmmaker in order to promote free speech values. In essence, we keep a liability-free zone for criticism, and that is generally seen as a net positive for users. It also promotes the creation of open platforms, allowing those speakers to discover audiences and build engagement.

But in consumer privacy, we have not seen nearly the same benefit flow back to consumers in exchange for the mostly liability-free zone. There is no race to the top in guarding consumers' personal information, because the profit-maximizing effort isn't about strengthening our privacy; it is about tearing it down as much as possible for profit. This is why we keep getting these privacy scandals. There is no need to moralize about corporate behavior, as often happens; the simple question is how profit maximization (which corporations are bound to pursue) is being countered by law to reflect our expectations.

When we look at the problem of consumer privacy from this angle, it becomes fairly clear that a private right of action for consumer personal privacy would be transformative. No longer would a corporation view experiments with personal information as a generally risk-free profit-making proposition if financial damages and a loss of profit were on the line.

Industry Wants a Consumer Privacy Law—Just So Long As You Can't Sue Them

The long road of industry opposition—and the extreme hypocrisy of now pretending to endorse passage of a comprehensive consumer privacy law—is worth reflecting on in order to understand why we in fact have no law today.

If we go back a little over a decade to a privacy scandal that launched a series of congressional hearings, we find a little company called NebuAd that specialized in deep packet inspection.

NebuAd's premise was scary: it proposed to let your ISP record everything you do online and then monetize it with advertisers. I was a staffer on Capitol Hill when NebuAd came to Congress to explain its product, and I still remember the general shock at the idea being proposed. In fact, the idea was so offensive that it garnered bipartisan opposition from House leaders and ultimately led to NebuAd's demise.

The legislative hearings that followed the growing understanding of "deep packet inspection" led to discussions of a comprehensive privacy bill in 2009. But despite widespread concern with developing industry practices as the technology evolved, we never got anywhere, out of concern for the freemium model of Internet products. It is hard to remember now, but back then the Internet industry was still fairly new to the public and to Congress.

The iPhone had launched just two years earlier, and the public was still transitioning from flip phones to smartphones. Facebook had become available to the general public only three years prior. Google had only a small handful of vertical products, the newest being Google Voice—which let people text for free at a time when every text message carried a fee.

All of these things were seen as a net positive for users, yet all hinged on the monetization of personal information remaining relatively liability-free. So for years policymakers, including an all-out effort by the White House in 2012, searched for a means to balance privacy with innovation. The companies known today as "big tech" were still quite sympathetic entities, in that the innovations they kept producing were seen as both novel and useful to people. Their involvement was therefore actively solicited by the White House in an attempt to jointly draft rules that would promote privacy while allowing the industry to flourish.

Ultimately, it was a wasted effort, because what industry actually wanted was for the liability-free zone to be baked into law, with little regard for the increasing degradation of user privacy. Back then, most Internet companies still competed with one another, which forced them to try to attract users with stronger privacy settings. Even Google Search was facing a direct assault from Microsoft's fairly new Bing product.

As efforts to figure out a privacy regime for Internet applications and services were being stalled by the Internet companies, progress was being made with the substantially more mature, already regulated Internet Service Provider (ISP) industry.

Congress had already passed a set of privacy laws for communications companies under Section 222 of the Communications Act, so a great many ISPs, being former telephone companies, had a comprehensive set of privacy laws applicable to them (including private rights of action). But their transition into broadband companies began to muddy the waters, particularly as the Federal Communications Commission started to say in 2005 that broadband was magically different and therefore should be quasi-regulated.

Having learned nothing from the NebuAd fiasco and the near-banning of "deep packet inspection" for ISPs, the broadband industry kept rolling out privacy-invasive ideas. "Search hijacking"—where your search queries were monitored and rerouted—became a thing. AT&T began forcibly injecting ads into WiFi hotspots at airports, wireless ISPs preinstalled "Carrier IQ" on phones to track everything you did (a practice that ended when people sued them directly in a class action lawsuit), and Verizon invented the "super-cookie," prompting a privacy enforcement response from the FCC in 2014.
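To make concrete why users could not defend themselves against a "super-cookie," here is a minimal sketch of the general technique: an ISP-side middlebox injects a persistent, subscriber-linked header into every unencrypted HTTP request. The header name X-UIDH matches what Verizon reportedly used, but the token derivation and function names below are hypothetical illustrations, not Verizon's actual implementation.

```python
import hashlib

def inject_supercookie(headers: dict, subscriber_id: str) -> dict:
    """Simulate an ISP middlebox tagging an outgoing HTTP request.

    The token is derived from the subscriber's account, not from anything
    in the browser, so clearing cookies or switching browsers cannot
    remove it. (Token derivation here is a hypothetical stand-in.)
    """
    token = hashlib.sha256(subscriber_id.encode()).hexdigest()[:16]
    tagged = dict(headers)
    tagged["X-UIDH"] = token  # same value on every request from this subscriber
    return tagged

# Two requests to two different sites from the same subscriber:
req1 = inject_supercookie({"Host": "example.com"}, "subscriber-555-0100")
req2 = inject_supercookie({"Host": "other.net"}, "subscriber-555-0100")

# The identifier is identical across sites, letting any web server
# (and any ad network) correlate the user's browsing.
assert req1["X-UIDH"] == req2["X-UIDH"]
```

Because the injection happens in the network rather than on the user's device, no browser setting could opt out of it; only encrypted (HTTPS) traffic was immune.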

Even after the FCC stopped treating broadband as uniquely different from other communications access technologies in 2015, the industry continued to push the line. That same year, telecom carriers partnered with SAP's Consumer Insight 365 to "ingest" data from 20 to 25 million mobile subscribers close to 300 times every day (we do not know which mobile carriers participate in this practice, as that information is kept secret). That data is used to inform retailers about customers' browsing, geolocation, and demographics.

So unsurprisingly, the FCC came out with strong, clear ISP privacy rules in 2016 that continued the long tradition of privacy protections for our communication networks.

However, a heavily captured Congress, which had not taken a major pro-privacy vote on Internet policy in close to a decade, quickly moved to repeal the widely supported FCC privacy rules on behalf of AT&T and Comcast. Ironically, the FCC's ISP privacy rules existed only because Congress had created a series of privacy laws, including private rights of action, for various parts of the communications industry more than a decade earlier.

While many of the leaders of the ISP privacy repeal effort claim to be foes of big tech, they have done next to nothing to move a consumer privacy law. In fact, all they accomplished was to solidify the capture of Congress by giving AT&T and Comcast a reason to team up with Google and Facebook in opposing real privacy reform.

EFF witnessed this joint industry opposition firsthand as we attempted to rectify the damage Congress did to broadband privacy with a state law in California. In fact, between the ISPs and big tech, not a single state passed a new privacy law in 2017 in response to Congress repealing the ISP privacy rules.

Despite industry's arrogant belief that it could sustain perpetual capture at the legislative level, along came an individual named Alastair Mactaggart, who personally financed a ballot initiative on personal privacy that later became the California Consumer Privacy Act (CCPA).

While industry could "convince" a legislator of the righteousness of its cause with political contributions, it had no real means to convince an individual voter that the status quo was good. After Cambridge Analytica, and after wireless carriers sold geolocation data into a black market used by bounty hunters, virtually no one thinks this industry should be unregulated on privacy.

So rather than continue to publicly oppose real privacy protections, the industry has opted to pretend it supports a law, just so long as that law wipes out state laws (including state private rights of action) and puts all our eggs into the basket of a captured regulator. In other words, industry will support a federal privacy law only if it further erodes our personal privacy rather than enhancing it.

This opening offer from industry is a wild departure from other privacy statutes, which have all included an individual right to sue: those covering wiretaps, stored electronic communications, video rentals, driver's licenses, credit reporting, and cable subscriptions. Not ones to miss their marching orders, industry-friendly legislators were quick to put together a legislative hearing on consumer privacy that had literally no one representing consumers.

But this industry game, in which they hold and finance enough legislators to prevent any real law from passing, will only last so long. After all, the effort to ban states from passing privacy laws was effectively dead once the Speaker of the House, a Californian, made clear she would not undermine her own state's law on behalf of industry.

Furthermore, Senator Cantwell, a leader on the Senate Commerce Committee, has introduced comprehensive legislation that includes a private right of action, and more than a dozen Senators, led by Senator Schatz, have endorsed the EFF-supported concept of an information fiduciary. As more legislators publicly spell out the parameters of what they consider a good law, it becomes harder for industry to sustain its behind-the-scenes opposition. But we are still far from the end, which means more has to be done in the states until enough of Congress can break free of the industry shell game.

If We Do Not Restore Trust in Internet Products, People Will Make Less Use of the Internet and That Comes with Serious Consequences

As we wrestle with containing COVID-19, a solution being proposed by Apple and Google in the form of contact tracing is facing a serious hurdle. A majority of Americans do not want to use health data applications and services from these companies because they do not trust what they will do with their information. Since they can’t directly punish these companies for abusing their personal health data, they are exercising the only real choice they have left: not to use them at all.

Numerous studies from federal agencies such as the Department of Commerce, the Federal Trade Commission, and the FCC all point to the same end result if we do not put real privacy protections in place for Internet activity: people will simply refrain from using applications and services that involve sensitive matters such as healthcare or finances. In fact, lack of trust in how personal information is handled has a detrimental impact on broadband adoption in general, meaning a growing number of people will simply not use the Internet at all in order to keep their personal information to themselves.

Given the systemic powerlessness users feel about their personal information when they use the Internet, the dampening effect that has on full use of the Internet, and the resulting loss of broadband adoption, it is fairly conclusive that the near-liability-free zone is an overall net negative as public policy. Congress should be working to actively give users back their control, instead of letting the companies with the worst privacy track records dictate users' legal rights. Any new federal data privacy law must not preempt stronger state data privacy rules, and it must contain a private right of action.

Special tailoring will have to be done for startups and new entrants with limited finances, to ensure they can enter the market under the same conditions Google and Facebook enjoyed at their launch; no such accommodation is warranted for big tech.

Once clear rules and lines of liability are established for major corporate entities, efforts to launch the next privacy-invasive technology will be scrutinized by corporate counsel eager to shield the company from legal trouble. That, ultimately, is the point of having a private right of action in law: not to flood companies with lawsuits, but to make them operate in a manner that avoids lawsuits.

As users come to understand that they have an inalienable legal right to privacy when they use the Internet, they will begin to trust products with greater and more sensitive uses that benefit them. This will open new lines of commerce as a growing number of users willingly engage in deeply personal interactions with next-generation applications and services. For all the complaints industry has about consumer privacy laws, the one thing it never accounts for is the importance of trust. Without it, we start to lose the full potential of what the 21st-century Internet can bring.

Ernesto Falcon is Senior Legislative Counsel at the Electronic Frontier Foundation with a primary focus on intellectual property, open Internet issues, broadband access, and competition policy.

Filed Under: greenhouse, internet, liability, privacy, trust


Reader Comments



  • Anonymous Coward, 3 Jun 2020 @ 12:07pm

    Exactly what the entertainment industries paid the lawmakers to put in place! The violations they've committed must be beyond count!


  • Thad (profile), 3 Jun 2020 @ 1:36pm

    I also think that part of the problem is that it's nearly impossible to opt out of tracking (or, in the case of your cell phone provider, actually impossible, as there's no way for a cell phone to function without tracking your location).

    I run NoScript, and that's certainly preventing some tracking. But if you, for example, disable all scripts from Twitter or Google, you're going to lose a lot of functionality on the web.

    It's pretty difficult to use Android without being tracked by Google. (I'm a Sprint subscriber and I've found that there doesn't seem to be any way to get phone service working without installing Google Services.) You can go with Apple, but that comes with its share of compromises too.


    • Anonymous Coward, 4 Jun 2020 @ 6:49am

      Re:

      I also think that part of the problem is that it's nearly impossible to opt out of tracking

      And that's the real issue: That you should need to opt out instead of opt in.

      If you paid attention in Computer Science class, you'd know that the first rule of the Internet is never upload something you don't want to share with the world forever. However, the current rules of the digital world are: collect it now, pay the consequences never. Given that there is no opt-in requirement, people are forced to upload things they may not want to upload, and companies are free to profit off that power imbalance. This is inconsistent. The public shouldn't have to beg corporations, and each other, to quit violating their privacy.


  • tz1 (profile), 3 Jun 2020 @ 4:51pm

    Sue-age isn't a solution

    There are several related problems. First, the tracking-type violations are often silent. If they had to light a big red flashing indicator they might not do it, but the wiretapping is hidden. Second, there are technical quality and maintenance reasons; that goes back to the old POTS days, where technicians might test lines. That was silent too. Third, sue all you want, but if you've been doxxed that bell can't be unrung, and it gets messy trying to calculate damages. Fourth, you often want a public, if small, platform like Facebook, Twitter, or even this comment section. Would you bother posting if comments were hidden? There will never be a workable semi-private square, only private communications like a phone call and the public square like the big tech posting platforms. Fifth, people like a free lunch, which is not free and has to be paid for by someone. There is no option on Facebook to be on the platform, pay some amount, and NOT be tracked, analyzed, monitored, and worse. Nor on Google. It would be nice to have even a "pay for privacy" option, but then it would be obvious how badly they violate it.


  • Anonymous Coward, 3 Jun 2020 @ 4:51pm

    State Actors get a pass

    The companies that are collecting our information pass it along to some 3 letter agencies and suddenly they are essential and can't be sued. After a few decades of this kind of power, they start increasing fees, charging per gigabyte used and creating rules that do nothing but increase their income.

    If we want this to change, we have to start over with a new government without all of the corruption. Our rights mean nothing if we can't actually enjoy them.


  • tz1 (profile), 3 Jun 2020 @ 4:58pm

    Technically, you COULD sue...

    See DoorDash. Facebook has done worse. Arbitration costs $1,500 to file, but JAMS and California (where the ToS typically say how and where) cap the consumer's cost at $250 total, and the company has to pay to fly out to your town to conduct it, plus attorney's fees. If Facebook, Google, or Twitter simply paid everyone $250 over a privacy violation, it would cost them a few billion. Or shred the ToS as unconscionable, and there would be interesting things there.


  • sumgai (profile), 4 Jun 2020 @ 11:30am

    It's pretty difficult to use [any smartphone] without all this hoopla....

    Seems to me that the proper answer is to use the phone only for what A.G. Bell designed it to do - make voice phone calls. Portability doesn't change that little factoid. Or didn't you ever stop to wonder why they still make "dumb phones"?


    • Thad (profile), 8 Jun 2020 @ 9:15am

      Re:

      Seems to me that the proper answer is to use the phone only for what A.G. Bell designed it to do - make voice phone calls. Portability doesn't change that little factoid.

      Yes, it does. Mobile phones cannot function without tracking your physical location.

      There is a difference between using a smartphone and a feature phone, but it's a difference of degree, not kind. If you're using a feature phone, then that means your location is only being tracked by your phone company, instead of your phone company plus a bunch of software vendors.


  • Zane (profile), 7 Jun 2020 @ 11:50pm

    Europe?

    Look to the EU; they are miles ahead in looking at these issues. As much flak as GDPR gets, it actually makes a lot of sense.


