Cindy Cohn, EFF's Techdirt Profile

Posted on Techdirt - 8 September 2021 @ 03:41pm

New Texas Abortion Law Likely To Unleash A Torrent Of Lawsuits Against Online Education, Advocacy And Other Speech

In addition to the drastic restrictions it places on a woman’s reproductive and medical care rights, the new Texas abortion law, SB8, will have devastating effects on online speech.

The law creates a cadre of bounty hunters who can use the courts to punish and silence anyone whose online advocacy, education, and other speech about abortion draws their ire. It will undoubtedly lead to a torrent of private lawsuits against online speakers who publish information about abortion rights and access in Texas, with little regard for the merits of those lawsuits or the First Amendment protections accorded to the speech. Individuals and organizations providing basic educational resources, sharing information, identifying locations of clinics, arranging rides and escorts, fundraising to support reproductive rights, or simply encouraging women to consider all their options now have to consider the risk that they might be sued for merely speaking. The result will be a chilling effect on speech and a litigation cudgel that will be used to silence those who seek to give women truthful information about their reproductive options. 

SB8, also known as the Texas Heartbeat Act, encourages private persons to file lawsuits against anyone who “knowingly engages in conduct that aids or abets the performance or inducement of an abortion.” It doesn’t matter whether that person “knew or should have known that the abortion would be performed or induced in violation of the law,” that is, the law’s new and broadly expansive definition of illegal abortion. And you can be liable even if you simply intend to help, regardless, apparently, of whether an illegal abortion actually resulted from your assistance.

And although you may defend a lawsuit if you believed the doctor performing the abortion complied with the law, it is really hard to do so. You must prove that you conducted a “reasonable investigation,” and as a result “reasonably believed” that the doctor was following the law. That’s a lot to do before you simply post something to the internet, and of course you will probably have to hire a lawyer to help you do it.

SB8 is a “bounty law”: it doesn’t just allow these lawsuits, it provides a significant financial incentive to file them. It guarantees that a person who files and wins such a lawsuit will receive at least $10,000 for each abortion that the speech “aided or abetted,” plus their costs and attorney’s fees. At the same time, SB8 may often shield these bounty hunters from having to pay the defendant’s legal costs should they lose. This removes a key financial disincentive they might have had against bringing meritless lawsuits. 

Moreover, lawsuits may be filed up to six years after the purported “aiding and abetting” occurred. And the law allows for retroactive liability: you can be liable even if your “aiding and abetting” conduct was legal when you did it, if a later court decision changes the rules. Together this creates a ticking time bomb for anyone who dares to say anything that educates the public about, or even discusses, abortion online.

Given this legal structure, and the law’s vast application, there is no doubt that we will quickly see the emergence of anti-choice trolls: lawyers and plaintiffs dedicated to using the courts to extort money from a wide variety of speakers supporting reproductive rights.

And unfortunately, it’s not clear when speech encouraging someone to commit a crime, or instructing them how to do so, rises to the level of “aiding and abetting” unprotected by the First Amendment. Under the leading case on the issue, it is a fact-intensive analysis, which means that defending the case on First Amendment grounds may be arduous and expensive.

The result of all of this is the classic chilling effect: many would-be speakers will choose not to speak at all for fear of having to defend even the meritless lawsuits that SB8 encourages. And many speakers will choose to take down their speech if merely threatened with a lawsuit, rather than risk the law’s penalties if they lose or take on the burdens of a fact-intensive case even if they were likely to win it. 

The law does include an empty clause providing that it may not be “construed to impose liability on any speech or conduct protected by the First Amendment of the United States Constitution, as made applicable to the states through the United States Supreme Court’s interpretation of the Fourteenth Amendment of the United States Constitution.” While that sounds nice, it offers no real protection—you can already raise the First Amendment in any case, and you don’t need the Texas legislature to give you permission. Rather, that clause is included to try to insulate the law from a facial First Amendment challenge—a challenge to the mere existence of the law rather than its use against a specific person. In other words, the drafters are hoping to ensure that, even if the law is unconstitutional—which it is—each individual plaintiff will have to raise the First Amendment issues on their own, and bear the exorbitant costs—both financial and otherwise—of having to defend the lawsuit in the first place.

One existing free speech bulwark—47 U.S.C. § 230 (“Section 230”)—will provide some protection here, at least for the online intermediaries upon which many speakers depend. Section 230 immunizes online intermediaries from state law liability arising from the speech of their users, so it provides a way for online platforms and other services to get early dismissals of lawsuits against them based on their hosting of user speech. So although a user will still have to fully defend a lawsuit arising, for example, from posting clinic hours online, the platform they used to share that information will not. That is important, because without that protection, many platforms would preemptively take down abortion-related speech for fear of having to defend these lawsuits themselves. As a result, even a strong-willed abortion advocate willing to risk the burdens of litigation in order to defend their right to speak will find their speech limited if weak-kneed platforms refuse to publish it. This is exactly the way Section 230 is designed to work: to reduce the likelihood that platforms will censor in order to protect themselves from legal liability, and to enable speakers to make their own decisions about what to say and what risks to bear with their speech. 

But a powerful and dangerous chilling effect remains for users. Texas’s anti-abortion law is an attack on many fundamental rights, including the First Amendment rights to advocate for abortion rights, to provide basic educational information, and to counsel those considering reproductive decisions. We will keep a close eye on the lawsuits the law spurs and the chilling effects that accompany them. If you experience such censorship, please contact info@eff.org.

Originally published to the EFF Deeplinks blog.

Posted on Techdirt - 2 March 2016 @ 09:28am

EFF Director Cindy Cohn On Why You Should Support Techdirt's Encryption Crowdfunding Campaign

As part of our ongoing crowdfunding campaign to support our coverage of the encryption fight, EFF Executive Director Cindy Cohn kindly offered to write a plea on our behalf explaining why you should fund our efforts.

Help Techdirt Cut Through The Confusion In The Crypto Fight

Techdirt is a key part of the digital revolution. They cover our work at EFF, of course, but more importantly, their original reporting and in-depth research help us nail down facts across our issues, and their analysis is strong. The clear way in which Techdirt’s reporters lay out both technical and non-technical aspects of their stories provides an example for everyone looking to speak to many audiences at once.

I was so happy to hear that Techdirt is planning to gear up its coverage of the current fight to protect strong security and privacy in our digital world. We sometimes call this Crypto Wars Part Deux here at EFF. The FBI’s efforts to undermine and compromise the tools we all increasingly rely upon and trust with our most intimate communications, information and plans create risks for us all. EFF has been down this road before. We were leaders in the first Crypto Wars in the 1990s; I personally handled the Bernstein v. DOJ case that first freed encryption from government export controls and established that the First Amendment applied to the act of writing code. We thought we won that battle, but the FBI is back again (and the NSA apparently continued all along) seeking in various ways to undermine our security. We’ve stepped up again to protect our privacy and push for the common sense notion that we need to be able to trust the digital tools that hold, carry and store an increasing amount of our most sensitive information.

I was excited to have Techdirt focus on this because of Techdirt’s long track record of trustworthy reporting. Techdirt’s coverage of copyright issues, for instance, has resonated with musicians and technologists alike. During the height of the battle over music file sharing, Techdirt picked apart music industry “solutions” to show they weren’t as attractive as they may have sounded. Nor is Techdirt afraid to take a stand. On top of their informative reporting on the Cyber Intelligence Sharing and Protection Act of 2011 (CISPA), Techdirt actively participated in a coalition, along with EFF, that organized a massive Twitter protest campaign against the privacy-invasive bill. Already, as the debate over iPhone security has brought encryption to broad public attention, Techdirt has done amazing work separating what people are saying from what their words actually mean, as well as providing the underlying legal documents so people can read them for themselves.

This time around the issues surrounding encryption are much bigger than they were 20 years ago and reach far beyond the technical community. More than ever we need media and analysis that won’t be confused or misled, that will follow stories past the headlines and scare tactics and that will help the much wider range of people affected by this debate understand what’s at stake. Luckily, Techdirt is up for the task and all they need is a little help from their audience to get there. I hope you will help.

Help Techdirt Cut Through The Confusion In The Crypto Fight

Posted on Techdirt - 26 September 2014 @ 07:39pm

Nine Epic Failures Of Regulating Cryptography

They can promise strong encryption. They just need to figure out how they can provide us plain text. – FBI General Counsel Valerie Caproni, September 27, 2010

[W]e’re in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge’s authority where we can get there if somebody is planning a crime. – FBI Director Louis Freeh, May 11, 1995

Here we go again. Apple has implemented (and Google has long announced it will implement) basic encryption on mobile devices. And predictably, law enforcement has responded with howls of alarm.

We’ve seen this movie before. Below is a slightly adapted version of a blog post we published in 2010, the last time the FBI was seriously hinting that it would try to mandate that all communications systems be easily wiretappable by requiring “back doors” into any encryption systems. We marshaled eight “epic failures” of regulating crypto at that time, all of which are still salient today. And in honor of the current debate, we’ve added a ninth:

. . .

If the government howls of protest at the idea that people will be using encryption sound familiar, it’s because regulating and controlling consumer use of encryption was a monstrous proposal officially declared dead in 2001 after threatening Americans’ privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it’s now rising from the grave, bringing the same disastrous flaws with it.

For those who weren’t following digital civil liberties issues in 1995, or for those who have forgotten, here’s a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago:

  1. It will create security risks. Don’t take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it’s hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol, and as Bellovin puts it: “Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access.” It doesn’t end there. Bellovin notes:

    Complexity in the protocols isn’t the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called ‘lawful intercept’ mechanisms in the switch (that is, the features designed to permit the police to wiretap calls easily) were abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister’s. This attack would not have been possible if the vendor hadn’t written the lawful intercept code.

    More recently, as security researcher Susan Landau explains, “an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements (a system already in use by major carriers) had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications.”

    The same is true for Google, which had its “compliance” technologies hacked by China.

    This isn’t just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products, the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether?

  2. It won’t stop the bad guys. Users who want strong encryption will be able to get it from Germany, Finland, Israel, and many other places in the world where it’s offered for sale and for free. In 1996, the National Research Council did a study called “Cryptography’s Role in Securing the Information Society,” nicknamed CRISIS. Here’s what they said:

    Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner; such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. – CRISIS Report at 303

    None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996. So unless the government wants to mandate that you are forbidden to run anything that is not U.S. government approved on your devices, they won’t stop bad guys from getting access to strong encryption. (For a concrete sketch of the report’s “pre-encrypt” point, see the example after this list.)

  3. It will harm innovation. In order to ensure that no “untappable” technology exists, we’ll likely see a technology mandate and a draconian regulatory framework. The implications of this for America’s leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he’d had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications? This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software.
  4. It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we’re just handing business over to foreign companies who don’t have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it’s not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They’d have to be tappable, too.
  5. It will cost consumers. Any additional mandates on service providers will require them to spend millions of dollars making their technologies compliant with the new rules. And there’s no real question about who will foot the bill: the providers will pass those costs onto their customers. (And of course, if the government were to pay for it, they would be using taxpayer dollars.)
  6. It will be unconstitutional. Of course, we wouldn’t be EFF if we didn’t point out the myriad constitutional problems. The details of how a cryptography regulation or mandate will be unconstitutional may vary, but there are serious problems with nearly every iteration of a “no encryption allowed” proposal that we’ve seen so far. Some likely problems:
    • The First Amendment would likely be violated by a ban on all fully encrypted speech.
    • The First Amendment would likely not allow a ban of any software that can allow untappable secrecy. Software is speech, after all, and this is one of the key ways we defeated this bad idea last time.
    • The Fourth Amendment would not allow the government to require a key to a back door into our houses so it could read our “papers” without first showing probable cause, and our digital communications shouldn’t be treated any differently.
    • The Fifth Amendment would be implicated by the required disclosure of private papers and the forced utterance of incriminating testimony.
    • Right to privacy. Both the right to be left alone and informational privacy rights would be implicated.
  7. It will be a huge outlay of tax dollars. As noted below, wiretapping is still a relatively rare tool of government (at least for the FBI in domestic investigations — the NSA is another matter as we now all know). Yet the extra tax dollars needed to create a huge regulatory infrastructure staffed with government bureaucrats who can enforce the mandates will be very high. So, the taxpayers would end up paying for more expensive technology, higher taxes, and lost privacy, all for the relatively rare chance that motivated criminals will act “in the clear” by not using encryption readily available from a German or Israeli company or for free online.
  8. The government hasn’t shown that encryption is a problem. How many investigations have been thwarted or significantly harmed by encryption that could not be broken? In 2009, the government reported only one instance of encryption that they needed to break out of 2,376 court-approved wiretaps, and it ultimately didn’t prevent investigators from obtaining the communications they were after. This truth was made manifest in a recent Washington Post article written by an ex-FBI agent. While he came up with a scary kidnapping story to start his screed, device encryption simply had nothing to do with the investigation. The case involved an ordinary wiretap. In 2010, the New York Times reported that the government officials pushing for this have only come up with a few examples (and it’s not clear that all of the examples actually involve encryption) and no real facts that would allow independent investigation or confirmation. More examples will undoubtedly surface in the FBI’s PR campaign, but we’ll be watching closely to see if underneath all the scary hype there’s actually a real problem demanding this expensive, intrusive solution.
  9. Mobile devices are just catching up with laptops and other devices. Disk encryption just isn’t that new. Laptops and desktop computers have long had disk encryption features that the manufacturers have absolutely no way to unlock. Even for simple screen locks with a user password, the device maker or software developer doesn’t automatically know your password or have a way to bypass it or unlock the screen remotely. Although many law enforcement folks have never really liked disk encryption on laptops, and we understand that some lobbied against it in private, we haven’t typically heard them suggest in public that it was somehow improper for these vendors not to have a backdoor to their security measures. That makes us think that the difference here is really just that some law enforcement folks think that phones are just too popular and too useful to have strong security. But strong security is something we all should have. The idea that basic data security is just a niche product and that ordinary people don’t deserve it is, frankly, insulting. Ordinary people deserve security just as much as elite hackers, sophisticated criminals, cops and government agents, all of whom have ready access to locks for their data.
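
To make the CRISIS Report’s “pre-encrypt” point (item 2 above) concrete, here is a minimal sketch in Python. It is an illustration only, not anything from the report or from EFF: it assumes the third-party cryptography package and uses its Fernet recipe to stand in for both a user-held key and a hypothetical escrowed key. The point is simply that data encrypted first under a key only the user holds stays unreadable to anyone who merely holds the escrow key.

    # Minimal sketch: pre-encryption defeats key escrow.
    # Assumes the third-party "cryptography" package (pip install cryptography).
    from cryptography.fernet import Fernet

    user_key = Fernet.generate_key()    # known only to the user, never escrowed
    escrow_key = Fernet.generate_key()  # held by the hypothetical escrowed system

    user_layer = Fernet(user_key)
    escrow_layer = Fernet(escrow_key)

    message = b"private communication"

    # The user pre-encrypts, then hands the result to the escrowed system.
    pre_encrypted = user_layer.encrypt(message)
    stored = escrow_layer.encrypt(pre_encrypted)

    # Someone holding only the escrow key recovers just the inner ciphertext.
    recovered = escrow_layer.decrypt(stored)
    assert recovered == pre_encrypted and recovered != message

    # Only the user, who holds the inner key, can recover the plaintext.
    assert user_layer.decrypt(recovered) == message

Nothing here depends on the particular cipher; any layer of encryption the user applies first has the same effect, which is why an escrow mandate cannot reach determined users.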

The real issue with encryption may simply be that the FBI has to use more resources when they encounter it than when they don’t. Indeed, Bellovin argues: “Time has also shown that the government has almost always managed to go around encryption.” (One circumvention that’s worked before: keyloggers.) But if the FBI’s burden is the real issue here, then the words of the CRISIS Report are even truer today than they were in 1996:

It is true that the spread of encryption technologies will add to the burden of those in government who are charged with carrying out certain law enforcement and intelligence activities. But the many benefits to society of widespread commercial and private use of cryptography outweigh the disadvantages.

The mere fact that law enforcement’s job may become a bit more difficult is not a sufficient reason for undermining the privacy and security of hundreds of millions of innocent people around the world who will be helped by mobile disk encryption. Or as Chief Justice John Roberts recently observed in another case rejecting law enforcement’s broad demands for access to the information available on our mobile phones: “Privacy comes at a cost.”

Reposted from the Electronic Frontier Foundation’s Deeplinks Blog
