Lindsey Graham's Sneak Attack On Section 230 And Encryption: A Backdoor To A Backdoor?

from the if-it-aint-broke dept

Both Republicans and Democrats have been talking about amending Section 230, the law that made today’s Internet possible. Most politicians are foggy on the details, complaining generally about “Big Tech” being biased against them (Republicans), “not doing enough” about harmful content (Democrats, usually), or just being too powerful (populists on both sides). Some have promised legislation to amend Section 230, while others hope to revoke it entirely. And more bills will doubtless follow.

Rather than get mired in the specifics about how tinkering with Section 230 could backfire, Sen. Lindsey Graham is circulating a draft bill called the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019” — the “EARN IT Act of 2019,” leaked by Bloomberg yesterday. Democratic Sen. Richard Blumenthal has apparently been involved in drafting.

At first blush, the bill may seem uncontroversial: it would create a presidential commission of experts to “develop recommended best practices for providers of interactive computer services regarding the prevention of online child exploitation conduct.” Who could argue with that? Indeed, given how little lawmakers understand online content moderation, getting analysis and recommendations from real experts about Section 230 is probably the only way out of the increasingly intractable, empty debate over the law.

But what Graham’s bill would actually do is give the Attorney General a blank check to bypass Congress in cracking down on Internet services in ways that may have little to do with child sexual abuse material (CSAM). Specifically, the bill would:

  1. Amend Criminal Law & Section 230: Section 230 has never shielded operators of websites and Internet services from federal criminal prosecution for CSAM. But the Graham bill would create broad new legal risks by lowering the (actual) knowledge requirement from “knowingly” to “recklessly” (which would include an after-the-fact assessment of what the company “should have known”) and amending Section 230 to authorize both criminal prosecution and civil suits under state law. For the first time, operators could be sued by plaintiffs’ lawyers in class-action suits for “reckless” decisions in designing or operating their sites/services.

  2. Condition Section 230 Immunity: The commission’s (a) recommended “best practices” would quickly become (b) conditions for invoking Section 230 immunity against greatly expanded liability for CSAM — immunity so vital to the operation of many online services that (c) the conditions would be tantamount to legal mandates.

As drafted, Graham’s bill entails a shocking abandonment of the most basic principles of how administrative agencies make rules — based on the fiction that the “best practices” wouldn’t be effectively mandatory — by allowing the AG to bypass Congress on other controversial issues like mandatory age verification or even encryption. As I told Bloomberg: “The absolute worst-case scenario could easily become reality: DOJ could effectively ban end-to-end encryption.” Signal, Telegram and WhatsApp all could no longer exist in their current form. All would be required to build in backdoors for law enforcement because all could be accused of “recklessly” designing their products to make it impossible for the operators or law enforcement to stop CSAM sharing. The same could happen for age verification mechanisms. It’s the worst kind of indirect regulation. And because of the crazy way it’s done, it could be hard to challenge in court.

The rhetorical premise of the “EARN IT” Act — that Section 230 was a special favor that tech companies must continually “earn” — is false. Republicans have repeatedly made this claim in arguing that only “neutral” platforms “deserve” Section 230’s protections, and Democrats likewise argue that website operators should lose Section 230’s protections if they don’t “do more” to combat disinformation or other forms of problematic speech by users.

Congress has never conditioned Section 230 in the way Graham’s bill would do. Section 230, far from being a special favor or subsidy to tech companies, was crafted because, without its protections, website operators would have been discouraged from taking active measures to moderate user content — or from hosting user-generated content altogether, often referred to as the “moderator’s dilemma.”

Here’s how Graham’s monstrous, Rube-Goldberg-esque legal contraption would work in practice. To understand which services will be affected and why they’d feel compelled to do whatever DOJ commands to retain their Section 230 immunity, we’ll unpack the changes to criminal law first.

Step #1: Expanding Legal Liability

Graham’s bill would amend existing law in a variety of ways, mostly paralleling SESTA-FOSTA: while the 2018 law expanded the federal prostitution law (18 U.S.C. § 1591, 2421A), the Graham bill focuses on “child exploitation” imagery (child porn). (Note: To help prosecutors prosecute sex trafficking, without the need for any amendment to Section 230, TechFreedom supported toughening 18 U.S.C. § 1591, 2421A to cover trafficking of minors when FOSTA was a stand-alone bill — but opposed marrying FOSTA with SESTA, the Senate bill, which unwisely amended Section 230.) Specifically, the Graham bill would:

  1. Create a new civil remedy under 18 U.S.C. § 2255 that extends to suits brought against an “interactive computer service” for reckless § 2252 violations;

  2. Amend Section 230(e) to exclude immunity for state criminal prosecution for crimes coextensive with § 2252; and

  3. Amend Section 230(e) to exclude immunity for civil causes of action against an “interactive computer service” pursuant to other state laws if the underlying claim constitutes a violation of § 2252 (or by operation of § 2255(a)(1)). Most notably, this would open the door to states to authorize class-action lawsuits brought by entrepreneurial trial lawyers — which may even be a greater threat than criminal prosecution since the burden of proof would be lower (even though, in principle, a civil plaintiff would have to establish that a violation of criminal law had occurred under Section 2252).

The Graham bill goes further than SESTA-FOSTA in two key respects:

  1. It would lower the mens rea (knowledge) requirement from “knowingly” to “recklessly,” making it considerably easier to prosecute or sue operators; and

  2. It would allow for state criminal prosecution and civil suits over hosting child exploitation imagery that could violate § 2252.

In a ploy to make their bill seem less draconian, SESTA-FOSTA’s sponsors loudly proclaimed that they preserved “core” parts of Section 230’s immunity. Graham will no doubt do the same thing. Both bills leave untouched Section 230(c)(2)(A)’s immunity for “good faith” content removal decisions. But this protection is essentially useless against prosecutions for either sex trafficking or CSAM. In either case, the relevant immunity would be Section 230(c)(1), which ensures that ICS operators are not held responsible as “publishers” for user content. The overwhelming majority of cases turn on that provision — and that is the provision that Graham’s bill conditions on compliance with the AG’s “best practices.”

Step #2: How a “Recommendation” Becomes a Condition to 230

The bill seems to provide an important procedural safeguard by requiring consensus — at least 10 of the 15 commissioners — for each recommended “best practice.” But the chairman (the FTC chairman or his proxy) could issue his own “alternative best practices” with no minimum level of support. The criteria for membership ensure that he’d be able to command at least a majority of the commission, with the FTC, DOJ and Department of Homeland Security each getting one seat, law enforcement getting two, prosecutors getting two more — that’s seven just for government actors — plus two more for those with “experience in providing victims services for victims of child exploitation” — which makes nine reliable votes for “getting tough.” The remaining six Commissioners would include two technical experts (who could turn out to be just as hawkish) plus two commissioners with “experience in child safety” at a big company and two more from small companies. So the “alternative” recommendations would almost certainly command a majority anyway.

More importantly, it doesn’t really matter what the Commissioners recommend: the Attorney General (AG) could issue a radically different set of “best practices” — without public comment. He need only explain why he modified the Commission’s recommendations.

What the AG ultimately issues would not just be recommendations. No, Graham’s bill would empower the AG to enact requirements for enjoying Section 230’s protections against a range of new civil lawsuits and from criminal prosecutions related to “child exploitation” or “child abuse” — two terms that the bill never defines.

Step #3: How Conditioning 230 Eligibility Amounts to a Mandate

Most websites and services, especially the smallest ones, but even the largest ones, simply couldn’t exist if their operators could be held civilly liable for what their users do and say — or if they could be prosecuted under an endless array of state laws. But it’s important to stress at the outset that Section 230 immunity isn’t anywhere near as “absolute” or “sweeping” as its critics claim. Despite the panic over online sex trafficking that finally led Congress, in 2018, to pass SESTA-FOSTA, Section 230 never hindered federal criminal prosecutions. In fact, the CEO of Backpage.com — the company at the center of the controversy over Section 230 — pled guilty to facilitating prostitution (and money laundering) the day after SESTA-FOSTA became law in April 2018. Prosecutors didn’t need a new law, as we stressed at the time.

Just as SESTA-FOSTA created considerable new legal liability for websites for sex trafficking, Graham’s bill does so for CSAM (discussed below) — which makes Section 230 an even more critical legal shield and, in turn, makes companies more willing to follow whatever requirements might be attached to that legal shield.

How Broad Could the Bill’s Effects Be?

Understanding the bill’s real-world effects depends on three separate questions:

  1. What counts as “child exploitation” and “child abuse?”

  2. Which companies would really need Section 230 protection against new, expanded liability for CSAM?

  3. What could be the scope of the AG’s conditions on Section 230 immunity? Must they be related to conduct covered by Section 230?

What Do We Mean by “Child Exploitation” and “Child Abuse?”

The bill’s title focuses on “child exploitation” but the bill also repeatedly talks about “child abuse” — without defining either term. The former comes from the title of 18 U.S.C. § 2252, which covers any “visual depiction [that] involves the use of a minor engaging in sexually explicit conduct” (CSAM). The bill directly invokes that bedrock law, so one might assume that’s what Graham had in mind. There is a federal child abuse law but it’s never mentioned in the bill.

This lack of clarity becomes a significant problem because, as discussed below, the bill is so broadly drafted that the AG could mandate just about anything as a condition of Section 230 immunity.

Which Websites & Services Are We Talking About?

Today, every website and Internet service operator faces some legal risk for CSAM. At greatest risk are those services that allow users to communicate with each other in private messaging or groups, or to share images or videos, because this is how CSAM is most likely to be exchanged. Those who traffic in CSAM are known to be highly creative in finding unexpected places to interact online — just as terrorist groups may use chat rooms in video games to hold staff meetings.

It’s hard to anticipate all the services that might be affected by the Graham bill, but it’s safe to bet that any messaging, photo-sharing, video-hosting or file-sharing tool would consider the bill a real threat. At greatest risk would be services that cannot see what their users do because they offer end-to-end encryption. They risk being accused of making a “reckless” design decision if it turns out that their users share CSAM with each other.
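The technical claim here — that an E2EE operator literally cannot see what its users share — can be made concrete with a toy sketch. This is an illustration only, using a one-time-pad XOR rather than a real messaging protocol (actual services use designs like the Signal protocol): the operator’s relay stores only ciphertext, and the key never leaves the two endpoints.

```python
import secrets

def encrypt_on_device(plaintext: bytes, shared_key: bytes) -> bytes:
    # XOR one-time pad: runs on the sender's phone, never on the server.
    assert len(shared_key) >= len(plaintext), "pad must cover the message"
    return bytes(p ^ k for p, k in zip(plaintext, shared_key))

def decrypt_on_device(ciphertext: bytes, shared_key: bytes) -> bytes:
    # XOR is its own inverse, so the recipient's phone recovers the message.
    return bytes(c ^ k for c, k in zip(ciphertext, shared_key))

# The two endpoints agree on a key; the operator's relay never sees it.
shared_key = secrets.token_bytes(64)
message = b"meet at noon"

relay_stores = encrypt_on_device(message, shared_key)  # all the operator holds
recovered = decrypt_on_device(relay_stores, shared_key)

# The operator cannot scan relay_stores for prohibited content;
# that design property is what the bill could brand "reckless."
```

Any mandate to detect CSAM in such a system would require changing the design itself — escrowing keys with the operator or scanning on the user’s device — which is what it means to say these services could not exist in their current form.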

What De Facto Requirements Are We Talking About?

Again, Graham’s bill claims a narrow scope: “The purpose of the Commission is to develop recommended best practices for providers of interactive computer services regarding the prevention of online child exploitation conduct.”

The former term (ICS) is the term Section 230 uses to refer to covered operators: a service, system or software that “provides or enables computer access by multiple users to a computer server.” You might think the Graham bill’s use of this term means the bill couldn’t be used to force Apple to change how it puts E2EE on iPhones — because the iPhone, unlike iMessage, is not an ICS. You might also think that the bill couldn’t be used to regulate things that seem unrelated to CSAM — like requiring “fairness” or “neutrality” in content moderation practices, as Sen. Hawley has proposed and Graham has mentioned repeatedly.

But the bill won’t actually stop the AG from using this bill to do either. The reason is the same in both cases: this is not how legislation normally works. In a normal bill, Congress might authorize the Federal Communications Commission to do something — say, require accessibility features for disabled users of communications services. The FCC could then issue regulations that would have to be reasonably related to that purpose and within its jurisdiction over “communications.” As we know from the 2005 American Library decision, the FCC can’t regulate after the process of “communications” has ended — and thus had no authority to require television manufacturers to build “broadcast flag” technology into their devices to ensure that, once a device received a broadcast signal, it could not make copies of the content unless authorized by the copyright holder.

But that’s not how Graham’s bill would work. A company that only makes devices, or installs firmware or an operating system on them, may not feel compelled to follow the AG’s “best practices” because it does not operate an ICS and, as such, could not claim Section 230 immunity (and is highly unlikely to be sued for what its users do anyway). But Apple and Google, in addition to doing these things, also operate multiple ICSes. Nothing in the Graham bill would stop the AG from saying that Apple would lose its Section 230 immunity for iMessage, iCloud or any ICS if it does not build in a backdoor on iPhones for law enforcement. Apple would likely comply. And even if Apple resists, smaller companies with fewer legal resources would likely cave under pressure.

In fact, the Graham bill specifically includes, among ten “matters addressed,” the “retention of evidence and attribution or user identification data relating to child exploitation or child sexual abuse, including such retention by subcontractors” — plus two other prongs relating to identifying such material. While these may appear to be limited to CSAM, the government has long argued that E2EE makes it impossible for operators either to identify or retain CSAM — and thus that law enforcement must have a backdoor and/or that operators must be able to see everything their users do (the opposite of E2EE).

Most of the “matters addressed” pertain to child exploitation (at least in theory) but one other stands out: “employing age limits and age verification systems.” Congress tried to mandate minimum age limits and age verification systems for adult materials back in the Child Online Protection Act (COPA) of 1998. Fortunately, that law was blocked in court in a protracted legal battle because adults have a right to access sensitive content without being subjected to age verification — which generally requires submitting a credit card, and thus necessarily entails identifying oneself. (The court also recognized publishers’ rights to reach privacy-sensitive users.)

Rep. Bart Stupak’s (D-MI) “Online Age Verification and Child Safety Act” of 2009 attempted to revive age verification mandates, but died amidst a howl of protest from civil libertarians. But, like banning E2EE, this is precisely the kind of thing the AG might try to mandate under Graham’s bill. And, critically, the government would argue that the bill does not present the same constitutional questions because it is not a mandate, but rather merely a condition of special immunity bestowed upon operators as a kind of subsidy. Courts should protect us from “unconstitutional conditions,” but given the state of the law and the difficulty of getting the right parties to sue, don’t count on it.

These “matters addressed” need not be the only things the Commission recommends. The bill merely says that “[t]he matters addressed by the recommended best practices developed and submitted by the Commission … shall include [the ten things outlined in the bill].” The Commission could “recommend” more — and the AG could create whatever conditions on Section 230 immunity he felt he could get away with, politically. His sense of shame, even more than the courts or Congress, would determine how far the law could stretch.

It wouldn’t be hard to imagine this AG (or AGs of, sadly, either party) using the bill to reshape moderation practices more generally. Republicans increasingly argue that social media are “public fora” to which people like Alex Jones or pseudo-journalistic outlets like Gateway Pundit have First Amendment rights of access. Under the same crazy pseudo-logic, the AG might argue that, the more involved the government becomes in content moderation through whatever conditions he imposes on Section 230 immunity, the more essential it is that website operators “respect the free speech rights” of users. Ultimately, the Commission would operate as a censorship board, with murky but enormous powers — and the AG would be the ultimate censor.

If this sounds like a crazy way to make law, it is! It’s free-form lawmaking — not “we tell you what you must do” (and you can raise constitutional objections in court) but rather “we’re not gonna tell you what to do, but if you don’t want to be sued or prosecuted under vague new child exploitation laws, you’d better do what we tell you.” Once the Commission or the AG strays from “best practice” recommendations that are strictly related to CSAM, the floodgates are open to politically motivated back-door rulemaking that leaves platforms with no input and virtually no avenue for appeal. And even if the best practices are related to CSAM, the way the Commission makes what amounts to law would still be unprecedented, secretive, arbitrary and difficult to challenge in court.

Other Key Aspects of How the Bill Would Work

The bill would operate as follows:

  • The bill allows 90 days for Commissioners to be appointed, 60 days for the Commission’s first meeting, and 18 months for its first set of recommendations — roughly 23 months in total. The leaked draft leaves blank the window in which the AG must issue his “best practices.”

  • Those de facto requirements would not become legally valid until publication in the Federal Register — which usually takes a month but sometimes drags out indefinitely.

  • Operators would have 1 year to submit a written certification of their compliance.

  • If, say, the next administration drags its feet and the AG never issues “best practices,” the bill’s amendments to Section 230 and criminal law go into effect four years after enactment — creating sweeping new liability for CSAM and removing Section 230’s protections.

  • The Commission and AG will go through the whole farce again at least every two years.

The bill also grants DOJ broad subpoena power to determine whether operators are, in fact, living up to their certification of compliance with the AG’s “best practices.” Expect this power to be used aggressively to turn tech companies inside out.

Conclusion

In the end, one must ask: what problem is the Graham bill trying to solve? Section 230 has never prevented federal criminal prosecution of those who traffic in CSAM — more than 36,000 individuals were prosecuted between 2004 and 2017. Website operators themselves already face enormous legal liability for CSAM — and can be prosecuted by the Department of Justice for failing to cooperate with law enforcement, just as Backpage executives were prosecuted under federal sex trafficking law before SESTA-FOSTA (and pled guilty).

The Graham bill seems to be designed for one overarching purpose: to make services that offer end-to-end encryption effectively illegal, and ensure that law enforcement (and the intelligence agencies) has a backdoor into every major communications platform.

That would be outrageous enough if it were done through a direct mandate, but doing it in the roundabout way Graham’s bill proposes is effectively a backdoor to a backdoor. Unfortunately, that doesn’t mean the bill might not suddenly move quickly through Congress, just as SESTA did. Be ready: the “Cryptowars” may finally turn very, very hot.



Comments on “Lindsey Graham's Sneak Attack On Section 230 And Encryption: A Backdoor To A Backdoor?”

Anonymous Coward says:

Re: Re: Re:

Other possible reasons:

  • Things full of references to "child porn" on the internet tend to get immediately flagged by every moderation system in existence
  • In addition to implying some sort of legality, "porn" is generally defined by its purpose and there’s no reason to limit things to child abuse done for a specific purpose
Anonymous Coward says:

Re: Re: Re:

There is no actual difference between CSAM and child pornography according to the Department of Justice. Both involve sexually explicit conduct of a minor or someone who is purported to be a minor. It is for this reason that video games that feature cartoon characters and films featuring characters who are in high school having sex always contain the proviso that "the characters depicted are at least 18 years of age" and, if not cartoon based but instead involving real individuals, the actors involved in such scenes must also be at least 18 years of age. It also is a bit wrong to think that "child porn" automatically means "sexual abuse" because individuals under the age of 18 who take naked photos of themselves and willingly distribute them to anyone else without being coaxed into doing so are also engaged in child pornography.

Scary Devil Monastery (profile) says:

Re: Re: Re: Re:

"…because individuals under the age of 18 who take naked photos of themselves and willingly distribute them to anyone else without being coaxed into doing so are also engaged in child pornography."

…and sexting surely is a felony bad enough to result in a teenager having to drag around a lifetime title of "sex offender". /s

One of the big issues about this type of legislation is that what starts as a perfectly valid attempt to protect juveniles from predation always, invariably, ends up being hijacked by religious interests or the moral panic brigade to be used as a lever to ensure unmarried teens abstain from sex and lewd behavior.

Hence the somewhat interesting situation arising in many jurisdictions of where two perfectly legal consenting partners end up perpetrating one of the most toxic felonies known to man – by law – the very second one of them sends a raunchy pic to their significant other.

CP has become that ultimate weapon tacked on to any argument so weak it wouldn’t stand up to criticism otherwise. But use the magic words and no critic even dares argue since being smeared as a pedo is still pretty much a career killer everywhere.
And that’s a problem because when even the kids in your neighborhood end up with the "sex offender" title for being young and in love, people will stop taking the label seriously.

Scary Devil Monastery (profile) says:

Re: What's so special about child *sexual* abuse?

"Isn’t plain old child abuse enough for people anymore?"

Not visceral enough. You can’t tack "child abuse" on to a bill to make up for the lack of proper arguments in favor of it quite the same way you can if you involve sex.

You also can’t use implied pedophilia sympathy to shut the mouth of any critic daring to oppose your bill if all you’ve got is "abuse".

I keep saying that for 99% of the population there is no such thing as CP because there’s nothing erotic about abused children to anyone but a distinctive minority. It’s simply evidence of particularly toxic child abuse.

Sok Puppette says:

I’m having trouble buying the idea that anybody at all thinks the phrase "child porn" carries any implication, or even suggestion, of legality. It’s the most famously illegal thing that exists on the Internet.

As for moderation, I will bet that almost all references to "child porn" on the Internet are in text that condemns it and/or discusses what to do to stop it. And if the pedos are in fact openly using the phrase "child porn" all over the place, what happens when they start calling it "CSAM"?

James Burkhardt (profile) says:

Re: RE: When they call it CSAM

When Pedophiles call their material Sex Abuse, we’ve won?

Pedophiles who seek out Child Sex Abuse Material often rationalize their behavior, like many criminals. They seek to convince themselves the harm doesn’t really exist. Calling it CSAM makes that harder. See my comments above.

Scary Devil Monastery (profile) says:

Re: Re: RE: When they call it CSAM

"When Pedophiles call their material Sex Abuse, we’ve won?"

Even assuming the narrative can be retrieved that way, I’m sceptical.

We’re talking about people who are either self-centered enough to not give a shit about the fact that they’re hurting children, or who have already convinced themselves that the entire world is wrong about what their urges mean to begin with.

To 99% of humanity there’s nothing erotic about an abused child. Thus it can’t be pornography to anyone other than the sick minority.

But changing the terminology at this stage only, if anything, dilutes the impact of what it is and what it causes.

That One Guy (profile) says:

Re: Re:

Depends on the public, probably. Without sufficient public backlash making it clear that screwing over the internet and the public that uses it will mean someone else will be voted in come the next election, I can all too easily see this one sailing right through, as few politicians have the spine to risk being accused of being pro-child porn, which you can be damn sure the scum who crafted the bill are banking on.

Anonymous Coward says:

Re: Re:

I wouldn’t be too worried (at least, not yet). Just remember that this is just a draft bill that doesn’t even have a co-sponsor yet. How far this bill goes is anyone’s guess; there’s still only a slim chance of it actually passing, and it could be radically changed, much like other bills. But considering how trigger-happy both parties seem to be when it comes to liability, digital policy and encryption, I admit to sharing your concern.

That One Guy (profile) says:

Equality is not a privilege

Time and time again it needs to be hammered home for those whining about how 230 is somehow ‘special’ protections for the internet: Equal protections are not special protections, they are simply making it so that everyone is on a level playing field.

You do not get to sue a newspaper because someone bought a copy and taped an illegal image in it for someone to find.

You do not get to hold a book publisher liable because someone scribbled defamatory content in one of their books.

You do not get to blame a car company because someone used one of their vehicles in a hit and run.

And you do not get to blame an online platform because someone posted something illegal on it.

If anyone is getting ‘special treatment’ it’s the offline companies, who don’t have to worry about being held liable for what people use their products for because it’s understood that the blame rests not on them but on those that misused their products.

If it wouldn’t result in massive amounts of damage I’d almost hope that those whining about 230 actually got their wish and removed it, because I guarantee it would not work out well for either side. Those demanding that companies aren’t moderating enough would quickly be faced with platforms that either don’t moderate at all or block all user submitted content, including theirs, and those complaining that companies are moderating too much by booting their buddies off the various platforms would face the same, either blocked entirely or swamped with people returning the favor and posting where they aren’t wanted.

Anonymous Coward says:

Re: Re: Equality is not a privilege

The reckless part of the proposed bill would eliminate not moderating, as not trying to stop CSAM would be considered reckless.

FTFY.

The only thing that would result from this bill would be an internet that was:

  1. Completely unprotected. (No encryption for the masses.)
  2. Completely devoid of user generated content. (Too difficult to moderate, too easily sued for.)
  3. Completely useless for business transactions. (Backdoored encryption for major businesses like banks, but things like eBay and smaller independent online sellers would no longer be viable as a business model.)
  4. Completely useless for "Cloud" anything. (Too easily sued for, and too easy for a disgruntled employee to set them up for. Worse, due to the lack of decent protections, i.e. encryption, a literal free-for-all to anyone looking to steal product ideas / company secrets / etc.)

Effectively you’d be looking at an internet with very little on it and very little usefulness to the general public. Hell, even corporate content streaming would take a nose dive, not because of the lack of encryption, but because most people wouldn’t have a need to carry around a device anymore. (What good is it for most people without the social media connectivity? Certainly not good enough to justify the extra costs and unlimited data subscription fees.) Cable and the like would rejoice at the idea of the living room returning to entertainment prominence. As would the media in general, being free of smaller independent creators and the like on places like YouTube / Twitch / etc. Simply stated, the internet today would cease to exist under this bill. Which is most likely what its creator really wants. After all, what is the biggest threat to power? A voice.

Anonymous Coward says:

You know, another thing about the whole CSAM thing is that the, uh… market… for that never changes. Pedo-/ephebophiles are what they are. You aren’t going to grow that demographic, nor shrink it, by making ICSes more liable. They are going to do what they do until they are stopped. They have done and will continue to do so without the internet.

The whole legislative thing, and blaming mere service providers, is stupid.


Richard Bennett (profile) says:

Misses the point

Google’s favorite right wing pressure group publishes a post about a discussion draft on Google’s favorite blog. The post displays massive ignorance of law, economics, and technology but that’s OK, it gets the piracy buffs fired up.

The problem with this bill is that it unlawfully delegates too much of Congress’s policy-making power to an executive branch agency acting in the guise of a regulator. In other words, it has the same problem that net neutrality regulations made at the FCC absent meaningful guidance from Congress had.

But the post doesn’t make this very important point because it would make the TD audience sad. So wade through this mess of illogic for a coherent point if you’re having a hard time sleeping.

This comment has been deemed insightful by the community.
John Roddy (profile) says:

Re: Misses the point

You’re the second lobbyist I’ve seen today who went out of their way to "agree" that the bill is bad, but still (incorrectly) chastise the source for not being anti-Google enough. Is there some kind of coordinated effort going on? Did your owners pay you to make this attack specifically?

This comment has been flagged by the community. Click here to show it.

Richard Bennett (profile) says:

Re: Re: Misses the point

Yeah, I totally got a call from Putin today telling me to foment disruption in the American political process around Google. This Szoka character (former president of the Ayn Rand Fanboi Alliance) makes a perfect foil.

Sorry, I’ve gotta leave now so I can post pro-witness comments on the Facebook pages of Republican senators…

Scary Devil Monastery (profile) says:

Re: Re: Re: Misses the point

"Yeah, I totally got a call from Putin today telling me to foment disruption in the American political process around Google."

That’s obvious BS. Putin isn’t inept enough to employ trolls of your particular caliber.

But let me summarize: Google bad, Google REALLY bad, the OP referenced briefly so you have a platform from which to launch yet another "Google Bad". Also, Pirates.

Was that about the gist of it?

Anonymous Coward says:

Re: Re: Re: Re:

"Yeah, I totally got a call from Putin today telling me to foment disruption in the American political process around Google."

Nobody mentioned Russia. Most of us just assumed you were shilling again for big ISPs and William Barr. Is that a Freudian slip I detect?

"This Szoka character (former president of the Ayn Rand Fanboi Alliance) makes a perfect foil."

Eh, it wouldn’t matter. You get paid to shill for big ISPs and now apparently Barr and regularly shoehorn your insanity into any comment section that is even the least bit tangentially related. Sometimes practicing necromancy to do so.

Try again Richard.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Never had the point

Big ISP and law enforcement’s favorite shill shows up in comment section of a blog he hates to post a rambling comment devoid of actual facts. The post displays massive ignorance of law, economics, and technology but that’s OK, it gets him his check.

The problem with his comment is that it misses the larger point that Section 230 is one of the legal underpinnings that allows the internet to be as open and free as it is. His comment then goes on to make a laughably weak connection to net neutrality because he can’t stand that he was proven wrong on that time and again.

But he can’t let that go, because it would mean missing out on future paychecks, so he tries to hide it by lying about the article and pretending that the draft bill being illegal in other ways would somehow make the TD community sad, ignoring the fact that none of us care how the bill gets stopped as long as it does. So wade through this mess of illogic for a coherent point if you’re having a hard time sleeping.

Try again Richard.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: 'Only a criminal would try to hide things...'

Well, yes and no. A ban on creating secure encryption wouldn’t apply, any more than one country’s blasphemy laws apply to people in countries sane enough not to have them. But you can be sure that if offering secure encryption is banned, then using secure encryption would be next on the list, even if ‘only’ as justification for treating its use as ‘suspicious’ grounds for surveillance/investigation.

Scary Devil Monastery (profile) says:

Re: Re: Re:2 'Only a criminal would try to hide things...'

"I’d love to see them try and ban VPNs. The squirming as they’re informed what they really are and what they’re really used for outside of the consumer sphere would be entertaining."

There’s a reason why very few politicians have even mentioned doing so, after all. Screwing every major corporation in the US is an interesting way to end your career.

PaulT (profile) says:

Re: Re: Re:3 'Only a criminal would try to hide things...'

That’s why it would be fun. I’ve heard rumblings whenever they think that the only uses for VPNs are for circumventing region locks or hiding torrent traffic, but I’d love to see them educated about what they’re really used for if they seriously tried passing any blanket restrictions.

Scary Devil Monastery (profile) says:

Re: Re: Re:4 'Only a criminal would try to hide things...

"…but I’d love to see them educated about what they’re really used for if they seriously tried passing any blanket restrictions."

Hear, hear.

I’d love to see a few of the particularly bought congressional numbskulls usually rattling in favor of the copyright cult start braying about banning VPNs, only to see them meet the rising lobby tide of irate Big Banking – and just about every industry other than show business, all howling for their scalps…

It would confirm one of my old philosophical chestnuts about cases where two ethical wrongs CAN make an ethical right.

Anonymous Coward says:

Re: Re: 'Only a criminal would try to hide things...'

Another problem enforcing such a ban will be users in Idaho and Montana that use wireless internet in Canada.

One woman I used to chat with years ago in an adult chat room lived in Montana, but used a wireless ISP in Canada.

Wireless providers in Canada are only subject to CRTC regulations. American laws do not apply to ISPs in Canada.

Anonymous Coward says:

I think politicians always use the excuse that we have to stop crime, terrorism, etc. to bring in vague, broad laws that can be used to reduce free speech on the web, or to attack tech companies. The police can already prosecute anyone who commits a crime or hosts CSAM content, so why does this law exist? To give government control over tech companies, to attack Facebook or Google? If Section 230 is under attack, the victims will be free speech, minorities, and ordinary people who use the web to communicate and criticise big corporations or government.

This comment has been flagged by the community. Click here to show it.

I. Dunno says:

Maybe ask the dual nationals about it.

Hi, little snowflake Agent Starling!
Was that you on Jailbait.org? I caught your posts, and wondered: why do lesbian FBI trolls always post as concerned citizens, when in fact and practice you are actually state-sponsored pedophiles?

Well, just ask this guy, who runs child porn entrapment schemes all over the world, about it:

https://en.m.wikipedia.org/wiki/Joel_Zamel

But you agency pedos should come clean about it.

Yeah, I know you CIA/FBI/JTRIG/Mossadi jihadis have distinct character flaws, veracity and proof being one of them.

https://en.m.wikipedia.org/wiki/Joel_Zamel
Dirty bitches….

Anonymous Coward says:

A dictatorship in the making

The USA government is growing more and more desperate to destroy everything it cannot control, with the free internet and encryption still at the top of the hit list. Every day they’re driving America closer to becoming the next Russia and China especially in terms of digital rights. For conservative politicians modern technology is a danger that must be eliminated, so that they can keep the world working the only way they know how. Now they’re trying to downright eradicate everything we’ve built over the past 30 years at a global scale.

This is a reminder of how imperative it is to destroy Trump and isolate the Republican party to a dark corner of history: They are an active threat to the modern world as a whole! From the repeal of NetNeutrality to passing SESTA / FOSTA to now this, the war against progress and tech is now more obvious than ever. If things keep going this way we’re literally a few years away from a complete dystopia that no futuristic film prepared us for.

Progress will win against those madmen: The 2020 elections are slowly coming. I do not like Biden one bit, but I’d gladly vote for him a million times if I could just to see this dictatorship finally ended.
