ID to use your computer
And if the UK government gets its way, you may even be required to show ID to disable filters built into your phone or computer's operating system: https://arstechnica.com/tech-policy/2025/12/uk-to-encourage-apple-and-google-to-put-nudity-blocking-systems-on-phones/
Prove your age to use your phone
The latest news on the digital ID front is that they want device manufacturers to obtain proof of age before allowing people uncensored access to their own phones. https://forums.macrumors.com/threads/uk-wants-all-iphones-to-block-explicit-images-unless-you-prove-age.2474720/
Two of the three liberal justices
Given all that's been happening lately at SCOTUS, with questionable ruling after questionable ruling, it is quite depressing to see two of the only three liberal justices leaning the wrong way. It helps that copyright isn't a pet conservative issue, giving the conservative justices more room to be objective, but Jackson and Sotomayor have shown better judgment than this. They should know better.
That one and Ashcroft v. ACLU, which, as I have just learned, FSC v. Paxton has effectively undone.
It's as if Reno v. American Civil Liberties Union had never happened.
Binding precedent
People keep saying even these shadow docket rulings establish binding precedent, but are courts required to follow the rationale laid out in Kavanaugh's sole concurrence? Presumably the case at issue still needs to be considered on its merits, but is the court required to follow Kavanaugh's lead here about what constitutes reasonable suspicion and such?
Constitutionality
This is technically unconstitutional, right? Any thoughts on how the Supreme Court would rule given its current makeup?
Where have our online rights gone?
I remember when service providers would fight these laws tooth and nail, but the ones who can best afford a full-on legal offensive these days are just letting it happen. It feels like nobody cares anymore (although many people do).
YouTube Takedowns
I shudder to think what this will mean for YouTube videos exposing charlatans, grifters, and other liars. The subjects of these videos have no qualms about abusing YouTube's reporting systems to shut down criticism, and the worst of them are litigious to a fault. This new law will facilitate that kind of censorship, and YouTube being YouTube will simply take down the videos, and possibly shut down the channels, without any chance of appeal, because the content has to be removed within 48 hours and permanently deleted. A new era of censorship awaits us, and I'm not at all confident the courts will overturn this.
Enforcement
The part about the government not enforcing the law doesn't bother me as much as the part about Trump using the law to cosplay a strongman: "sell us TikTok or else!" There are laws on the books that governments should refuse to enforce, but the tactics being employed here are terrible.
Denial of service
Given the lack of safeguards like a counter-notice procedure, one way to protest this law would be to file a bunch of complaints against non-NCII content, to the point where ISPs have to either take down everything or exercise actual discretion in what they take down.
The Long Arm of the Law
I wonder how these things would play out if it were, say, the European Union instead of Russia, and the sum were only 100% of Google's net worth instead of more money than exists in the entire world. If one is to support the EU's huge extraterritorial fines over GDPR and DSA violations, then one must also support Russia's. The difference is more a matter of degree than substance.
I understand that harassment laws can be difficult to implement, and that changing the way the block function works may deter people from using ExTwitter. I doubt it would make polarization worse, but in the absence of evidence either way that's just my opinion.

I do have a problem with the notion that the existence of harassment on the platform is enough to justify not relaxing how the block feature currently operates. What Musk is proposing is pretty much the way Twitter's block function used to work: you can see people's posts, but you can't follow them or reply to their posts if they've blocked you.

We're not talking about private spaces, direct messages, or anything of the sort here. We're talking about posts that are intentionally made available to everyone by default, as in online forums and websites that allow people to leave comments. These are posts intended for public consumption, and in that context I think the whole "but harassment" angle is disingenuous.
I have at no point argued that "harassment isn't harassment," or that somehow I get to decide what is or is not harassment. What I've said is that a majority of blocks are for reasons other than what reasonable people would call harassment.
Ex-Twitter's block function harmful to online discourse
ExTwitter's current approach is a weird combination: people's posts are visible to all, yet particular accounts are denied access to posts that literally everybody else can see. That has never made much sense to me.

Blocking on ExTwitter normally has more to do with personal disagreement and annoyance than with real harassment, and there are legal tools available to people who are actually being harassed. I'm one of those people who believe social media has contributed significantly to the high degree of polarization we are now seeing, and I can't support any feature that may contribute to extreme ideological fragmentation. I'll be glad to see the block function modified in this fashion.
This doesn't look good for France
Unless Durov was, in fact, involved in the production of illegal content, or was aware of specific instances of such content and did nothing to remove and report it, this arrest is just plain outrageous. It's also a symptom of a terrible malady that's especially prevalent in Europe: the attitude that online harms must be handled in the most heavy-handed way possible, either through ill-conceived and chilling legislation or through arrests over third-party conduct.
Selective Outrage
Some state recently outlawed gathering biometric data, even from publicly available images, without the subject's permission, but it made an exception for law enforcement purposes. Government officials claim to stand for privacy but support the worst of all possible uses. Private companies want to sell you things, but the government wants to put you in jail.
Addictive Design
So disappointing. Some people are insisting the bill is not about content regulation but about "addictive design features." Even Wyden, who voted against the bill, has expressed support for legal measures against so-called addictive design features. You can bet algorithmic recommendations would be included in this, making it a form of content regulation after all.

And it's not just Americans wanting to regulate these addictive design features, either. The European Union is getting ready, in its usual heavy-handed fashion, to strike against things like "endless scrolling, pull-to-refresh, never-ending auto-play videos, push notifications, temporarily available stories, likes, [and] read-receipts." You couldn't possibly concoct a parody more absurd than this.

New EU rules needed to make digital platforms less addictive.
Addictive Elements
I'm glad Wyden is opposing this bill, but I'm also disappointed to learn he believes "provisions regulating addictive design elements used by platforms to keep young people hooked are valuable safeguards that will make tech products safer." Those "addictive design elements" would be things like content recommendations, which absolutely do implicate content. Opponents of algorithmic feeds like to talk about the supposed "radicalizing effect" of content recommendations, and I fear that's the direction any effort against so-called addictive design elements would take.
Apple and the EU
Honestly, I'm not sure which bothers me more: Apple's business practices or the European Union's heavy-handed regulation of "Big Tech."