Minnesota Pushing Bill That Says Websites Can No Longer Be Useful For Teenagers
from the i-mean-what-is-going-on-here dept
The various “for the children” moral panic bills about the internet are getting dumber. Over in Minnesota, the legislature has moved forward with a truly stupid bill, which the legislature’s own website says could make the state “a national leader in putting new guardrails on social media platforms.” The bill is pretty simple — it says that any social media platform with more than 1 million account holders (and operating in Minnesota) cannot use an algorithm to recommend content to users under the age of 18.
Prohibitions; social media algorithm. (a) A social media platform with more than 1,000,000 account holders operating in Minnesota is prohibited from using a social media algorithm to target user-created content at an account holder under the age of 18.
(b) The operator of a social media platform is liable to an individual account holder who received user-created content through a social media algorithm while the individual account holder was under the age of 18 if the operator of a social media platform knew or had reason to know that the individual account holder was under the age of 18. A social media operator subject to this paragraph is liable to the account holder for (1) any regular or special damages, (2) a statutory penalty of $1,000 for each violation of this section, and (3) any other penalties available under law.
So, um, why? I mean, I get that for computer-illiterate people the word “algorithm” is scary. And that there’s some ridiculous belief among people who don’t know any better that recommendation algorithms are like mind control, but the point of an algorithm is… to recommend content. That is, to make a social media service (or any other kind of service) useful. Without it, you just get an undifferentiated mass of content, and that’s not very useful.
In most cases, algorithms are actually helpful. They point you to the information that actually matters to you and avoid the nonsense that doesn’t. Why, exactly, is that bad?
Also, it seems that under this law, websites would have to create a different kind of service for those under 18 and for those over 18, and carefully track how old those users are, which seems silly. Indeed, it would seem like this bill should raise pretty serious privacy concerns, because now companies are going to have to much more aggressively track age information, meaning they need to be much more intrusive. Age verification is a difficult problem to solve, and with a bill like this, making a mistake (and every website will make mistakes) will be costly.
But, the reality is that the politicians pushing this bill know how ridiculous and silly it is, and how algorithms are actually useful. Want to know how I know? Because the bill has a very, very, very telling exemption:
Exceptions. User-created content that is created by a federal, state, or local government or by a public or private school, college, or university is exempt from this section.
Algorithms recommending content are bad, you see, except if it’s recommending content from us, your loving, well-meaning leaders. For us, keep on recommending our content and only our content.
Filed Under: algorithms, for the children, minnesota, recommendations
Comments on “Minnesota Pushing Bill That Says Websites Can No Longer Be Useful For Teenagers”
Another “let’s enable and encourage lots of people to sue big tech” bill. Who is pushing this approach, which appears to be designed to allow moral crusaders to attack, and potentially cripple, websites? Isn’t there an industry that has figured out that you can use the law to destroy a target while losing every case? MPA/RIAA, I am looking at you.
Re:
So basically like a DoS attack, but it’s done by one person, doesn’t use computers, and is completely legal (and carries more repercussions).
Would be funny to see someone use this law against whichever idiot thought it would be a good idea to create it, but sadly, there’s that exemption.
Whoever wrote this bill must never have taken a CompSci 101 class, or even reached out to an actual computer scientist for input.
No matter how MN wants social media posts to be displayed, there will ALWAYS be an algorithm involved. Full Stop!
Re:
Never mind CS, algorithms have been in use for at least 4000 years in several cultures.
But heuristics are next on the chopping block, for being vague imitators.
Re: Re:
Heuristics can be algorithmic in nature; indeed, search optimization and content filtering are heuristic in nature, so they are already targeting heuristics. One of the problems with formal methods is that there is often no formal definition of the problem to be solved, just a heuristic one. Data validation is full of problems that only have heuristic answers.
Re:
I mean, we could just give all the kids direct access to the database and let them query to their hearts content…
Re:
there will ALWAYS be an algorithm involved
That’s the problem – these are people completely devoid of any mathematical or analytic skills whatsoever. If it involves any kind of math, it must be bad, despite their inability to parse what they’re seeing.
It’s the direct result of allowing ‘creationism’ in lieu of science. That was the foot in the door.
Sounds like the only algorithm MN approves of is:
if (user == minor) close(connection);
‘Algorithms are so terrible that they can never provide useful content and therefore must be barred entirely from kids! Unless it’s our content in which case the algorithms should absolutely put it in front of kids.’
Nothing like presenting your argument only to shoot it in the back the very next second.
Any bit of code is an algorithm so how do I code this without one?
As any sequence of code, even “random” selection, is an algorithm, and almost all content on such a site is “user created,” this effectively means a social media site with no social media.
This is, unless you are an “adult” – something the authors of this bill clearly are not.
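The commenter’s point can be made concrete: even the “plain,” supposedly non-algorithmic feed is an algorithm, since sorting newest-first is itself an algorithmic step. A minimal sketch (the `Post` record and sample data are hypothetical, not any platform’s real code):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int  # seconds since epoch

def plain_feed(posts):
    """Even this 'non-algorithmic' feed is an algorithm:
    sort user-created content newest-first and return it."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("alice", "first!", 100),
    Post("bob", "hello", 300),
    Post("carol", "hi", 200),
]
feed = plain_feed(posts)
print([p.author for p in feed])  # ['bob', 'carol', 'alice'] - newest first
```

Under a literal reading of the bill, even this sort “targets user-created content” at whoever is viewing the feed.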
No algorithm means:
Are those Minnesota “representatives” really as stupid as they appear to be, not seeing the consequences?
Re:
Yes.
Re:
Yes.*
Re: Re:
Pi==3.14 is at least an approximation good enough for some uses. This is Pi==7 level stupidity.
Re: You were so close...
What you forgot is that the login process is an algorithm, so what the law really means is that any child can log in as any individual they want.
Please, $deity, please put some parent back into parenting.
Re: Re:
My read of that is: this law would be fine with minors logging in… as long as that login does NOT return any content.
Or to put it another way, if your algorithm is serving a child, you can’t send any content (or you will face crippling liability).
Anyhow, that’s my take.
Re:
And don’t forget that no algorithm means more irrelevant spam.
Which would they prefer? A kid see an ad based on his interests? Or an ad for Viagra promising a 1% mortgage rate on a house when someone clicks to download the malware infested emoji pack?
Or worse, the site would just run the stupid click bait ads like “People in your state need to know this secret. Click to find out!”
Re:
That’s not what this means. An algorithm (in recommendations) is used to amplify preferred content, not to filter content out.
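The distinction the comment draws, amplifying versus filtering, can be sketched in a few lines. The engagement scores and item names here are made up for illustration; a recommender re-ranks (amplifies) everything, while a filter only removes items:

```python
def amplify(items, score):
    """Recommendation: reorder everything, boosting high-score items."""
    return sorted(items, key=score, reverse=True)

def filter_out(items, allowed):
    """Moderation/spam filtering: drop disallowed items, keep original order."""
    return [i for i in items if allowed(i)]

# Hypothetical (title, engagement score) pairs.
items = [("cat video", 0.9), ("news", 0.4), ("spam", 0.1)]

ranked = amplify(items, score=lambda i: i[1])
kept = filter_out(items, allowed=lambda i: i[1] > 0.2)

print([i[0] for i in ranked])  # ['cat video', 'news', 'spam']
print([i[0] for i in kept])    # ['cat video', 'news']
```

The bill’s text draws no such distinction: both functions “use an algorithm” to decide what user-created content reaches the account holder.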
Punctuation saves lives!
“Let’s eat, Grandma!” is entirely different than “Let’s eat Grandma!”
Pardon me, I have to go. It’s time to eat Grandma.
uhhh
Technically, isn’t “sorted chronologically” an algorithm since these are distributed systems?
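Exactly: producing one chronological view from per-server timelines in a distributed system is a textbook k-way merge algorithm. A sketch using Python’s standard library (the server timelines are hypothetical, each already sorted by timestamp):

```python
import heapq

# Hypothetical per-server timelines, each sorted by (timestamp, post_id).
server_a = [(100, "post1"), (400, "post4")]
server_b = [(200, "post2"), (300, "post3")]

# "Sorted chronologically" across servers is a k-way merge - an algorithm.
timeline = list(heapq.merge(server_a, server_b))
print([p for _, p in timeline])  # ['post1', 'post2', 'post3', 'post4']
```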
Whoever thought this was a good idea probably has deep insecurities about their feminine hips.
Thank the FSM that politicians are doing the important work in a nation where people are dying from preventable diseases, hunger, being homeless.
tweet required brain algorithm
https://nitter.net/mmasnick/status/1504563935642468367#m
Technical misunderstanding or legal misunderstanding?
Subdivision 2 paragraph a of the proposed bill says it prohibits the targeting of minors with prioritized content.
Presumably if someone requested content from their contacts in chronological order that wouldn’t be seen as being targeted.
We are a nonprofit looking to protect kids online... this Bill sucks.
We are DEFEND, an international volunteer-run nonprofit based in Canada, and our mandate is to protect kids and vulnerable people online.
This Bill is doomed to fail. First, it provides no enforcement mechanism: no body is set up to monitor or enforce it. How do you pass out $1,000 fines?
Second, the Bill fails to address how a user will be found to be under 18. Are we relying on declarations? User-input birthdays? And for sites not using those… what method should be used? Perhaps you should address that first, Representative Kristin Robbins?
http://www.HelpUsDefend.com for more on our social media platform (65square) and how it will surpass this.
Re: How do you pass out $1000 fines
No need for a special body to monitor or enforce. A parent whose kid sees algorithm-selected data sues, getting the $1,000, with his lawyer getting fees under the statute. We use similar mechanisms for FDCPA and FCCPA claims, unfair and deceptive practice claims, and a whole bunch of other things I cannot be bothered to dredge up at the moment.
I am having a hard time figuring out how data could be anything other than algorithm-selected, even if the algorithm is no more than “fetch the items out in their natural database order.”
From this I conclude that there will likely be a lot of suits if it takes time for the law to be invalidated on First Amendment grounds.
(preview still borken)
youtube
And YouTube already has not only a separate platform for children (which I am informed is worse than the adult version), but also a restricted mode of the adult version where content is arbitrarily (and I MEAN arbitrarily – think Content ID, but ten times worse) blocked, comments cannot be left or viewed, and the whole thing is in general a frustrating and bad experience.
Not to mention Youtube requires all content to be “kid-friendly” (i.e. advertiser friendly). And yet I get ads that are pretty NSFW.