Minnesota Pushing Bill That Says Websites Can No Longer Be Useful For Teenagers

from the i-mean-what-is-going-on-here dept

The various “for the children” moral panic bills about the internet are getting dumber. Over in Minnesota, the legislature has moved forward with a truly stupid bill, which the legislature’s own website says could make the state “a national leader in putting new guardrails on social media platforms.” The bill is pretty simple — it says that any social media platform with more than 1 million account holders (and operating in Minnesota) cannot use an algorithm to recommend content to users under the age of 18.

Prohibitions; social media algorithm. (a) A social media platform with more than 1,000,000 account holders operating in Minnesota is prohibited from using a social media algorithm to target user-created content at an account holder under the age of 18.

(b) The operator of a social media platform is liable to an individual account holder who received user-created content through a social media algorithm while the individual account holder was under the age of 18 if the operator of a social media platform knew or had reason to know that the individual account holder was under the age of 18. A social media operator subject to this paragraph is liable to the account holder for (1) any regular or special damages, (2) a statutory penalty of $1,000 for each violation of this section, and (3) any other penalties available under law.

So, um, why? I mean, I get that for computer illiterate people the word “algorithm” is scary. And there’s some ridiculous belief among people who don’t know any better that recommendation algorithms are like mind control, but the point of an algorithm is… to recommend content. That is, to make a social media service (or any other kind of service) useful. Without it, you just get an undifferentiated mass of content, and that’s not very useful.

In most cases, algorithms are actually helpful. They point you to the information that actually matters to you and avoid the nonsense that doesn’t. Why, exactly, is that bad?

Also, it seems that under this law, websites would have to create a different kind of service for those under 18 and for those over 18, and carefully track how old those users are, which seems silly. Indeed, it would seem like this bill should raise pretty serious privacy concerns, because now companies are going to have to much more aggressively track age information, meaning they need to be much more intrusive. Age verification is a difficult problem to solve, and with a bill like this, making a mistake (and every website will make mistakes) will be costly.

But, the reality is that the politicians pushing this bill know how ridiculous and silly it is, and how algorithms are actually useful. Want to know how I know? Because the bill has a very, very, very telling exemption:

Exceptions. User-created content that is created by a federal, state, or local government or by a public or private school, college, or university is exempt from this section.

Algorithms recommending content are bad, you see, except if it’s recommending content from us, your loving, well-meaning leaders. For us, keep on recommending our content and only our content.



Comments on “Minnesota Pushing Bill That Says Websites Can No Longer Be Useful For Teenagers”

27 Comments
This comment has been deemed insightful by the community.
Anonymous Coward says:

Another “let’s enable and encourage lots of people to sue big tech” bill. Who is pushing this approach, which appears to be designed to allow moral crusaders to attack, and potentially cripple, websites? Isn’t there an industry that has figured out that you can use the law to destroy a target while losing every case? MPA/RIAA, I am looking at you.

Anonymous Coward says:

Re: Re:

Heuristics can be algorithmic in nature; indeed, search optimization and content filtering are heuristic in nature, so they are already targeting heuristics. In fact, one of the problems with formal methods is that there is often no formal definition of the problem to be solved, just a heuristic one. Data validation is full of problems that only have heuristic answers.

Anonymous Coward says:

Re:

there will ALWAYS be an algorithm involved

That’s the problem – these are people completely devoid of any mathematical or analytic skills whatsoever. If it involves any kind of math, it must be bad, even though they lack the ability to parse what they’re seeing.

It’s the direct result of allowing ‘creationism’ in lieu of science. That was the foot in the door.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

‘Algorithms are so terrible that they can never provide useful content and therefore must be barred entirely from kids! Unless it’s our content in which case the algorithms should absolutely put it in front of kids.’

Nothing like presenting your argument only to shoot it in the back the very next second.

This comment has been deemed insightful by the community.
MathFox says:

No algorithm means:

  • no filtering against harassment,
  • no filtering against violent content,
  • no filtering of pornographic content and
  • no filter to keep child predators out

Are those Minnesota “representatives” really as stupid as they appear to be, not seeing these consequences?

Anonymous Coward says:

Re: Re:

is prohibited from using a social media algorithm to target user-created content at an account holder under the age of 18.

My read of that is: this law would be fine with minors logging in… as long as that login does NOT return any content.

Or to put it another way, if your algorithm is sending content to a child, you can’t send any content (or you will face crippling liability).

Anyhow, that’s my take.

John85851 (profile) says:

Re:

And don’t forget that no algorithm means more irrelevant spam.
Which would they prefer? A kid seeing an ad based on his interests? Or an ad for Viagra promising a 1% mortgage rate on a house when someone clicks to download the malware-infested emoji pack?

Or worse, the site would just run the stupid click bait ads like “People in your state need to know this secret. Click to find out!”

Anonymous Coward says:

Punctuation saves lives!

… social media platform with more than 1,000,000 account holders operating in Minnesota …

that is entirely different than

… social media platform, with more than 1,000,000 account holders, operating in Minnesota …

or even

… social media platform operating in Minnesota, with more than 1,000,000 account holders,…

Pardon me, I have to go. It’s time to eat Grandma.

Anonymous Coward says:

We are a nonprofit looking to protect kids online... this Bill sucks.

We are DEFEND, an international volunteer-run nonprofit based in Canada, and our mandate is to protect kids and vulnerable people online.

This Bill is doomed to fail. First, it provides for no enforcement mechanism. No body set up to monitor or enforce. How do you pass out $1000 fines?

Second, the Bill fails to address how a user will be found to be under 18. Are we relying on declarations? User-input birthdays? For the sites not using those… what method should be used? Perhaps you should address that first, Representative Kristin Robbins?

http://www.HelpUsDefend.com for more on our social media platform (65square) and how it will surpass this.

Tanner Andrews (profile) says:

Re: How do you pass out $1000 fines

No body set up to monitor or enforce. How do you pass out $1000 fines?

No need for a special body to monitor or enforce. A parent whose kid sees algorithm-selected data sues, getting the $1,000, with his lawyer getting fees under the statute. We use similar mechanisms for FDCPA and FCCPA claims, unfair and deceptive practices claims, and a whole bunch of other things I cannot be bothered to dredge up at the moment.

I am having a hard time figuring out how data could be other than algorithm-selected, even if the algorithm is no more than “fetch the items out in their natural database order”.

From this I conclude that there will likely be a lot of suits in the time it takes for the law to be invalidated on First Amendment grounds.

(preview still borken)
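[Editor’s note: the point above is worth making concrete. Even the most naive retrieval imaginable — a plain query returning posts in insertion order — is still a selection procedure, and so arguably a “social media algorithm” under the bill’s text. A minimal sketch, using a hypothetical `posts` table for illustration:]

```python
import sqlite3

# Hypothetical schema: user-created posts in a tiny in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO posts (body) VALUES (?)",
    [("first",), ("second",), ("third",)],
)

def naive_feed(conn):
    """Return content with the simplest possible rule:
    natural (insertion) order. Still an algorithm."""
    return [row[0] for row in conn.execute("SELECT body FROM posts ORDER BY id")]

print(naive_feed(conn))  # ['first', 'second', 'third']
```

Any rule a site could substitute — reverse-chronological, alphabetical, random — is just a different `ORDER BY`, i.e. a different algorithm, which is why “no algorithm” has no coherent meaning here.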

CauseOfBSOD (profile) says:

youtube

And Youtube already has not only a separate platform for children (which I am informed is worse than the adult version), but also a restricted mode of the adult version where content is arbitrarily (and I MEAN arbitrarily – think Content ID but 10 times worse) blocked, comments cannot be left or viewed, and the whole thing is in general a frustrating and bad experience.
Not to mention Youtube requires all content to be “kid-friendly” (i.e. advertiser-friendly). And yet I get ads that are pretty NSFW.
