California Legislators Seek To Burn Down The Internet — For The Children

from the stop-trying-to-regulate-stuff-you-don't-understand dept

I’m continuing my coverage of dangerous Internet bills in the California legislature. This job is especially challenging during an election year, when legislators rally behind the “protect the kids” mantra to pursue bills that are likely to hurt, or at least not help, kids. Today’s example is AB 2273, the Age-Appropriate Design Code Act (AADC).

Before we get overwhelmed by the bill’s details, I’ll highlight three crucial concerns:

First, the bill pretextually claims to protect children, but it will change the Internet for EVERYONE. In order to determine who is a child, websites and apps will have to authenticate the age of ALL consumers before they can use the service. NO ONE WANTS THIS. It will erect barriers to roaming around the Internet. Bye bye casual browsing. To do the authentication, businesses will be forced to collect personal information they don’t want to collect and consumers don’t want to give, and that data collection creates extra privacy and security risks for everyone. Furthermore, age authentication usually also requires identity authentication, and that will end anonymous/unattributed online activity.

Second, even if businesses treated all consumers (i.e., adults) to the heightened obligations required for children, businesses still could not comply with this bill. That’s because this bill is based on the U.K. Age-Appropriate Design Code. European laws are often aspirational and standards-based (instead of rule-based), because European regulators and regulated businesses engage in dialogues, and the regulators reward good tries, even if they aren’t successful. We don’t do “A-for-Effort” laws in the U.S., and generally we rely on rules, not standards, to provide certainty to businesses and reduce regulatory overreach and censorship.

Third, this bill reaches topics well beyond children’s privacy. Instead, the bill repeatedly implicates general consumer protection concerns and, most troublingly, content moderation topics. This turns the bill into a trojan horse for comprehensive regulation of Internet services and would turn the privacy-centric California Privacy Protection Agency (CPPA) into the general-purpose Internet regulator.

So the big takeaway: this bill’s protect-the-children framing is designed to mislead everyone about the bill’s scope. The bill will dramatically degrade the Internet experience for everyone and will empower a new censorship-focused regulator who has no interest or expertise in balancing complex and competing interests.

What the Bill Says

Who’s Covered

The bill applies to a “business that provides an online service, product, or feature likely to be accessed by a child.” “Child” is defined as under-18, so the bill treats teens and toddlers identically.

The phrase “likely to be accessed by a child” means it is “reasonable to expect, based on the nature of the content, the associated marketing, the online context, or academic or internal research, that the service, product, or feature would be accessed by children.” Compare how COPPA handles this issue; it applies when services know (not anticipate) users are under-13 or direct their services to an under-13 audience. In contrast, the bill says that if it’s reasonable to expect ONE under-18 user, the business must comply with its requirements. With that overexpansive framing, few websites and apps can reasonably expect that under-18s will NEVER use their services. Thus, I believe all websites/apps are covered by this law so long as they clear the CPRA quantitative thresholds for being a “business.” [Note: it’s not clear how this bill situates into the CPRA, but I think the CPRA’s “business” definition applies.]

What’s Required

The bill starts with this aspirational statement: “Companies that develop and provide online services, products, or features that children are likely to access should consider the best interests of children when designing, developing, and providing that service, product, or feature.” The “should consider” grammar is the kind of regulatory aspiration found in European law. Does this statement have legal consequences or not? I vote it does not because “should” is not a compulsory obligation. So what is it doing here?

More generally, this provision tries to anchor the bill in the notion that businesses owe a “duty of loyalty” or fiduciary duty to their consumers. This duty-based approach to privacy regulation is trendy in privacy circles, but if adopted, it would exponentially expand regulatory oversight of businesses’ decisions. Regulators (and private plaintiffs) can always second-guess a business’ decision; a duty of “loyalty” gives the regulators the unlimited power to insist that the business made wrong calls and impose punishments accordingly. We usually see fiduciary/loyalty obligations in the professional services context where the professional service provider must put an individual customer’s needs before its own profit. Expanding this concept to mass-market businesses with millions of consumers would take us into uncharted regulatory territory.

The bill would obligate regulated businesses to:

  • Do data protection impact assessments (DPIAs) for any features likely to be accessed by kids (i.e., all features), provide a “report of the assessment” to the CPPA, and update the DPIA at least every 2 years.
  • “Establish the age of consumers with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business, or apply the privacy and data protections afforded to children to all consumers.” As discussed below, this is a poison pill for the Internet. This also exposes part of the true agenda here: if a business can’t do what the bill requires (a common consequence), the bill drives businesses to adopt the most restrictive regulation for everyone, including adults.
  • Configure default settings to a “high level of privacy protection,” whatever that means. I think this meant to say that kids should automatically get the highest privacy settings offered by the business, whatever that level is, but it’s not what it says. Instead, this becomes an aspirational statement about what constitutes a “high level” of protection.
  • All disclosures must be made “concisely, prominently, and using clear language suited to the age of children likely to access” the service. The disclosures in play are “privacy information, terms of service, policies, and community standards.” Note how this reaches all consumer disclosures, not just those that are privacy-focused. This is the first of several times we’ll see the bill’s power grab beyond privacy. Also, if a single toddler is “likely” to access the service, must all disclosures be written at a toddler’s reading level?
  • Provide an “obvious signal” if parents can monitor their kids’ activities online. How does this intersect with COPPA?
  • “Enforce published terms, policies, and community standards established by the business, including, but not limited to, privacy policies and those concerning children.” 🚨 This language unambiguously governs all consumer disclosures, not just privacy-focused ones. Interpreted literally, it’s ludicrous to mandate that businesses enforce every provision in their TOSes. If a consumer breaches a TOS by scraping content or posting violative content, does this provision require businesses to sue the consumer for breach of contract? More generally, this provision directly overlaps AB 587, which requires businesses to disclose their editorial policies and gives regulators the power to investigate and enforce any perceived or alleged deviations in how services moderate content. See my excoriation of AB 587. This provision is a trojan horse for government censorship that has nothing to do with protecting the kids or even privacy. Plus, even if it weren’t an unconstitutional provision, the CPPA, with its privacy focus, lacks the expertise to monitor/enforce content moderation decisions.
  • “Provide prominent, accessible, and responsive tools to help children, or where applicable their parent or guardian, exercise their privacy rights and report concerns.” Not sure what this means, especially in light of the CPRA’s detailed provisions about how consumers can exercise privacy rights.

The bill would also obligate regulated businesses not to:

  • “Use the personal information of any child in a way that the business knows or has reason to know the online service, product, or feature more likely than not causes or contributes to a more than de minimis risk of harm to the physical health, mental health, or well-being of a child.” This provision cannot be complied with. It appears that businesses must change their services if a single child might suffer any of these harms, which is always? This provision especially seems to target UGC features, where people always say mean things that upset other users. Knowing that, what exactly are UGC services supposed to do differently? I assume the paradigmatic example is the concern about kids’ social media addiction, but like the 587 discussion above, the legislature is separately considering an entire bill on that topic (AB 2408), and this one-sentence treatment of such a complicated and censorial objective isn’t helpful.
  • “Profile a child by default.” “Profile” is not defined in the bill. The term “profile” is used 3x in the CPRA but also not defined. So what does this mean?
  • “Collect, sell, share, or retain any personal information that is not necessary to provide a service, product, or feature with which a child is actively and knowingly engaged.” This partially overlaps COPPA.
  • “If a business does not have actual knowledge of the age of a consumer, it shall not collect, share, sell, or retain any personal information that is not necessary to provide a service, product, or feature with which a consumer is actively and knowingly engaged.” Note how the bill switches to the phrase “actual knowledge” about age rather than the threshold “likely to be accessed by kids.” This provision will affect many adults.
  • “Use the personal information of a child for any reason other than the reason or reasons for which that personal information was collected. If the business does not have actual knowledge of the age of the consumer, the business shall not use any personal information for any reason other than the reason or reasons for which that personal information was collected.” Same point about actual knowledge.
  • Sell/share a child’s PI unless needed for the service.
  • “Collect, sell, or share any precise geolocation information of children by default” unless needed for the service–and only if providing “an obvious sign to the child for the duration of that collection.”
  • “Use dark patterns or other techniques to lead or encourage consumers to provide personal information beyond what is reasonably expected for the service the child is accessing and necessary to provide that service or product to forego privacy protections, or to otherwise take any action that the business knows or has reason to know the online service or product more likely than not causes or contributes to a more than de minimis risk of harm to the child’s physical health, mental health, or well-being.” No one knows what the term “dark patterns” means, and now the bill would also restrict “other techniques” that aren’t dark patterns? Also see my earlier point about the “de minimis risk of harm” requirement.
  • “Use any personal information collected or processed to establish age or age range for any other purpose, or retain that personal information longer than necessary to establish age. Age assurance shall be proportionate to the risks and data practice of a service, product, or feature.” The bill expressly acknowledges that businesses can’t authenticate age without collecting PI–including PI the business would choose not to collect but for this bill. This is like the CCPA/CPRA’s problems with “verifiable consumer request”–to verify the consumer, the business has to ask for PI, sometimes more invasively than the PI the consumer is making the request about. ¯\_(ツ)_/¯

New Taskforce

The bill would create a new government entity, the “California Children’s Data Protection Taskforce,” composed of “Californians with expertise in the areas of privacy, physical health, mental health, and well-being, technology, and children’s rights” as appointed by the CPPA. The taskforce’s job is “to evaluate best practices for the implementation of this title, and to provide support to businesses, with an emphasis on small and medium businesses, to comply with this title.”

The scope of this taskforce likely exceeds privacy topics. For example, the taskforce is charged with developing best practices for “Assessing and mitigating risks to children that arise from the use of an online service, product, or feature”–this scope isn’t limited to privacy risks. Indeed, it likely reaches services’ editorial decisions. The CPPA is charged with constituting and supervising this taskforce even though it lacks expertise on non-privacy-related topics.

New Regulations

The bill obligates the CPPA to come up with regulations supporting this bill by April 1, 2024. Given the CADOJ’s and CPPA’s track record of missing statutorily required timelines for rule-making, how likely is this schedule? 🤣

Problems With the Bill

Unwanted Consequences of Age and Identity Authentication. Structurally, the law tries to sort the online population into kids and adults for different regulatory treatment. The desire to distinguish between children and adults online has a venerable regulatory history. The first Congressional law to crack down on the Internet, the Communications Decency Act, had the same requirement. It was struck down as unconstitutional in part because age authentication was infeasible. Yet, after 25 years, age authentication remains a vexing technical and social challenge.

Counterproductively, age-authentication processes are generally privacy invasive. There are two primary ways to do it: (1) demand the consumer disclose lots of personal information, or (2) use facial recognition and collect highly sensitive face information (and more). Businesses don’t want to invade their consumers’ privacy these ways, and COPPA doesn’t require such invasiveness either.
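To make that privacy cost concrete, here is a minimal sketch of what a “method (1)” age gate must collect before it can make any decision at all. The field names and the 18+ cutoff are illustrative assumptions, not anything the bill specifies; the point is simply that the gate cannot run without PI the service otherwise has no reason to hold:

```python
from datetime import date

# Hypothetical fields a "disclose personal information" age gate demands.
# None of these are data the service needs to deliver its actual product.
REQUIRED_FIELDS = ["full_name", "date_of_birth", "government_id_number"]

def verify_age(submission: dict, today: date) -> bool:
    """Return True if the submitted record claims an adult (18+).

    Note the privacy cost baked into the design: the service must
    collect and (at least briefly) retain PI solely to run this check.
    """
    missing = [f for f in REQUIRED_FIELDS if f not in submission]
    if missing:
        # The gate simply cannot function without the PI.
        raise ValueError(f"cannot authenticate age without PI: {missing}")
    dob = date.fromisoformat(submission["date_of_birth"])
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18
```

Even this toy version illustrates the dilemma: the check fails closed when the consumer withholds PI, so every visitor, adult or child, pays the disclosure price up front.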

Also, it’s typically impossible to do age-authentication without also doing identity-authentication so that the consumer can establish a persistent identity with the service. Otherwise, every consumer (kids and adults) will have to authenticate their age each time they access a service, which will create friction and discourage usage. But if businesses authenticate identity, and not just age, then the bill creates even greater privacy and security risks as consumers will have to disclose even more PI.

Furthermore, identity authentication functionally eliminates anonymous online activity and all unattributed activity and content on the Internet. This would hurt many communities, such as minorities concerned about revealing their identity (e.g., LGBTQ), pregnant women seeking information about abortions, and whistleblowers. This also raises obvious First Amendment concerns.

Enforcement. The bill doesn’t specify the enforcement mechanisms. Instead, it wades into an obvious and avoidable tension in California law. On the one hand, the CPRA expressly negates private rights of action (except for certain data security breaches). If this bill is part of the CPRA–which the introductory language implies–then it should be subject to the CPRA’s enforcement limits. CADOJ and CPPA have exclusive enforcement authority over the CPRA, and there’s no private right of action/PRA. On the other hand, California B&P 17200 allows for PRAs for any legal violation, including violations of other California statutes. So unless the bill is cabined by the CPRA’s enforcement limit, the bill will be subject to PRAs through 17200. So which is it?  ¯\_(ツ)_/¯

Adding to the CPPA’s Workload. The CPPA is already overwhelmed. It can’t make its rule-making deadline of July 1, 2022 (missing it by months). That means businesses will have to comply with the voluminous rules with inadequate compliance time. Once that initial rule-making is done, the CPPA will then have to build a brand-new administrative enforcement function and start bringing, prosecuting, and adjudicating enforcements. That will be another demanding, complex, and time-consuming project for the CPPA. So it’s preposterous that the California legislature would add MORE to the CPPA’s agenda, when it clearly cannot handle the work that the California voters have already instructed it to do.

Trade Secret Problems. Requiring businesses to report about their DPIAs for every feature they launch potentially discloses lots of trade secrets–which may blow their trade secret protection. It certainly provides a rich roadmap for plaintiffs to mine.

Conflict with COPPA. The bill does not provide any exceptions for parental consent to the business’ privacy practices. Instead, the bill takes power away from parents. Does this conflict with COPPA such that COPPA would preempt it? No doubt the bill’s basic scheme rejects COPPA’s parental control model.

I’ll also note that any PRA may compound the preemption problem. “Allowing private plaintiffs to bring suits for violations of conduct regulated by COPPA, even styled in the form of state law claims, with no obligation to cooperate with the FTC, is inconsistent with the treatment of COPPA violations as outlined in the COPPA statute.” Hubbard v. Google LLC, 546 F. Supp. 3d 986 (N.D. Cal. 2021).

Conflict with CPRA’s Amendment Process. The legislature may amend the CPRA by majority vote only if it enhances consumer privacy rights. As I’ve explained before, this is a trap because I believe the amendments must uniformly enhance consumer privacy rights. In other words, if some consumers get greater privacy rights, but other consumers get less privacy rights, then the legislature cannot make the amendment via majority vote. In this case, the AADC undermines consumer privacy by exposing both children and adults to new privacy and security risks through the authentication process. Thus, the bill, if passed, could be struck down as exceeding the legislature’s authority.

In addition, the bill says “If a conflict arises between commercial interests and the best interests of children, companies should prioritize the privacy, safety, and well-being of children over commercial interests.” A reminder of what the CPRA actually says: “The rights of consumers and the responsibilities of businesses should be implemented with the goal of strengthening consumer privacy, while giving attention to the impact on business and innovation.” By disregarding the CPRA’s instructions to consider impacts on businesses, this also exceeds the legislature’s authority.

Dormant Commerce Clause. The bill creates numerous potential DCC problems. Most importantly, businesses necessarily will have to authenticate the age of all consumers, both in and outside of California. This means that the bill would govern how businesses based outside of California interact with non-Californians, which the DCC does not permit.

Conclusion

Due to its scope and likely impact, this bill is one of the most consequential bills in the California legislature this year. The Internet as we know it hangs in the balance. If your legislator isn’t paying proper attention to those consequences (spoiler: they aren’t), you should give them a call.

Originally posted to Eric Goldman’s Technology & Marketing Law blog. Reposted with permission.



Comments on “California Legislators Seek To Burn Down The Internet — For The Children”

43 Comments


Rocky says:

Re: Re: Re:3

Original comment:

Gun control laws aim to protect a broad swath of the population. That such laws might get passed in the wake of children being slaughtered (if they get passed at all) is a horrifying⁠—and uniquely American⁠—sequence of events.

Reply from Naughty Autie disproving assertion.

Followup comment:

I was referring more to the uniquely American tragedy of regularly occurring school shootings, but fair point regardless.

[Citation needed]

Drew Wilson (user link) says:

Re:

I was actually following the UK age verification legislation off and on for a while. Then, things, admittedly, got really gnarly in North America and couldn’t keep up with the details simply because I didn’t have the resources to follow it.

I’m really glad to find out that the legislation was scrapped. That was one nightmare of a piece of legislation!

Vern Parchie says:

Would not the businesses just leave California?

This would only count for internet businesses and webpage businesses in CA. I know plenty of internet companies in CA. All they would do is pick up and move. People are going to complain and the companies not located in CA the webpage designers will just do business with a webpage developer not located in CA. Governor Newsome will find less internet companies in his state if this bill passes. Screw CA, if this bill passes it’s goodbye California.

That One Guy (profile) says:

'Think of the children', almost always a stadium-sized red flag

‘We must protect the children!’

‘How?’

‘Ensuring that they never feel like they aren’t being watched online and forcing them to provide highly personal information to any platform they want to use.’

‘Won’t that make things worse and impact everyone?’

‘Maybe, who cares, all that matters is (me being seen) Protecting The Children!

Anonymous Coward says:

The UK government tried to set up an online ID system and it failed; it’s unworkable unless the government wants to set up a unique ID system like China or India. Children can watch the news, CNN. Do parents not have a duty to monitor their child’s web access?
Websites are probably not going to set up an ID system for one state.
They could block California, or ask users to log in to the websites, or maybe have a banner: this site is only to be viewed by users over 16.
The value of the web is I can read one article on Techdirt or the New York Times without having to go through a login process or give my ID.
The average 10-year-old does not want to read SFGate or Techdirt or the New York Times.
It’s ridiculous to expect millions of websites to change the login process in case one 14-year-old might read one article.

GHB (profile) says:

This basically turns the internet into facebook

I’m against login walls (“log-walls” as I term it) — sites asking me to create an account just to look at content for free. Especially when they ask for your personal information. I’m tolerant if NSFW content has this, but not so for SFW works. This has led to the creation of bugmenot as well as proxies to dodge having to enter your personal info.

These people supporting such “for the children” are stupid. The internet is not a daycare for kids, and shouldn’t be treated like that. Maybe parents should take more responsibility on supervising their kids to use the internet, it shouldn’t be the site’s job to do this.

How about use webfilters on your home network, have parental control privileges set on the device they’re using, or better yet, they have to be watched constantly anytime when they use the computer, or don’t let them use computers at all?

Not only that, but “child-proofing” the internet is largely ineffective. Google “are age restrictions effective online?” and you’ll get lots of articles showing that age restriction walls are ineffective. Children can lie about their age and more generally, enter fake info. Unless you meet them in person, just looking at the data they entered is not reliable to know if they’re old enough to legally use the site. Either you have that flaw, or you have facebook to go even further by demanding a video recording of your face. Both of them are awful ideas.

Anonymous Coward says:

…the business knows or has reason to know the online service, product, or feature more likely than not causes or contributes to a more than de minimis risk of harm to the physical health, mental health, or well-being of a child.

Well, I guess that rules out online belt sales in California. Also, comic book sales, role-playing game sales, sales of any video game ever, satanic ritual supplies, political memoirs, MAGA hats …

Anonymous Coward says:

It’s sad that whenever I hear “for the children” I become immediately distrustful because of how many other unpalatable laws have slipped past scrutiny because of it being ostensibly “for the children” while making the internet worse for everyone including the people they claim to want to protect.

That Anonymous Coward (profile) says:

Dear California Legislators,

GO THE FUCK AWAY.
The internet did not bring the crotchfruit into the world and isn’t responsible for them.
Stop enabling shitty parents to be shitty.
Internet access isn’t free, perhaps the person paying the bill should decide to monitor what their children are doing online?
Perhaps if ‘magically’ a child is groomed and lured away over 6 months, once they are found perhaps then someone should ask the impolite question of WHERE THE FUCK WERE THE PARENTS?
Corporations might be people, but they aren’t fucking parents and pretending that they should do more than the actual parents should do to monitor their children is insane.

I get it, Nanny State wants to Nanny and score political points but my FSM, perhaps its time to stop relying on Fran Drescher to raise other peoples kids.

danderbandit (profile) says:

The other side of the same shitty coin.

Nothing that I can add to the previous comments.
Just wanted to acknowledge that this is the kind of crap that the rest of the country shits on California for.

The government wants to control everything/everybody. The republicans for profit, the democrats because they think they know better, both for control/power.

Going to turn me into an Anarchist!

Tracy Rosenberg says:

Leaving Out A Bit

Mike,

There’s a few things to say here, but the first and most important is why are you conflating the actual definition of “likely to be accessed by a child” that is contained in the bill?

It’s not ONE child user. Techdirt and most Internet sites aren’t going to qualify.

There are problems with the bill, but maybe stick to the real ones instead of kinda making stuff up.

“Likely to be accessed by children” means it is reasonable to expect, based on any of the following factors, indicators, that the online service, product, or feature would be accessed by children:
(A) The online service, product, or feature is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.).
(B) The online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children.
(C) An online service, product, or feature with advertisements marketed to children.
(D) An online service, product, or feature that is substantially similar or the same as an online service, product, or feature subject to subparagraph (B).
(E) An online service, product, or feature that has design elements that are known to be of interest to children, including, but not limited to, games, cartoons, music, and celebrities who appeal to children.
(F) A significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.

Jim Bierly says:

These same people cared enough about kids to close schools for 2 years.

So now they “care” about kids, after giving two sh$ts about them for the past 2 years while they closed schools? They are using children who committed suicide as a marketing ploy for this bill? What do you think happens to kids that are forced in isolation and forced to mask unnecessarily. Sickening.

These Californians could give a crap about kids. They are all a bunch of sickos. Stay away from middle-America.

Montana Burr (profile) says:

Minor correction?

I’m only a software engineer, but I’m not sure I completely agree with the analysis.

Upon my reading of the law, it seems that the application having just one user under 18 is not sufficient to establish the applicability of the act. Rather, the applicability of the law is to be judged on a case-by-case basis. To quote the law:

“(4) “Likely to be accessed by children” means it is reasonable to expect, based on the following indicators, that the online service, product, or feature would be accessed by children:
(A) The online service, product, or feature is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.).
(B) The online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children.
(C) An online service, product, or feature with advertisements marketed to children.
(D) An online service, product, or feature that is substantially similar or the same as an online service, product, or feature subject to subparagraph (B).
(E) An online service, product, or feature that has design elements that are known to be of interest to children, including, but not limited to, games, cartoons, music, and celebrities who appeal to children.
(F) A significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.”

In general, I would guess that a company that, by complete accident, has a child user would not be subject to this new law.
