The Teddy Bear And Toaster Act Is Device Regulation Done Wrong

from the not-the-right-approach dept

Should the government protect us from snooping teddy bears and untrustworthy toasters? The California State Senate seems to think so.

With traditional devices on the decline, laptop and desktop computers now account for less than 25 percent of internet network traffic. Indeed, American households now use, on average, seven connected devices every day. As this so-called “internet of things” continues to expand, an array of connected objects—from toasters to lightbulbs to dishwashers—now include embedded microprocessors, multiplying the number of potential threat vectors for data breaches and cyberattacks.

Notably, security researchers revealed recently that CloudPets, a company that sells connected stuffed animal toys with voice-recording capabilities, had a security vulnerability that leaked the information of more than 500,000 people. In response to accounts like these and concerns about data collection by internet-of-things devices, California is considering S.B. 327, legislation that would require certain security and privacy features for any connected devices sold in the Golden State.

Device insecurity is a real threat and it’s encouraging to see legislators thinking about consumer privacy and security. But this bill, facetiously called the “teddy bear and toaster act” by its critics, would create more problems than it solves. These concerns do not merit a heavy-handed and wide-reaching legislative response.

First introduced in February, the bill targets a broad range of products that include “any device, sensor, or other physical object that is capable of connecting to the internet, directly or indirectly, or to another connected device.” It would require that their manufacturers “equip the device with reasonable security features.”

The scope and scale of that definition would appear to cover everything from smartphones to cars to tweet-happy toasters. Sweeping such a broad range of connected devices under its rules ignores that all of these items have unique functions, capabilities, and vulnerabilities. What constitutes a “reasonable security feature” for one might be completely unreasonable for another. This one-size-fits-all regulatory approach threatens to chill innovation, as companies from a host of different sectors expend resources just to make sense of the rules.

Should the bill move forward, we should also expect a range of consumer items will be equipped to blink and buzz and beep in ways more annoying than informative. The bill decrees that: “a manufacturer that sells or offers to sell a connected device in this state shall design the device to indicate through visual, auditory, or other means when it is collecting information.”

For some types of devices—such as virtual and augmented reality systems and autonomous vehicles—this requirement is simply infeasible. These devices use sensors to collect data constantly in order to perform their core functions. For always-on devices like IP security cameras, Amazon Alexa or connected cars, an indicator would just be synonymous with an “on” button. Many of these indicators will be superfluous, misunderstood and costly to implement—costs that disproportionately would hit smaller businesses.

Other provisions of the bill urge sellers of connected devices to notify consumers at checkout where they can find the item’s privacy policy and information about security patches and updates. This is valuable information, but the point-of-sale may not be the best time to communicate it. For many devices, a verbal or web-based tutorial likely would be more effective. Companies need the flexibility to figure out the best ways to inform their customers, while these design requirements would remove that flexibility.

In an interconnected world, balancing privacy rights and security is a hugely difficult undertaking. Enshrining that balance in law requires a nuanced and targeted approach. Policymakers at both the state and federal levels should focus their efforts on provable privacy or security harms, while empowering consumers with baseline information, where appropriate. Applying design requirements and compliance tasks in a haphazard way, as S.B. 327 does, will harm innovation without meaningfully improving data security.

Anne Hobson is technology policy fellow with the R Street Institute.



Comments on “The Teddy Bear And Toaster Act Is Device Regulation Done Wrong”

Ninja (profile) says:

I’d say that making storage, transmission, and general data-manipulation security mandatory, requiring disclosure of what data the device needs to collect for its basic task (i.e. a toaster making toast should collect no data, while a self-driving vehicle needs all sorts of environmental data to self-drive), and making any data collection beyond that strictly opt-in could be nice obligations that wouldn’t cause any harm to innovation. The way the bill has been crafted is completely flawed, but I’d argue we need strong laws to protect everybody from the INEPT.

Wyrm (profile) says:

Re: Response to: Ninja on Apr 19th, 2017 @ 12:16pm

Funny, that’s exactly what Congress just voted down, and that was not about optional connected toys but your mandatory ISP. (Well, mandatory as long as you want internet.)

So I’d find it a little hypocritical to pretend that toy-makers have to abide by some privacy and security standards.

Anonymous Coward says:

So, the text of the bill is less than 20 lines, but I’m not sure the author understands it.

Requiring “reasonable security features appropriate to the nature of the device and the information it may collect” does not, by definition, require using unreasonably high security features on a device that doesn’t need them.

The bill’s mandates are relatively straightforward: (1) reasonable security features (i.e., tailored to the device’s needs and the information it handles); (2) some indication that the device is collecting information (again, no specific method is prescribed); (3) consent for transmission of information (other than transmission needed for the device’s stated functionality: a phone sending voice is expected, a phone sending GPS data is not); (4) a short statement of the information collection made at point of sale; (5) direct notification to consumers of security patches.

Most of this seems reasonable and/or flexible: it mandates informing consumers of security and collection features, requires consent for unanticipated data transmission, and increases notice of security updates. The bill may not be perfect, but it definitely doesn’t jibe with the characterization made in this article.

Anonymous Coward says:

Re: Re: Re:3 Re:

If the definition of victim today is someone who voluntarily parted with their money and agreed to restrictive terms of service just to obtain an Internet toaster, then the word has truly lost all meaning.

The risks are known, information is easy to find. Ignorance does not a victim make.

Peter Leppik says:

So what's your solution?

This article bothers me because even while it acknowledges the “real threat” of poorly-secured devices, it offers no solution while pointing out all the problems (real or imagined) with the proposal that’s being offered.

If you agree that security of connected devices is a real problem that needs to be addressed, then what’s your solution? If not this idea, then what?

This sort of commentary is very close to straight-up obstructionism. It’s very easy to find problems with any specific proposal. It’s much harder to come up with better solutions. But nothing will ever get done if nobody ever offers better ideas.

Anonymous Coward says:

Re: So what's your solution?

“So what’s your solution?”

Lawsuit. Regulation winds up “shielding” businesses more than preventing them from fucking shit up.

But if a business can be sued by consumers for producing a product that can be used to compromise their privacy then maybe a few things will happen.

The thought of all the shit that might break loose would send many existing businesses into an “ah fuck” scramble to take their fucking security seriously.

Anonymous Coward says:

Re: Re: So what's your solution?

Security is hard. Attacking is easier than defending. If you hold the vendors/software developers liable eventually the market will only have the few companies like Apple, Google, and Microsoft who have the money and talent to make their products nominally secure.

Anonymous Coward says:

Re: Re: Re: So what's your solution?

Not true.

If you want to do it right, then you need an established set of guidelines everyone has to follow, similar to how NIST handles password guidelines. If you don’t meet the minimums, you are exposed to a lawsuit.

Security is a serious issue. Your logic would dictate that it’s okay for the TSA to hire incompetents for airport security... oh fuck, they already do? No fucking wonder! Do you work for the TSA?

Security, hard or not, is necessary. If you are not prepared to do it right, it’s one of those things you shouldn’t be doing at all!
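For what it’s worth, a floor like the one NIST’s password guidance (SP 800-63B) sets is easy to state in code. A minimal illustrative sketch in Python — the blocklist here is a tiny stand-in for a real breached-password corpus, and the function name is hypothetical:

```python
# Sketch of a NIST SP 800-63B-style password check: a minimum length,
# a generous maximum, and a blocklist of known-compromised passwords.
# Notably, no arbitrary composition rules (NIST advises against them).
COMPROMISED = {"password", "123456", "qwerty", "letmein"}  # stand-in corpus

def password_acceptable(pw: str, min_len: int = 8, max_len: int = 64) -> bool:
    """Return True if the password meets the baseline requirements."""
    if not (min_len <= len(pw) <= max_len):
        return False
    if pw.lower() in COMPROMISED:
        return False
    return True
```

The point being that a standard like this names outcomes (minimum length, no known-compromised secrets) rather than specific devices or products.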

Anonymous Coward says:

Re: Re: Re:2 So what's your solution?

“Set of guidelines everyone needs to follow”: there are so many different devices doing different things that I think it would be difficult to make meaningful standards for devices that perform so many different functions and have different expectations of security. But if you could do that, something like the Energy Star program could serve as a model. It’s a voluntary program, well known and understood by consumers.

Anonymous Coward says:

Re: Re: Re:3 So what's your solution?

I think you have brain damage.

Standards would transcend the devices, kind of like how NIST password standards have nothing to do with specific devices. But I guess you would not be intelligent enough to understand how that would work, would you? I feel like I am talking to a politician who likes to talk about shit they know nothing about. Is this Trump?

Hopefully the next person who has to talk with you about anything that requires knowledge or brain cells gets the option of a refund!

Thad (user link) says:

Re: Re: Re: So what's your solution?

Security is hard.

I’ve never had any difficulty not leaving an open Telnet port with a hardcoded root password.

If you hold the vendors/software developers liable eventually the market will only have the few companies like Apple, Google, and Microsoft who have the money and talent to make their products nominally secure.

Wait, so you’re arguing that companies that don’t have the money and talent to make their products secure ("nominally" or otherwise) should continue to stay in business and sell their insecure products?

TKnarr (profile) says:

Re: Re: So what's your solution?

That’s already been thought of. That’s why the “terms of service” for connected devices commonly include clauses preventing users from joining class-action suits and requiring them to first go through manufacturer-friendly arbitration before filing an individual lawsuit (and often making the consumer liable for the company’s legal costs if the consumer fails to win the suit, where in the normal course of legal proceedings they wouldn’t be). Lawsuits aren’t a real threat when no individual consumer can show enough damages to cover the costs of suing and collective actions are prohibited.

Anonymous Coward says:

Re: Re: Re: So what's your solution?

Exactly, which is why we need things like standards. If you as a business fall below them, then your TOS will not save your ass from a lawsuit. So if a person brings suit and a judge finds reason, that product goes to a blackhat session where it becomes a fucking field-day exercise in security breaching.

Once a person succeeds in their suit, they get a juicy stack of cash for their problems and the business is open to future lawsuits by other customers until they release a patch addressing the security vulnerabilities.

It will not take long for businesses to understand that if they don’t take it seriously they could be put out of business fast!

TKnarr (profile) says:

Re: Re: Re:2 So what's your solution?

I don’t think you understand the process. With these terms of service a person brings suit, the company moves for dismissal and referral to arbitration based on the TOS, the judge tosses the suit (out or over the wall to the arbitration panel) based solely on the TOS and never gets to the question of whether the complaint had any basis. And if they sue after arbitration, they have to shell out hundreds of thousands of dollars over a couple of years with no ability to recover any of it and the possibility of having to also cover the company’s legal fees even if the person wins.

OA (profile) says:

Re: So what's your solution?

This article bothers me because even while it acknowledges the "real threat" of poorly-secured devices, it offers no solution while pointing out all the problems (real or imagined) with the proposal that’s being offered.

If you agree that security of connected devices is a real problem that needs to be addressed, then what’s your solution? If not this idea, then what?

First off, I have no opinion on the article…

I’m not fond of this type of "reasoning". Commentary towards the assessment of a problem is perfectly valid. Furthermore, many solutions should be derived communally.

If one waits for action plans like the following:
1) Solve problem,

then you tend to get narrowly considered, cliché-like "solutions".

This sort of commentary is very close to straight-up obstructionism.

Obstructionism (or being "very close" to it) is usually about insincerity and/or malice. You reply as if the author’s insincerity is a given.

It’s very easy to find problems with any specific proposal.

It is very easy to make proposals that are careless, thoughtless, destructive or irresponsible. The author offers related discussion and arguments. You offer nothing!

It’s much harder to come up with better solutions. But nothing will ever get done if nobody ever offers better ideas.

Meaningless cliché. This whole comment reads like an attempt to prejudice the susceptible reader and as a blind defense of the criticized legislation. There are no actual arguments!

Roger Strong (profile) says:

Not The Singularity We Were Warned About

Other provisions of the bill urge sellers of connected devices to notify consumers at checkout where they can find the item’s privacy policy and information about security patches and updates.

What of devices that lack a screen for conveying that information? For example, the infamous internet-connected smart vibrator mentioned here a month ago.

The obvious solution is to add a voice chip so that it loudly starts explaining the We-Vibe privacy policy at checkout. That could take a while, so it may still be happily explaining security features on the bus home.

Christenson says:

How about some *good* language for the law?

Here’s my simple list of requirements for all IOT devices:

Conspicuously available disclosures PRIOR TO THE SALE of:
what the device is (where’s the model #? what does it do?),
the data it may collect,
how the collected data is secured,
how the collected data may be used, and by whom, in spite of being “secured” (hint: here’s looking at you, browser fingerprinters!),
the potential consequences of not securing that data,
how the internet connection may be disabled,
the consequences of disconnecting from the internet,
how the firmware may be updated,
and how its version may be determined.

And a couple of requirements:
Firmware must not be updated without in-person mechanical permission such as pressing a button.
Internet disconnection must be reasonably simple and not otherwise damage the device. Maximum tools required: screwdriver, wire cutters, or USB/network cable and computer.
Firmware updates must be offered to all customers on an anonymous basis.
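The “in-person mechanical permission” requirement above is simple to express in firmware logic. A hypothetical sketch — real firmware would hook `button_pressed` to a hardware interrupt, and all names here are made up for illustration:

```python
class UpdateGate:
    """Allows a firmware update only within a short window after a
    physical button press, so updates require in-person permission."""

    def __init__(self, window_s: float = 30.0):
        self.window_s = window_s      # how long a press stays valid
        self._pressed_at = None       # timestamp of last button press

    def button_pressed(self, now: float) -> None:
        # In real firmware this would be the button's interrupt handler.
        self._pressed_at = now

    def may_update(self, now: float) -> bool:
        # Update is permitted only if the button was pressed recently.
        return (self._pressed_at is not None
                and 0 <= now - self._pressed_at <= self.window_s)
```

The design choice is that silence is a denial: with no recent press, the gate refuses, so a remote attacker (or an over-eager vendor) cannot push firmware without someone physically at the device.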

Anonymous Coward says:

Easier fix

Just reclassify security defective devices as defective in the product recall sense. When the companies are faced with recalling devices they cannot or will not fix, they’ll seriously consider designing the devices to be either correct the first time or, where that’s not viable (and, for some devices, it won’t be), sufficiently field serviceable that they can — and do — fix the problems when problems are reported. As is, you get the worst of all worlds: the complete lack of support/interest after the point of sale that is common in pure software, but the rapid market turnover of embedded devices, allowing the manufacturer to profit and move on before their mistakes catch up to them.

TKnarr (profile) says:

Hmm. Who does the R Street Institute represent (as in, who are they being paid by)? The arguments Ms. Hobson presents look like they take the proposed law and interpret every clause in the most disadvantageous manner (even when that contradicts the black-letter words of the proposal). The result is arguments that amount to, e.g., "A toaster doesn’t have a full screen to display details like a computer would, so it’s impossible for a toaster to comply", easily countered by "State clearly in the manual what information is collected and transmitted, then either state that it’s continuously collected/transmitted while the toaster is powered on, or add one single LED and say that the LED being lit means data collection/transmission is in progress". The whole thing smacks of an attempt to argue that we shouldn’t hold manufacturers to any legal standard and should leave it entirely up to them to voluntarily do the right thing.

Well, if they would voluntarily do the right thing, we’d never have gotten to the point where a law like this is proposed.
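For what it’s worth, the single-LED approach is trivial to implement. A hypothetical sketch, assuming the device exposes some callable for driving the LED (every name here is invented for illustration):

```python
class CollectionIndicator:
    """Keeps one LED lit whenever any data-collection or data-transmission
    task is active, and dark when none are."""

    def __init__(self, set_led):
        self._set_led = set_led   # callable: True = lit, False = dark
        self._active = set()      # names of currently active tasks

    def begin(self, task: str) -> None:
        # Call when a collection/transmission task starts.
        self._active.add(task)
        self._set_led(True)

    def end(self, task: str) -> None:
        # Call when a task finishes; LED goes dark only when all are done.
        self._active.discard(task)
        if not self._active:
            self._set_led(False)
```

Tracking a set of active tasks (rather than a single flag) means overlapping activities — say, microphone capture and a telemetry upload — keep the LED lit until the last one ends.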

Anonymous Coward says:

Re: Re: Re:

“Who benefits from avoiding product liability?”

Open source. It is the norm for free and open source software to come with no warranty and exclusion of liability. Yet many projects have more features and better security than proprietary options.

If you kill the market for cheap Chinese routers, you lose many of the platforms that OpenWRT runs on, driving up the price of secure routers. Already the threat of security regulation has caused vendors like TP-Link to make it harder to install third-party firmware, making the routers permanently insecure once the vendor decides not to support them anymore.

chrisbyrnes (profile) says:

Most of you are seriously underestimating this problem

As one who deals with this problem professionally on a daily basis, I assure you that legislation is critically needed. Insecure devices place more than their owners at risk. They become pawns in criminal enterprises. They are used to undermine the Internet. Device manufacturers continue to ignore this. Continued economic growth, at this point, will require legislation. As usual, California first.
