Disappointing: Apple The Latest To Abuse DMCA 1201 To Try To Stifle Competition, Security Research, Jailbreaking And More

from the come-on-guys dept

Back in August, Apple kicked off an already questionable lawsuit against Corellium, makers of virtualization software that would let users create and interact with “virtual” iOS devices. It is a useful tool for a variety of reasons, including (importantly) for security researchers trying to hunt down bugs on a virtual iPhone. Over the last few months, security researchers in particular have been raising the alarm about this lawsuit. Then, just before the New Year, Apple made things much, much worse with an amended complaint that takes Section 1201 of the DMCA to new and even more ridiculous heights.

As Corellium’s CEO Amanda Gorton noted in an open letter, this appeared to be Apple using copyright law to completely shut down the idea of jailbreaking:

Apple’s latest filing against Corellium should give all security researchers, app developers, and jailbreakers reason to be concerned. The filing asserts that because Corellium “allows users to jailbreak” and “gave one or more Persons access… to develop software that can be used to jailbreak,” Corellium is “engaging in trafficking” in violation of the DMCA. In other words, Apple is asserting that anyone who provides a tool that allows other people to jailbreak, and anyone who assists in creating such a tool, is violating the DMCA. Apple underscores this position by calling the unc0ver jailbreak tool “unlawful” and stating that it is “designed to circumvent [the] same technological measures” as Corellium.

Apple is using this case as a trial balloon in a new angle to crack down on jailbreaking. Apple has made it clear that it does not intend to limit this attack to Corellium: it is seeking to set a precedent to eliminate public jailbreaks.

We are deeply disappointed by Apple’s persistent demonization of jailbreaking. Across the industry, developers and researchers rely on jailbreaks to test the security of both their own apps and third-party apps – testing which cannot be done without a jailbroken device. For example, a recent analysis of the ToTok app revealed that an Apple-approved chat app was being used as a spying tool by the government of the United Arab Emirates, and according to the researchers behind this analysis, this work would not have been possible without a jailbreak.

You really should read the Apple filing directly. It is not subtle in what it is seeking to argue. It claims that any virtualization of its software is copyright infringement, and that any attempt to jailbreak its software violates Section 1201 of the DMCA, which is the anti-circumvention or “digital locks” part of the DMCA. We’ve long found 1201 to be incredibly problematic in general, and believe it should be dumped entirely as it has served to regularly prevent perfectly legal uses that might create competition. Here, however, Apple is taking the argument much, much further, and suggesting that because some security researchers might use the product for bad reasons, that alone proves that Corellium’s offering is not done in good faith.

A key argument is that because security researchers using Corellium don’t always report bugs directly to Apple, that proves Corellium is a bad actor. This is a huge stretch and would be a very dangerous interpretation of the law.

Although Corellium paints itself as providing a research tool for those trying to discover security vulnerabilities and other flaws in Apple’s software, Corellium’s true goal is profiting off its blatant infringement. Far from assisting in fixing vulnerabilities, Corellium encourages its users to sell any discovered information on the open market to the highest bidder. Indeed, Corellium’s largest customer admits that it has never reported any bugs to Apple.

Apple strongly supports good-faith security research on its platforms, and has never pursued legal action against a security researcher. Not only does Apple publicly credit researchers for reporting vulnerabilities, it has created several programs to facilitate such research activity so that potential security flaws can be identified and corrected. Apple’s programs include providing as much as $1 million per report through “bug bounty” programs in accordance with the provisions of those programs. Apple has also announced that it will provide custom versions of the iPhone to legitimate security researchers to allow them to conduct research on Apple devices and software. These efforts recognize the critical role that members of the security research community play in Apple’s efforts to ensure its devices contain the most secure software and systems available.

The purpose of this lawsuit is not to encumber good-faith security research, but to bring an end to Corellium’s unlawful commercialization of Apple’s valuable copyrighted works. Accordingly, Apple respectfully seeks an injunction, along with the other remedies described below, to stop Corellium’s acts of naked copyright infringement.

Before we get into the legal issues, just note carefully what Apple is arguing in the above three paragraphs. It is saying, in effect, that the only “good-faith security research” is that done in accordance with Apple’s concept of what is good-faith research. That should worry everyone. While it is true that Apple is rather accommodating of many security researchers, letting the company determine what qualifies as good security research on its own products, with significant legal liability attached to falling on the wrong side, should scare everyone. Even if Apple is a good steward of the research community, tons of other companies are not, and such a precedent would be hugely problematic.

As for the specifics of the lawsuit, Apple seems particularly perturbed that Corellium advertises its products to security researchers to hunt down bugs.

In August 2019, Corellium specifically emphasized, at the international cybersecurity Black Hat USA Conference, that the Corellium Apple Product is an exact copy of Apple’s copyrighted works, designed specifically to allow researchers and hackers to research and test their vulnerabilities, by “run[ing] real iOS – with real bugs that have real exploits.” In other words, the Corellium Apple Product is designed to find and exploit flaws in iOS. And Corellium’s Apple Product does so by, among other things, enabling its users to circumvent the technological protection measures that are designed to limit where and how Apple’s copyrighted works can be used.

Relatedly, it is clear that Apple considers the process of jailbreaking itself to violate copyright laws, which is bullshit.

On April 1, 2019, Corellium again highlighted the unlawful ends to which its product is aimed by publicly acknowledging that it had given access to its platform to the developers of code used to jailbreak iOS devices called “unc0ver,” so the developers could test the jailbreaking code “on any device running any firmware” and distribute that code to the public. Within weeks, those developers released a new version of unc0ver that allowed jailbreaking of iOS 12. In other words, Corellium has admitted not only that its product is designed to circumvent technological protection measures Apple puts in place to prevent access…

A decade ago, Apple had also tried to make the argument that jailbreaking your iPhone was copyright infringement, and partly as a result, the Library of Congress made it clear that jailbreaking mobile devices was not infringing under 1201. Indeed, the Library of Congress’s triennial exemptions still include jailbreaking phones. But… part of the issue is that the exemptions only cover you jailbreaking your own device, not a third-party company offering a service or software to do it for you.

The details of the 1201 claims here are important. Kyle Wiens, over at iFixit, has a really good breakdown of many of the issues. But Apple’s claims seem incredibly weak here:

The Copyright Act prohibits trafficking in products that are used to modify iOS and circumvent technological controls that protect copyrighted works. These “anti-trafficking” provisions, 17 U.S.C. § 1201(a)(2) and (b), make it unlawful for any person to “manufacture, import, offer to the public, provide, or otherwise traffic in any technology, product, service, device, component, or part thereof” that is primarily designed, produced, or marketed for the purpose of circumventing technological measures that either effectively control access to a copyrighted work (section 1201(a)(2)), or that protect the exclusive rights of a copyright owner (section 1201(b)).

But it’s not at all clear how offering a virtualization product that allows for jailbreaking is “primarily designed… for the purpose of circumventing technological measures.” It’s primarily designed as a tool for security researchers. As Kyle points out, if Apple gets its way, that’s bad news for lots of other products as well:

Apple is arguing that no one else should be able to make tooling for performing security research on their products. What happens if other companies start making the same claims?

This isn’t academic. Last year, GM sued aftermarket parts company Dorman for “overriding the security measures used in [GM]’s vehicle control modules” in their transmission repair tool. Dorman’s aftermarket transmissions moved the firmware from an existing transmission into their aftermarket part, so that it would be recognized by the vehicle and work.

John Deere has also been aggressively locking down their products, aiming to monopolize service and prevent farmers from doing repairs themselves. They opposed a DMCA exemption for farmers on the grounds that if owners could fix their own equipment, they might use their newfound freedom to pirate Taylor Swift’s music on their tractors.

As he notes, Apple understands all of this and should know better.

Meanwhile, Matt Tait highlights a separate but equally problematic part of the lawsuit: Apple seems to be suggesting that the only acceptable security research is that done under Apple’s approval. That’s also worrying, not because Apple is particularly bad in how it engages with security researchers (as noted above, the opposite is true), but because of the precedent this would set for others, both about the nature of security work and about how DMCA 1201 might be further abused to shut down competition, ancillary markets, security research and more. It’s a head-on attack on the concept of property rights and ownership, abusing the DMCA. It’s an incredibly disappointing move from Apple, a company that should know better.

Companies: apple, corellium


Comments on “Disappointing: Apple The Latest To Abuse DMCA 1201 To Try To Stifle Competition, Security Research, Jailbreaking And More”

Anonymous Coward says:

Security research

While it is true that Apple is rather accommodating of many security researchers, letting the company determine what qualifies as good security research on its own products, with significant legal liability attached to falling on the wrong side, should scare everyone.

It shows quite a sense of entitlement on Apple’s part. They release insecure software that puts everyone at risk, and not only do they not expect to be sued for that, they expect people to help Apple on Apple’s terms—which include keeping the details secret from the public for a while. Apparently, if I notice Apple’s made a mistake, I owe them.

Anonymous Coward says:

Re: Re: Security research

Well if you pay for apple devices, you do not own them, but rather Apple owns you, and can tell you how to deal with problems in their devices.

Apple will be the first to tell you their systems are wonderfully secure because their TOS prohibit anyone from doing anything bad to them. Nothing to see here, move along citizen.

Anonymous Coward says:

Re: Re:

That license agreement isn’t even involved here. Beyond that, too bad, if i own a copy of something, i will virtualize it when i want, if i can. I will also install something on the hardware of which mommy didn’t approve.

Apple can suck it, and quite frankly, this is exactly why i have some issues with the most permissive open-source licenses.

Mike Masnick (profile) says:

Re: Re:

If users violate Apple’s license agreement then that’s between Apple and those users. Suing the software maker who makes such things possible, who is providing a tool, is pretty silly. And that’s doubly true in this case, where the tool is not used in competition with Apple, but rather as a virtualization tool.

urza9814 (profile) says:

Re: Re: Re:

From TFA (emphasis added):

"…the Corellium Apple Product is an exact copy of Apple’s copyrighted work, designed specifically to allow researchers and hackers to research and test their vulnerabilities, by ‘run[ing] real iOS – with real bugs that have real exploits.’"

So if the product that they’re distributing is or contains a copy of iOS as claimed, then they are required to comply with the license agreement under which iOS is distributed…right? What part of this am I missing?

Tom, Dick & Harry (profile) says:

Re: Re:

The result of this case, whether Apple is successful or not, will have a lasting effect on many areas: businesses, communities, researchers, and those outside the computational world as well.

If interoperability is to remain and grow across software, hardware, and cybersecurity, then pray that the bad Apple falls.

Code Monkey (profile) says:

For a technologically savvy company

Apple, like any other phone manufacturer, already has the capability to ping a phone and be able to determine if it’s been jailbreaked (jailbroke??) or not.

If they determine that the phone has been tampered with, they can just brick the phone.

Sure, it’s an asshole move, but probably after the 5th or 6th person has their phone turn into a $1200 piece of furniture, people will stop tampering with their phones.
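Worth unpacking the premise here: in practice, jailbreak detection is usually done on-device rather than by a remote "ping," by probing for filesystem artifacts a jailbreak leaves behind. A toy sketch of that heuristic, in Python purely for illustration (the artifact list and the `probe_root` parameter are assumptions for testability, not any vendor's actual API):

```python
import os

# Filesystem artifacts commonly associated with jailbroken iOS devices.
# This list is illustrative, not exhaustive or authoritative.
JAILBREAK_ARTIFACTS = [
    "Applications/Cydia.app",
    "Library/MobileSubstrate/MobileSubstrate.dylib",
    "bin/bash",
    "usr/sbin/sshd",
    "etc/apt",
]

def looks_jailbroken(probe_root="/"):
    """Heuristic check: True if any known jailbreak artifact exists.

    probe_root lets the heuristic be exercised against a test directory
    instead of the real filesystem root.
    """
    for artifact in JAILBREAK_ARTIFACTS:
        if os.path.exists(os.path.join(probe_root, artifact)):
            return True
    return False
```

Every one of these checks can be spoofed by the jailbreak itself, which is part of why "detect and brick" would be both blunt and unreliable.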

Scary Devil Monastery (profile) says:

Re: For a technologically savvy company

"Sure, it’s an asshole move, but probably after the 5th or 6th person has their phone turn into a $1200 piece of furniture, people will stop buying Apple."

Fixed That For You.

We’ve seen plenty of examples of major manufacturers which have gone down in flames and hysteria over control issues of this kind.

Code Monkey (profile) says:

Re: Re: Re:2 For a technologically savvy company

No, SDM is still wrong. Just throwing "fixed that for you" doesn’t change the fact that phone companies can, if they choose, brick a phone that has been tampered with.

If a user is still stupid enough to tamper with their phone after TechDirt proves that it bricks phones of users who tamper, then that’s the user’s stupidity.

When you sign the TOC with the phone company, you immediately give up your rights to "reverse engineer" or "attempt to circumvent" any of the phone’s systems.

I’m not making a judgment as to whether the company is right or wrong. I will agree with SDM that other companies have gone down in flames for such fuckery, but I believe my original assertion that the phone companies COULD brick a phone still stands.

Rocky says:

Re: Re: Re:3 For a technologically savvy company

No, SDM is still wrong. Just throwing "fixed that for you" doesn’t change the fact that phone companies can, if they choose, brick a phone that has been tampered with.

It depends on which jurisdiction the phone was sold in, because some jurisdictions actually have consumer laws with some teeth: if a consumer bought a piece of hardware, they are fully within their rights to do whatever they want with it, since they actually own it, and the manufacturer/phone company can’t do jack shit about it. They would actually be breaking the law if they bricked the phone intentionally. The only thing they can do is refuse service.

And as for the "sign the TOC" (TOS? EULA?) part: I can buy a phone from a 3rd party without signing ANYTHING and do whatever I want with it. Every smartphone I have owned I have rooted and re-installed with custom firmware, and the manufacturer and phone company can take a hike as far as I’m concerned.

Code Monkey (profile) says:

Re: Re: Re:4 For a technologically savvy company

I’m right there with you. The FIRST thing I do is root my phone and write my own damn software.

Like many in this post, I refuse to have anything to do with Apple, but from a purely developer standpoint.

That said, if Verizon decides to brick my Galaxy S8 because I rooted it, that would be on ME, not Verizon. Yes, I’d be pissed, but, hey I took the chance when I rooted my phone.

And yes, I can buy another phone, and yes, I can buy third party, but I have to outlay that money for the new phone because I violated the rules.

From Verizon’s own website (I know, this story is about Apple, but that means I’d have to sully my hands going to Apple’s website…)

"..We further reserve the right to take measures to protect our network and other users from harm, compromised capacity or degradation in performance. These measures may impact your service, and we reserve the right to deny, modify or terminate service, with or without notice, to anyone we believe is using Data Plans or Features in a manner that adversely impacts our network. We may monitor your compliance, or the compliance of other subscribers, with these terms and conditions, but we will not monitor the content of your communications except as otherwise expressly permitted or required by law"

https://www.verizonwireless.com/support/important-plan-information/

Fair point. I should have said "read the TOC" and sign the contract. 🙂

Scary Devil Monastery (profile) says:

Re: Re: Re:5 For a technologically savvy company

"From Verizon’s own website (I know, this story is about Apple, but that means I’d have to sully my hands going to Apple’s website…)"

The day Verizon starts building their own phones and UI, that’ll be an issue…
…but right now and here I’m pretty sure that as long as you use a verizon SIM run on hardware fulfilling all measurable specifications, Verizon won’t give you a hassle over whether you run your phone with or without root, as long as they can’t detect any fuckery coming between your phone and their network.

If anyone knows of a case where Verizon or any other telco has tried to shitcan their account over which phone they ran then that’s…pretty scary, actually. Also unlikely.

"That said, if Verizon decides to brick my Galaxy S8 because I rooted it, that would be on ME, not Verizon. Yes, I’d be pissed, but, hey I took the chance when I rooted my phone."

If Verizon is ever in the position of being able to tell whether or not you rooted your own 3rd-party manufactured phone then it won’t really matter what their TOC’s say, because you’d be taking them to court over – in effect – stealing your phone.

Scary Devil Monastery (profile) says:

Re: Re: Re:3 For a technologically savvy company

"I will agree to SDM stating that other companies have gone down in flames for such fuckery, but I believe my original assertion that the phone companies COULD brink a phone still stands."

Possible, if they retain hardware access after a user has obtained root and possibly installed a 3rd-party OS like cyanogen…
…and no, you’re still wrong because you weren’t talking about technology. you were claiming that phone users would stop tampering if the OEM started bricking.

Well, I’ll give you they might stop tampering on that phone brand. They’d be tampering on one built by someone slightly less demented.

Corporations can and will run dick moves, dumb moves, outright demented moves every now and then…
…but it takes a very special company to screw themselves out of the market completely by removing what is a key segment of user convenience.

Currently I can think of one company which has tried something slightly similar – Sony, with their PS3 "other OS" forced retraction – which gave xbox an instant christmas win in the markets – and before that with one of their cameras they hardwired to not accept anything other than Sony-made SD’s. But even so, Sony at most decided to bar unlocked devices from their networks.

"When you sign the TOC with the phone company, you immediately give up your rights to "reverse engineer" or "attempt to circumvent" any of the phone’s systems."

I think that already exists in Apple’s boilerplate EULA, and SCOTUS struck that one down in bolts of thunder, rendering jailbreaking indisputably legal.

Apple’s new stunt here, trying to get the same result by using a different law, may show a more favorable outcome for Apple, but one, i believe, which will not benefit Apple (or any other OEM with a proprietary UI and control issues) in the market.

Scary Devil Monastery (profile) says:

Re: Re: Re: For a technologically savvy company

"Correctly fixed that for you"

No you didn’t.

Once the first few manufacturers start trying to brick their phones over someone choosing to install cyanogen rather than plain old google android, those manufacturers will no longer sell any phones.

This is why Apple today has a 14% smartphone market share where it used to be 100%.

So tell me why a smartphone OEM, especially one which has an Android One rollout alongside its own chosen UI, would ever be dumb enough to take its marketshare out back and shoot it in the way you suggest?

There’s a reason why no tech OEM has ever tried a stunt like that, even when they’ve shown themselves able to make monumentally bad decisions otherwise. They know damn well the very second they pull that stunt they’ll be replaced by those of their competitors who only had to not be outright insane.

urza9814 (profile) says:

Re: Re: Re:2 For a technologically savvy company

This is true until Google gets that Fuchsia thing rammed through. Right now there’s still competition. You’ve got some fairly major companies releasing phones with fairly open variants of Android. But when there’s two dominant platforms and neither one has any open source variants? If we don’t keep LineageOS, CyanogenMod, and the other Android variants alive, it will be all too easy for the only two companies in the market to lock out modified software versions, and they’ll have good reason to do so in order to protect their market. I would actually bet on Google doing it first though rather than Apple — Apple is already so locked down that there would be less fear of people jumping to the alternative, then once Google does it Apple would have no reason not to.
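The "lock out modified software versions" scenario described above is essentially verified boot: the bootloader refuses any OS image whose hash or signature is not on the manufacturer's allowlist. A deliberately simplified sketch of that gatekeeping logic (a plain hash allowlist stands in for a real signature chain; all names and image strings here are illustrative, not any vendor's actual scheme):

```python
import hashlib

# Hashes of OS images the (hypothetical) manufacturer has blessed.
APPROVED_IMAGE_HASHES = {
    hashlib.sha256(b"stock-android-build-10.1").hexdigest(),
}

def bootloader_accepts(image_bytes):
    """Toy verified-boot check: only boot images on the allowlist."""
    return hashlib.sha256(image_bytes).hexdigest() in APPROVED_IMAGE_HASHES
```

Real implementations (Android Verified Boot, Apple's secure boot chain) verify cryptographic signatures against keys fused into hardware rather than comparing against a hash list, but the accept/reject logic at boot is roughly the same shape.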

techflaws (profile) says:

"It is saying, in effect, that the only "good-faith security research" is that done in accordance with Apple’s concept of what is good-faith research."

And we all know how right Apple is about things! Remember "you’re holding it wrong" or how they removed RSS support but still kept it registered with Safari so you cannot use it?

https://mjtsai.com/blog/2019/12/26/apple-news-no-longer-supports-rss/
