Search for stories about Apple's App Store in the Techdirt archives and you will quickly notice a theme: Apple routinely appoints itself the arbiter of artistic quality and morality when it comes to content in the App Store, particularly gaming content, and its application of those standards swings like some kind of absurd pendulum. Ban a game over here for telling a bible story that includes violence against children, but allow the actual bible to be sold as well. React to the South Carolina massacre by pulling down games about the Civil War because they include images of the Confederate flag. Reject a wargaming simulation, then approve it, and nobody knows how the company might decide to react tomorrow. Stability is supposed to breed a good environment for business, yet Apple runs its App Store like some kind of experiment in chaos.
And in order to apply its standards in a way that apparently makes the folks at Apple feel all warm and fuzzy inside, it occasionally stoops to explanations that amount to outright lying. For instance, Apple recently disallowed a game about surviving in the Gaza Strip from its store, claiming it wasn't a game at all but a news publication, even though the briefest review of the app reveals that it's obviously a game.
A game about the Palestine/Israel conflict, Liyla and The Shadows of War, has proved too political for Apple. The technology giant ordered the developer, Rasheed Abueideh, to remove Liyla from the games section of its iTunes app store, claiming it isn't a game and should sit in the news section.
The real question is, is Liyla and The Shadows of War a game? I played it last night, as Liyla is available from Google Play. It's a short platformer with a powerful message and stunning graphics.
The writer goes on from there to describe the plot, the inclusion of reactions to real life events, the graphical elements of the game, and the, well, gameplay. Because it's a game. You play through to either a win or a loss scenario, there are choices to be made, puzzles to be solved, and stages to complete. It's a platformer, like Mario Bros.
So, why the ban and the lies to support it? Well, one can understand that the Middle East conflict and the ongoing crisis between the Palestinians and the Israelis is among the most touchy of subjects. For a company that wants to keep its brand and its App Store squeaky clean, at least in its own mind, one can imagine that this kind of thing is something Apple wouldn't want to touch. But, misguided as this already is, it becomes all the more so when it can't even bother to stay consistent on the matter. The App Store has available for purchase, for instance, Israeli Heroes, which appears to be an Angry Birds clone in which you lob missiles at bombs that reside under a crescent moon and oh my god, I think I'm about to have an embolism, because come on.
As always, in the midst of this nonsense, the game is available for Android devices, because that garden has no wall around it.
For once, the phrase 'relax, it's just a game' seems apt. Apple take note. Liyla and The Shadows of War is available for Android on Google Play – it's free, it's short and it's definitely a game worth playing.
We've said it before, but we'll say it again: it'd be best if Apple would get out of the art critique business. They're not very good at it.
Senators Richard Burr and Dianne Feinstein are not giving up that quickly on their ridiculous and technically ignorant plan to outlaw real encryption. The two have now penned an op-ed in the WSJ that lays out all the same talking points they've laid out before, without adding anything new. Instead, it just continues to make statements that show how incredibly ignorant they are. The piece is called Encryption Without Tears (and may be paywalled, though by now everyone knows how to get around that), which already doesn't make any sense. What they're pushing for is ending basic encryption, which will lead to many, many tears.
It starts out with their standard ridiculous line, pretending that because a company builds end-to-end encryption, it's acting "above the law."
In an increasingly digital world, strong encryption of devices is needed to prevent criminal misuse of data. But technological innovation must not mean placing individuals or companies above the law.
People have gone over this time and time again: this is not about anyone being "above the law." It's about whether or not companies can be forced to directly undermine the safety and security of their products (and the public). A paper shredder can destroy evidence. A paper shredder maker is not "above the law" when it decides not to build a system for piecing back together the shreds.
And speaking of "above the law," I still don't see Feinstein or Burr commenting on the FBI/DOJ announcing that it will ignore a court order to reveal how it hacked into computers over Tor. That is being above the law. That is a situation where a court has asked for information that the FBI absolutely has, and the FBI is just saying "nope." If Burr and Feinstein are really worried about anyone being "above the law," shouldn't they worry about this situation?
Over the past year the two of us have explored the challenges associated with criminal and terrorist use of encrypted communications. Two examples illustrate why the status quo is unacceptable.
I love this. They give two examples that have been rolled out a bunch in the last few weeks: the attack in Garland, Texas, where the attackers supposedly exchanged some messages with potential ISIS people, and the case of Brittney Mills, who was tragically murdered, and whose case hasn't been solved. Mills had her smartphone, but no one can get into it. Of course, it took nearly two years of fretting before law enforcement could dig up these two cases, and neither makes a very strong argument for why we need to undermine all encryption.
It's a simple fact that law enforcement never gets to have all of the evidence. In many, many criminal scenarios, that's just the reality. People destroy evidence, law enforcement doesn't find it, or law enforcement simply doesn't understand it. That's not the end of the world. This is why we have police detectives, who are supposed to piece together whatever evidence they do have and build a picture of the case. Burr and Feinstein are acting as though, in the past, law enforcement was immediately handed all the evidence. That's never been the way it works. Yes, law enforcement doesn't get access to some information. That's how it works.
You don't go and undermine the very basis of computer security just because law enforcement can't find a few pieces of evidence.
Our draft bill wouldn’t impose a one-size-fits-all solution on all covered entities, which include device manufacturers, software developers and electronic-communications services. The proposal doesn’t define the technological solutions or tell businesses how to solve the problem.
This is also misleading. The bill requires an end to real encryption. That's it. Real encryption means that only one person has the key. This is what Burr and Feinstein don't seem to get. They seem to think it's trivial to leave a key with Apple or whoever. But as basically every crypto expert has explained, it is not. Doing so creates a vulnerability... and worse, it's a vulnerability that cannot be patched. That's hellishly dangerous. Sure, the bill doesn't tell them exactly how to do this, but it does make it clear: you cannot offer real encryption, you can only offer something that can be hacked. That's a problem.
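To make the single-key point concrete, here's a toy sketch in Python. The XOR "cipher" below is emphatically not real cryptography, just an illustration of the two models: end-to-end encryption, where only the user holds the key, versus the escrowed-key arrangement the bill effectively demands, where a second copy of the key exists and becomes a standing target.

```python
import os

def xor_cipher(data, key):
    # Toy XOR "encryption" -- NOT real cryptography, purely an illustration.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
user_key = os.urandom(len(message))        # only the user holds this
ciphertext = xor_cipher(message, user_key)

# End-to-end model: with the user's key, the ciphertext opens; without it,
# there is simply nothing the vendor can hand over.
assert xor_cipher(ciphertext, user_key) == message

# Escrow model: a second copy of the key is held "for law enforcement."
escrowed_key = user_key
# Anyone who steals that escrowed copy reads everything, too.
assert xor_cipher(ciphertext, escrowed_key) == message
```

The design point is that the escrowed copy doesn't know or care whether the person presenting it carries a badge; that's the unpatchable vulnerability the crypto experts keep describing.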
We want to provide businesses with full discretion to decide how best to design and build systems that maintain data security while at the same time complying with court orders.
We want to provide businesses with full discretion to decide how best to travel back in time, in order to prevent crimes.
Seriously: this is basically the same thing that Burr and Feinstein are saying here. They're asking for something that's impossible, and acting like it's a routine suggestion. If they need to comply with these All Writs Act style orders, they cannot build systems that maintain data security. That's a fact. It's mind-boggling that Burr and Feinstein still can't understand this.
Critics in the industry suggest that providing access to encrypted data will weaken their systems. But these same companies, for business purposes, already maintain and have access to vast amounts of encrypted personal information, such as credit-card numbers, bank-account information and purchase histories.
Argh. This paragraph shows that whatever poor staffer Burr and Feinstein assigned to write this drivel doesn't understand even the first thing about the subject. Storing encrypted passwords, credit card info, bank account info, etc. is a totally different thing. Those are encrypted (or, in the case of passwords, hashed) precisely to keep them safe, and part of the point is that even the companies themselves cannot reveal them. This example proves the opposite of what Burr and Feinstein think it does. Companies hash passwords and encrypt credit card info and the like so that they're not storing the plaintext, and there's no easy way for anyone to get at it. That protects user data, and the companies genuinely cannot produce the plaintext. They're comparing hashes. That's what keeps it safe.
If we received a court order demanding our users' passwords, we couldn't provide them, because they're hashed, not stored as plaintext. We don't know our users' passwords and can't give them to you. When someone logs in to our website, we compare a hash of the password they enter to our stored hash, and if they match, we let them in. But we don't know what their password is. So this is a terrible example that actually cuts against what Burr and Feinstein are saying. Those encrypted stores of information would be illegal under this bill!
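A minimal sketch of how that works in practice, using only Python's standard library (the function names here are my own, not from any particular company's codebase): the server stores a random salt and a one-way hash, so even under a court order there is no plaintext password to hand over.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # Derive a one-way hash; the plaintext password is never stored.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    # Re-derive the hash from the login attempt and compare in constant time.
    _, attempt = hash_password(password, salt)
    return hmac.compare_digest(attempt, stored_digest)

# The server keeps only (salt, digest); the password itself is unrecoverable.
salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

Note that verification never decrypts anything; it repeats the derivation and compares digests, which is exactly why "we already store encrypted credit card numbers" tells you nothing about key escrow.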
We are not asking companies to provide law enforcement with unfettered access to encrypted data. We aren’t even asking companies to tell the government how they gain access to this encrypted data. All we are doing is asking companies to find a way to keep their data secure while also cooperating with law enforcement in terrorism and criminal investigations.
Again, that last line is impossible. They're asking the impossible -- and in the process, making everyone less safe. The only way to provide such info to law enforcement is to no longer keep the data truly secure. And the big concern is not unfettered access for law enforcement, but rather whatever this backdoor means for those with malicious intent, who will be very, very, very focused on finding these vulnerabilities and exploiting them.
President Obama said earlier this year, “You cannot take an absolutist view on this.” We agree—and believe that strong data security and compliance with the justice system don’t have to be mutually exclusive.
Because you don't know what you're talking about.
American technology companies have done some amazing things that are the envy of the world. We think that finding a way to achieve both goals simultaneously is not beyond their capabilities.
So, in the end, despite basically every cryptography expert telling them this is impossible, Burr and Feinstein come back with "NERD HARDER, NERDS!"
Once the DOJ told the court in San Bernardino that it had succeeded in hacking into the iPhone of Syed Farook, the big question people asked is whether or not the FBI would then tell Apple about the vulnerability. After all, the administration set up the so-called "Vulnerabilities Equities Policy" (VEP) with the idea of sharing most vulnerabilities it discovers with companies. The White House directly stated:
One thing is clear: This administration takes seriously its commitment to an open and interoperable, secure and reliable Internet, and in the majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest. This has been and continues to be the case.
This spring, we re-invigorated our efforts to implement existing policy with respect to disclosing vulnerabilities – so that everyone can have confidence in the integrity of the process we use to make these decisions. We rely on the Internet and connected systems for much of our daily lives. Our economy would not function without them. Our ability to project power abroad would be crippled if we could not depend on them. For these reasons, disclosing vulnerabilities usually makes sense. We need these systems to be secure as much as, if not more so, than everyone else.
Still, one could make a strong case that this vulnerability should be disclosed... even if almost no one expected it to be. Amusingly, just a few days ago, Apple revealed that the FBI used the VEP to disclose a vulnerability for the very first time, on April 14th, just as everyone was arguing about this. Of course, the flaw it revealed was not about hacking into the iPhone, and was actually a flaw that Apple had discovered and fixed... nine months ago. But, again, if this is the very first time the FBI has disclosed something to Apple, it certainly suggests that the VEP process generally means nothing gets disclosed. In fact, the timing of this really suggests that someone in the DOJ recently flipped out and realized that there's now going to be scrutiny on the VEP, so they might as well disclose something. Thus, they found an old bug that had already been patched and "revealed" it.
“The F.B.I. purchased the method from an outside party so that we could unlock the San Bernardino device,” Amy S. Hess, executive assistant director for science and technology, said in a statement.
“We did not, however, purchase the rights to technical details about how the method functions, or the nature and extent of any vulnerability upon which the method may rely in order to operate. As a result, currently we do not have enough technical information about any vulnerability that would permit any meaningful review” by the White House examiners, she said.
Now, some are arguing that this suggests absolutely terrible bargaining on the side of the DOJ/FBI. But, another interpretation is that it's how the DOJ knew that it wouldn't have to reveal the flaw to Apple. Of course, this might also explain why the DOJ at one point appeared to claim that the hack in question only worked for Farook's phone. They later claimed that was a misstatement, and it really meant that it only applied to that iPhone configuration. But, if the FBI never actually got the details, then in some sense they'd be right that for the FBI the crack only worked for that one phone. And if they wanted to do it on another phone, they'd have to shell out another ~$1 million or so...
That's what makes the Wall Street Journal's recent editorial, The Encryption Farce (possibly behind a paywall, though the version I just opened showed up fine), so remarkable. It completely bashes the FBI over its attempts to force Apple to build a backdoor into its encryption -- though mainly because of the ridiculous fact that in the two most high profile cases, the DOJ magically got into the phones just as the cases got serious. The WSJ editorial doesn't pull any punches, asking what the hell is going on at the Justice Department:
If history repeats itself first as tragedy and then as farce, what does the FBI have in store next for its encryption war with Apple? After withdrawing its demands in San Bernardino and then reopening hostilities with a drug prosecution in Brooklyn, the G-men abruptly dumped the second case over the weekend too. Is anyone in charge at the Justice Department, or are junior prosecutors running the joint?
The editorial goes on to mock the FBI's claim that these cases are all about getting into just that phone, and notes that constantly finding ways in at the last minute is destroying the FBI's credibility.
This second immaculate conception in as many months further undermines the FBI’s credibility about its technological capabilities. Judges ought to exercise far more scrutiny in future decryption cases even as Mr. Comey continues to pose as helpless.
It goes on to suggest that the FBI stop bringing these cases, and that the President and the DOJ should put an end to this ridiculous attack on encryption:
Yet forgive us if this “conversation” now seems more like a Jim Comey monologue. The debate might start to be productive if the FBI Director would stop trying to use the courts as an ad hoc policy tool and promised not to bring any more cases like the one in Brooklyn.
Meanwhile, the White House has taken the profile-in-courage stand of refusing to endorse or oppose any encryption bill that Congress may propose. If the Obama team won’t start adjusting to the technological realities of strong and legal encryption, they could at least exercise some adult supervision at Main Justice.
On its own, such an editorial might not seem like a huge deal, but coming from the Wall Street Journal -- a source that has previously championed much greater surveillance and even supported backdoors -- it's a surprising shift. And it shows just how badly the DOJ and FBI miscalculated in their attempts to use the courts to get their desired results in breaking encryption.
While so much of the attention had been focused on the case in San Bernardino, where the DOJ was looking to get into Syed Farook's iPhone, we've pointed out that perhaps the more interesting case was the parallel one in NY (which actually started last October), where magistrate judge James Orenstein rejected the DOJ's use of the All Writs Act to try to force Apple to help unlock the iPhone of Jun Feng, a guy who had already pled guilty on drug charges, but who insisted he did not recall his passcode.
There were some oddities in the case. Feng had pled guilty, and there was some question of whether there was still a need to get into the iPhone at all. The DOJ insisted yes, because Feng's iPhone might provide evidence needed to find others involved in the drug ring. The other oddity: Feng's iPhone was running iOS 7. While the device itself was a newer model than the one in the Farook case, it was running an older operating system, one that Apple (and others) were known to be able to get into easily. So it made no sense that the FBI couldn't get into this phone. In fact, Apple's latest filing in the case, just over a week ago, was basically along those lines, noting that the DOJ claimed Apple's assistance was "necessary," but that this seemed unlikely.
And... late on Friday, the DOJ did the exact same "run away!" move it did in the Farook case, telling the judge that it had suddenly been given the passcode, so there was no need to move forward with the case at all.
The government respectfully submits this letter to update the Court and the parties. Yesterday evening, an individual provided the passcode to the iPhone at issue in this case. Late last night, the government used that passcode by hand and gained access to the iPhone. Accordingly, the government no longer needs Apple's assistance to unlock the iPhone, and withdraws its application.
According to a (paywalled) WSJ article, Feng, who has been waiting for his sentencing, and thinking that his case was otherwise over, only just found out that there was this big fuss around his own case... and told the DOJ he miraculously remembered the passcode. Hallelujah. A miracle... and the DOJ was magically saved from a precedent it didn't want.
The Wall Street Journal reported last week that Mr. Feng only recently learned his phone had become an issue in a high-stakes legal fight between prosecutors and Apple. Mr. Feng, who has pleaded guilty and is due to be sentenced in the coming weeks, is the one who provided the passcode to investigators, according to people familiar with the matter.
Of course, it's worth noting, however, that while this particular case may be effectively over, it's not that great for the DOJ, in that magistrate judge James Orenstein's fairly epic smackdown of the DOJ earlier in the case never got officially reviewed. That ruling, of course, has no value as precedent, but that doesn't mean it won't be quoted or pointed to in other, similar cases.
On the flip side, of course, there's the argument that every time the case starts looking bad for the DOJ, they miraculously get into the phone in question. At the very least, this ought to raise questions about why the DOJ keeps insisting that it needs Apple's help... But the fact is these cases are going to keep coming.
On Thursday, FBI Director James Comey suggested that the FBI paid over a million dollars to a group of hackers who helped it get into Syed Farook's encrypted work iPhone. Of course, just as pretty much everyone predicted, the FBI found nothing of value on the iPhone. This was hardly a surprise. It was a case where we already knew who did it, and that they were already dead. We also know that they destroyed their two personal iPhones, leaving open the question of why anyone would think there was anything valuable on the work iPhone.
Specifically, Comey said that buying the exploit from this group cost the FBI "more than I will make in the remainder of this job, which is seven years and four months, for sure." Comey makes $185,100 per year at his job, implying that buying the exploit cost at least $1.3 million or so.
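The back-of-the-envelope math here is simple; a quick sketch (salary figure as reported in the article):

```python
annual_salary = 185_100        # Comey's reported FBI Director salary
remaining_tenure = 7 + 4 / 12  # "seven years and four months", in years

# The exploit cost at least what Comey will earn over his remaining tenure.
floor_price = annual_salary * remaining_tenure
print(round(floor_price))  # 1357400 -> "at least $1.3 million or so"
```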
This has, understandably, caused some to ask how it could possibly be worth it to pay so much money for an exploit that everyone must have known was worthless.
It would have been more responsible to give the FBI’s slush fund over to the victims’ families than to pursue such an obvious non-lead.
Of course, that is taking a slightly narrow view on things, considering that many people believe, strongly, that the FBI's motive here was really to extricate itself from the legal dispute over the phone that had the very strong potential of ending with a bad precedent for the FBI and the DOJ. When looked at through that lens, $1.3 million or whatever seems like very little money to pay...
When you testify before Congress, it helps to actually have some knowledge of what you're talking about. On Tuesday, the House Energy & Commerce Committee held the latest congressional hearing on the whole silly encryption fight, entitled Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives. And, indeed, they did have witnesses presenting "industry" and "law enforcement" views, but for unclear reasons decided to separate them. First up were three "law enforcement" panelists, who were free to say whatever the hell they wanted with no one pointing out that they were spewing pure bullshit. You can watch the whole thing below (while it says it's 4 hours, it doesn't actually start until about 45 minutes in):
Lots of craziness was stated -- starting with the idea pushed by both chief of intelligence for the NYPD, Thomas Galati and the commander of the office of intelligence for the Indiana State Police, Charles Cohen -- that the way to deal with non-US or open source encryption was just to ban it from app stores. This is a real suggestion that was just made before Congress by two (?!?) separate law enforcement officials. Rep. Morgan Griffith rightly pointed out that so many encryption products couldn't possibly be regulated by US law, and asked the panelists what to do about it. You can watch the exchange here:
You see Cohen ridiculously claim that since Apple and Google are gatekeepers to apps, that the government could just ban foreign encryption apps from being in the app stores:
Right now Google and Apple act as the gatekeepers for most of those encrypted apps, meaning if the app is not available on the App Store for an iOS device, if the app is not available on Google Play for an Android device, a customer of the United States cannot install it. So while some of the encrypted apps, like Telegram, are based outside the United States, US companies act as gatekeepers as to whether those apps are accessible here in the United States to be used.
This is just wrong. It's ignorant and clueless and for a law enforcement official -- let alone one who is apparently the "commander of the office of intelligence" -- to not know that this is wrong is just astounding. Yes, on Apple phones it's more difficult to get apps onto a phone, but it's not impossible. On Android, however, it's easy. There are tons of alternative app stores, and part of the promise of the Android ecosystem is that you're not locked into Google's own app store. And, really, is Cohen literally saying that Apple and Google should be told they cannot allow Telegram -- one of the most popular apps in the world -- in their app stores? Really?
Galati then agreed with him and piled on with more ignorance:
I agree with what the Captain said. Certain apps are not available on all devices. So if the companies that are outside the United States can't comply with same rules and regulations of the ones that are in the United States, then they shouldn't be available on the app stores. For example, you can't get every app on a Blackberry that you can on an Android or a Google.
Leaving aside the fact he said "Android or a Google" (and just assuming he meant iPhone for one of those)... what?!? The reason you can't get every app on a BlackBerry that's on other devices has nothing to do with any of this at all. It's because the market for BlackBerry devices is tiny, so developers don't develop for the BlackBerry ecosystem (and, of course, some BlackBerries now use Android anyway, so...). That comment by Galati makes no sense at all. Using the fact that fewer developers develop for BlackBerry says nothing about blocking foreign encryption apps from Android or iOS ecosystems. It makes no sense.
Why are these people testifying before Congress when they don't appear to know what they're talking about?
Later in the hearing, when questioned by Rep. Paul Tonko about how other countries (especially authoritarian regimes) might view a US law demanding backdoors as an opportunity to demand the same levels of access, Cohen speculated ridiculously, wildly and falsely that he'd heard that Apple gave China its source code:
Here's what Cohen says:
In preparing for the testimony, I saw several news stories that said that Apple provided the source code for iOS to China, as an example. I don't know whether those stories are true or not.
Yeah, because they're not. He then goes on to say that Apple has never said under oath whether or not that's true -- except, just a little while later, on the second panel, Apple's General Counsel Bruce Sewell made it quite clear that they have never given China its source code. Either way, Cohen follows it up by saying that Apple won't give US law enforcement its source code, as if to imply that Apple is somehow more willing to help the Chinese government hack into phones than the US government. Again, this is just blatant false propaganda. And yet here is someone testifying before Congress and claiming that it might be true.
Thankfully, at the end of the hearing, Rep. Anna Eshoo -- who isn't even a member of the subcommittee holding the hearing (though she is a top member of the larger committee) -- joined in and quizzed Cohen about his bizarre claims:
She notes that it's a huge allegation to make without any factual evidence, and asks if he has anything to go on beyond just general "news reports." Not surprisingly, he does not.
Elsewhere in the hearing, Cohen also insists that a dual-key solution would work. He says this with 100% confidence -- that if Apple and law enforcement had a shared key, it would be "just like a safety deposit box." Of course, this is also just wrong. As has been shown for decades, when you set up a two-key solution, you're introducing vulnerabilities into the system that almost certainly let in others as well.
And then, after that, Rep. Jerry McNerney raises the point -- highlighted by many others in the past -- that rather than "going dark," law enforcement is in the golden age of surveillance and investigation thanks to more and new information, including that provided by mobile phones (such as location data, metadata on contacts and more). Cohen, somewhat astoundingly, claims he can't think of any new information that's now available thanks to mobile phones:
Sir, I'm having problems thinking of an example of information that's available now that was not before. From my perspective, thinking through investigations that we previously had information for, when you combine the encryption issue along with shorter and shorter retention periods, in a service provider, meaning they're keeping their records, for both data and metadata, for a shorter period of time, available to legal process. I'm having difficulty finding an example of an avenue that was not available before.
Huh?!? He can't think of things like location info from mobile phones? He can't think of things like metadata and data around unencrypted texts? He can't think of things like unencrypted and available information from apps? Then why is he on this panel? And the issue of data retention? Was he just told before the hearing to make a point to push for mandatory data retention and decided to throw in a nod to it here?
At least Galati, who went after him, was willing to admit that tech has provided a lot more information than in the past -- but then claimed that encryption was "eliminating those gains."
Cohen is really the clown at the show here. He also claims that Apple somehow decided to throw away its key and that it was "solving a problem that doesn't exist" in adding encryption:
There he's being asked by Rep. Yvette Clarke if he sees any technical solutions to the encryption issue, and he says:
The solution that we had in place previously, in which Apple did hold a key. And as Chief Galati mentioned, that was never compromised. So they could comply with a proper service of legal process. Essentially, what happened is that Apple solved a problem that does not exist.
Again, this is astoundingly ignorant. The problem before was that there was no key. It wasn't that Apple had the key, it's that the data was readily available to anyone who had access to the phone. That put everyone's information at risk. It's why there was so much concern about stolen phones and why stolen phones were so valuable. For a law enforcement official to not realize that and not think it was a real problem is... astounding. And, again, raises the question of why this guy is testifying before Congress.
It also raises the question of why Congress put him on a panel with no experts around to correct his many, many errors. At the very least, towards the beginning of the second panel, Apple GC Sewell explained how Cohen was just flat out wrong on these points:
If you can't see that, after his prepared remarks, Sewell directly addresses Cohen's claims:
That's where I was going to conclude my comments. But I think I owe it to this committee to add one additional thought. And I want to be very clear on this: We have not provided source code to the Chinese government. We did not have a key 19 months ago that we threw away. We have not announced that we are going to apply passcode encryption to the next generation iCloud. I just want to be very clear on that because we heard three allegations. Those allegations have no merit.
A few minutes later, he's asked directly about this and whether or not the Chinese had asked for the source code, and Sewell says that, yes, the Chinese have asked, and Apple has refused to give it to them:
Seems like they could have killed 3 hours of ignorant arguments presented to Congress, if they had just not allowed such ignorance to be spewed earlier on.
As we've discussed at length, there are multiple cases going on right now in which the US Justice Department is looking to compel Apple to help access encrypted information on iPhones. There was lots of attention paid to the one in San Bernardino, around Syed Farook's work iPhone, but that case is now over. The one getting almost but not quite as much attention is the one happening across the country in NY, where magistrate judge James Orenstein ruled against the DOJ a little over a month ago, with a very detailed explanation for why the All Writs Act clearly did not apply. The DOJ, not surprisingly, appealed that ruling (technically made a "renewed application" rather than an appeal) to an Article III judge and the case was assigned to judge Margo Brodie.
Apple has now filed its argument against the DOJ, making a variety of points, but hitting hard on the idea that the DOJ is flat out lying in now claiming that Apple's assistance in unlocking this phone is "necessary." As we've noted, the end result of the San Bernardino case, where the FBI eventually "figured out" how to get into the phone, raises questions about whether it truly exhausted all possibilities in this case -- which involves a newer phone, but an older operating system.
... the record is devoid of evidence that Apple’s assistance is necessary—and remains so even after a similar claim of necessity was proven untrue in a recent proceeding in California. Indeed, in its original application to Judge Orenstein, the government acknowledged that it sought Apple’s help to spare the government from having to expend... The government has made no showing that it has exhausted alternative means for extracting data from the iPhone at issue here, either by making a serious attempt to obtain the passcode from the individual defendant who set it in the first place—nor to obtain passcode hints or other helpful information from the defendant—or by consulting other government agencies and third parties known to the government. Indeed, the government has gone so far as to claim that it has no obligation to do so... notwithstanding media reports that suggest that companies already offer commercial solutions capable of accessing data from phones running iOS 7, which is nearly three years old.
And, of course, Apple suggests (as it has all along) that the DOJ is totally misreading and/or misrepresenting the All Writs Act:
The government would have this Court believe that the All Writs Act, first enacted in 1789, is a boundless grant of authority that permits courts to enter any order the government seeks—including orders conscripting private third parties into providing whatever assistance law enforcement deems appropriate—as long as Congress has not expressly prohibited its issuance. DE 30 at 18. But that characterization of the All Writs Act turns our system of limited government on its head. It simply is not the case that federal courts can issue any order the executive branch dreams up unless and until Congress expressly prohibits it. That construction of the All Writs Act has it exactly backwards. If the government’s view is correct, Congress would never need to pass permissive legislation in the law enforcement context because everything would be on the table until explicitly prohibited. That may be what the government prefers, but it is not the legal system in which it operates.
The company also questions whether it's really necessary for the government to get into this phone at all, given that the defendant in the case, Jun Feng, has already pled guilty and the phone hasn't been used in years. Also, the government didn't even seek a warrant to get into the phone until over a year after seizing it.
Apple also raises some procedural concerns. As noted above, the government just asked for a new judge to review, rather than doing an official appeal, and Apple points out that it's doing this to try to avoid certain standards:
In its papers, the government takes great pains to characterize its brief as a renewed application rather than an appeal from Judge Orenstein’s order, presumably to bolster its contention that Judge Orenstein’s order should be reviewed de novo.... In doing so, the government attempts to obscure the fact that this matter was extensively briefed, a hearing was held, supplemental briefing was provided, and Judge Orenstein issued a 50-page order. Moreover, the government’s insistence that it is entitled to a do-over is belied by Federal Rule of Criminal Procedure 59 and Section 636 of the Federal Magistrates Act.
One of the key points made by the DOJ in its filing in this case was that Apple had been fine with previous such All Writs Act orders on phones running iOS 7, where it does have more access to information. But Apple notes that the details of this case are different in important ways: this is the first case where the judge specifically brought Apple into court, rather than ruling without Apple being involved at all (i.e. "ex parte").
To be sure, courts have previously issued ex parte orders directing Apple to “assist in extracting data from an Apple device through bypassing the passcode in order to execute a search warrant.” But the government’s cited orders were issued ex parte, without Apple’s participation, without the benefit of adversarial briefing on the scope of the All Writs Act, and with no supporting analysis. Apple also was not a party in United States v. Blake, No. 13-CR-80054 (S.D. Fl. July 14, 2014), in which the court denied the defendant’s motion to suppress evidence gathered from an iPhone that Apple helped unlock. Accordingly, such cases are not even persuasive authority on the scope of the All Writs Act, let alone precedential; certainly such ex parte orders issued with little analysis should carry less weight than Judge Orenstein’s lengthy and reasoned opinion.
Most of the other arguments cover things discussed earlier, around why the All Writs Act doesn't apply and why CALEA covers this situation and does not require Apple to assist.
So, while the San Bernardino case may be over, the NY case is still raging. I imagine the DOJ's next filing will be... interesting as well.
Bergstein glosses over the security implications of requiring phone manufacturers to hold the decryption keys for devices and services and instead presents his argument as an appeal to emotion. Those on Apple's side -- including Apple CEO Tim Cook -- are given only the briefest of nods before alarmists like Manhattan District Attorney Cy Vance are given the stage.
Bergstein does at least ask an interesting question: what if exonerating evidence is locked up in a phone? But his test case for "What if Apple is wrong?" doesn't apply as well as he seems to hope it does.
Devon Godfrey was killed in his apartment in 2010 -- and police arrested the wrong person. Somehow, Bergstein wants to pin the police's screwup on Apple. Investigators had only a week to pull evidence together to present to a grand jury. Some of that evidence happened to be located on a passcode-locked iPhone. But the evidence ultimately compiled and used had nearly nothing to do with that locked phone.
Cell phones had been found in Godfrey’s apartment, including an iPhone that was locked by its passcode. Arnold recalls doing what he always did in homicides back then: he obtained a search warrant for the phone and put a detective on a plane to Cupertino, California. The detective would wait in Apple’s headquarters and return with the data Arnold needed. Meanwhile, investigators looked more closely at the apartment building’s surveillance video, and Arnold examined records sent by Godfrey’s wireless carrier of when calls and texts were last made on the phones.
With this new evidence in hand, the case suddenly looked quite different. From the wireless carrier, Arnold saw that someone—presumably Godfrey—had sent a text from the iPhone at a certain time. But the recipient of that text had used a disposable “burner” phone not registered under a true name. So who was it? The iPhone itself had the crucial clue. Arnold could see that Godfrey referred to the person by a nickname. People who knew Godfrey helped police identify the man who went by that nickname. It was not the man who was originally arrested. It was Rafael Rosario—who also appeared in the apartment surveillance footage. Rosario confessed and later pleaded guilty.
“Today, Apple encrypts the iCloud but decrypts it in response to court orders,” he said. “So are they materially insecure because of that?”
Comey later reiterated this point, saying, “I see Apple today encrypting the iCloud and decrypting it in response to court orders. Is there a hole in their code?”
The frequency of the backups will vary from person to person, but this still gives investigators access to plenty of information supposedly "stored" in an uncrackable phone.
From there, the argument against Apple only gets worse, as the arguments themselves are sourced from the sort of people who'd rather see insecure devices than face obstacles when prosecuting suspects. Cy Vance, of course, has argued for outright encryption bans.
Vance also loves a good appeal to emotion.
Vance makes no dramatic claims about “going dark,” preferring a measured, lawyerly form of argument. When I tell him that his statistics on inaccessible iPhones don’t yet impress many computer scientists, he makes a facial expression equivalent to a shrug. “Some people have made the determination that not being able to do the kinds of work we do is an acceptable collateral damage,” he says. “I’m not sure how the individual would respond if someone close to him or her were the victim of a crime and the case might depend on the ability to access a phone. Easy to say, unless it’s you. We deal with a lot of victims. We talk to the people it’s actually happened to.”
The assumption is that everyone loves locking cops out of phones until they're a crime victim -- an assertion just as false as Comey's exaggerated laments about "going dark." Even in the most famous case involving a locked iPhone -- one that involved an apparent act of terrorism manifesting itself as a mass shooting -- the relatives of victims were far from unanimous in their support of the FBI's efforts. Two people who lost close relations in the shooting -- including a mother who lost her son -- spoke out against the FBI's efforts to undermine cell phone security.
Her son was killed in the San Bernardino, Calif., massacre — but Carole Adams agrees with Apple that personal privacy trumps the feds’ demands for new software to break into iPhones, including the phone of her son’s killer.
The mom of Robert Adams — a 40-year-old environmental health specialist who was shot dead by Syed Rizwan Farook and his wife — told The Post on Thursday that the constitutional right to privacy “is what makes America great to begin with.”
Then there's the belief -- offered by Vance, Comey and others -- that law enforcement should have access to communications simply because it has a warrant. But what isn't acknowledged is that this is unprecedented access. Texting/messaging has largely replaced telephone calls and face-to-face conversations.
Prior to the advent of texting, these conversations could not have been recorded without a wiretap warrant, which is a last resort effort that has to be carried out in real time. What law enforcement has access to now -- if not walled off by encryption -- are hundreds or thousands of conversations it never would have had access to before, even with a search warrant, which does not cover the interception of communications. And it's a technique that would be almost completely useless to investigators after a criminal act like a murder has been committed. The fact that a murder victim had a phone in the house would have prompted detectives to look at call records -- something they can still do without breaking a phone's encryption. What was said during those phone calls would still remain a mystery, warrant or no. So, law enforcement isn't as far behind technology as it likes to pretend it is.
Bergstein and Lawfare's Susan Hennessey (whom Bergstein quotes) both claim a corporation can't possibly decide what's best for Americans.
So is Apple ultimately fighting to uphold personal privacy and civil liberties? Or is it fighting for the right to sell any kind of phone it thinks its customers want while other people deal with the negative consequences? If it’s the latter, that’s understandable; like any public company, Apple is obligated to maximize its value to its shareholders. But society is not necessarily best served by letting Apple make whatever phones are optimal for its chosen business strategy, which is to create a shiny mobile vault that people will trust with every aspect of their lives.
But somehow they both feel it's perfectly acceptable for another party with a vested interest in total access to make that same decision for Americans.
from the shame-that-'one-size-fits-all-writs'-thing-didn't-work-out... dept
Some of the other iPhones the FBI tried to pretend weren't going to be the beneficiaries of a precedential All Writs order are apparently not even the beneficiaries of the agency's Break Into an iPhone Using This One Simple Trick! anticlimax in the San Bernardino case.
The Massachusetts case is unique because it's the first of its kind involving a newer model iPhone—an iPhone 6 Plus running iOS 9.1—that likely can not be unlocked using the mysterious method the government wound up using on the older iPhone 5c of Syed Farook, one of the San Bernardino shooters. In addition to security features that automatically wipe the device after 10 passcode attempts, newer models including the iPhone 6 and up have a hardware-backed security feature called Secure Enclave, which makes breaking into the devices significantly harder.
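The retry-limit behavior described above is conceptually simple, even if the real enforcement happens in dedicated hardware. As a rough illustration only (the class and method names here are hypothetical, and actual iPhones enforce this inside the Secure Enclave with escalating delays between attempts), the wipe-after-ten-failures logic amounts to a counter guarding the encryption keys:

```python
class PasscodeGuard:
    """Toy model of a wipe-after-N-failures passcode check.

    Illustrative sketch only -- not Apple's implementation. On a real
    device, exceeding the limit discards the encryption keys, leaving
    the stored ciphertext permanently unreadable.
    """

    MAX_ATTEMPTS = 10  # failures allowed before the keys are destroyed

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # keys are gone; no guess can ever succeed
        if guess == self._passcode:
            self._failures = 0  # correct guess resets the counter
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            # Simulate key destruction: a 4-digit PIN has 10,000
            # possibilities, so a 10-attempt budget makes brute force
            # hopeless without first bypassing this mechanism.
            self._passcode = None
            self.wiped = True
        return False
```

This is why the reported San Bernardino technique had to work around the attempt limit rather than simply guessing PINs, and why Secure Enclave devices, which enforce the counter in hardware, are considered significantly harder to break.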
Thus, the case appears to have entered legal limbo, both because the government has failed to respond to Apple’s refusal and because Apple has no way of accessing the phone’s data anyway.
The order set forth by the magistrate judge is unique in that it compels Apple to turn over whatever data it recovers from the phone but does not demand the data be decrypted. Nor has Apple been ordered to assist in the decryption process. All of that ultimately doesn't matter if Apple can't access the data in the first place, hence the stalemate and apparent abandonment.
The drug dealer had an iPhone 5C running iOS 7 software, while the San Bernardino shooter was using an iPhone 5C running iOS 9, a later version of Apple's operating system.
"The government continues to require Apple's assistance in accessing the data that it is authorized to search by warrant," wrote Capers.
Whatever the exploit is that works with this narrow band of phones, Apple has yet to learn the details. The FBI has shared it with the Senate Intelligence Committee, which means privacy champions like Dianne Feinstein possibly have more info on this security flaw than Apple does. Apple, however, has stated it will not seek to legally compel the FBI to turn over details on the exploit -- which is incredibly gentlemanly considering the FBI has done little else lately but seek to compel Apple to perform all sorts of work for it.
What has been made painfully apparent to me for nearly the past decade in this field is that keeping an exploit secret is not possible, no matter how good an agency or corporation may be at keeping secrets – because an exploit is merely a dotted line on a blueprint. Mere knowledge of the general parameters of a vulnerability – even just the details of the device’s condition in this case – has been enough for security researchers to know exactly what security boundaries to start looking at, and they can do so now with the confidence that there is a known, exploitable vulnerability. One does not need to steal any exploit code in order to take advantage of a vulnerability; they only need to find the vulnerability; the way in already exists until it is closed.
Given that it’s only a matter of time before a criminal finds the blueprint to this vulnerability, I urge you to consider briefing Apple of the tool and techniques used to access Syed Farook’s device. While the part of the tool that brute forces a PIN does not seem to work on newer devices, the locks that it picks in order to get past the front door most certainly can be vulnerabilities that carry over into newer devices. Depending on the nature of these components of the solution, criminals or nation states could take advantage of them to install malware, spyware, ransomware, or to infect a target by other means. Individual components of this tool may be very dangerous to millions of Americans, even if the solution as a whole is not viable.
Not that the FBI will be swayed by the words of a highly-respected iPhone forensics expert. It tuned out security researchers during its quest for alternate unlocking methods, and it likely couldn't care less who else gets in as long as law enforcement agencies get in first.