On Thursday, FBI Director James Comey suggested that the FBI paid over a million dollars to a group of hackers who helped it get into Syed Farook's encrypted work iPhone. Of course, just as pretty much everyone predicted, the FBI found nothing of value on the iPhone. This was hardly a surprise. It was a case where we already knew who did it, and that the perpetrators were already dead. We also knew that they had destroyed their two personal iPhones, leaving open the question of why anyone would think there was anything valuable on the work iPhone.
Specifically, Comey said that buying the exploit from this group cost the FBI "more than I will make in the remainder of this job, which is seven years and four months, for sure." Comey makes $185,100 per year at his job, implying that buying the exploit cost at least $1.3 million or so.
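Comey's hint translates into a concrete lower bound with some quick arithmetic. Here's a back-of-the-envelope sketch using the salary figure cited above:

```python
# Rough lower bound on the exploit's price implied by Comey's remark
# that it cost more than his remaining salary (figures from the article).
annual_salary = 185_100        # Comey's reported yearly pay, in dollars
years_remaining = 7 + 4 / 12   # "seven years and four months"

lower_bound = annual_salary * years_remaining
print(f"Implied minimum price: ${lower_bound:,.0f}")  # → $1,357,400
```

That roughly $1.36 million figure lines up with the article's "$1.3 million or so" estimate.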
This has, understandably, caused some to ask how it could possibly be worth it to pay so much money for an exploit that everyone must have known was worthless.
It would have been more responsible to give the FBI’s slush fund over to the victims’ families than to pursue such an obvious non-lead.
Of course, that is taking a slightly narrow view on things, considering that many people believe, strongly, that the FBI's motive here was really to extricate itself from the legal dispute over the phone that had the very strong potential of ending with a bad precedent for the FBI and the DOJ. When looked at through that lens, $1.3 million or whatever seems like very little money to pay...
When you testify before Congress, it helps to actually have some knowledge of what you're talking about. On Tuesday, the House Energy & Commerce Committee held the latest congressional hearing on the whole silly encryption fight, entitled Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives. And, indeed, they did have witnesses presenting "industry" and "law enforcement" views, but for unclear reasons decided to separate them. First up were three "law enforcement" panelists, who were free to say whatever the hell they wanted with no one pointing out that they were spewing pure bullshit. You can watch the whole thing below (while it says it's 4 hours, it doesn't actually start until about 45 minutes in):
Lots of craziness was stated -- starting with the idea pushed by both the NYPD's chief of intelligence, Thomas Galati, and the commander of the Indiana State Police's office of intelligence, Charles Cohen -- that the way to deal with non-US or open source encryption was just to ban it from app stores. This is a real suggestion that was just made before Congress by two (?!?) separate law enforcement officials. Rep. Morgan Griffith rightly pointed out that so many encryption products couldn't possibly be regulated by US law, and asked the panelists what to do about it. You can watch the exchange here:
You see Cohen ridiculously claim that since Apple and Google are gatekeepers to apps, the government could just ban foreign encryption apps from being in the app stores:
Right now Google and Apple act as the gatekeepers for most of those encrypted apps, meaning if the app is not available on the App Store for an iOS device, if the app is not available on Google Play for an Android device, a customer of the United States cannot install it. So while some of the encrypted apps, like Telegram, are based outside the United States, US companies act as gatekeepers as to whether those apps are accessible here in the United States to be used.
This is just wrong. It's ignorant and clueless and for a law enforcement official -- let alone one who is apparently the "commander of the office of intelligence" -- to not know that this is wrong is just astounding. Yes, on Apple phones it's more difficult to get apps onto a phone, but it's not impossible. On Android, however, it's easy. There are tons of alternative app stores, and part of the promise of the Android ecosystem is that you're not locked into Google's own app store. And, really, is Cohen literally saying that Apple and Google should be told they cannot allow Telegram -- one of the most popular apps in the world -- in their app stores? Really?
Galati then agreed with him and piled on with more ignorance:
I agree with what the Captain said. Certain apps are not available on all devices. So if the companies that are outside the United States can't comply with same rules and regulations of the ones that are in the United States, then they shouldn't be available on the app stores. For example, you can't get every app on a Blackberry that you can on an Android or a Google.
Leaving aside the fact that he said "Android or a Google" (and just assuming he meant iPhone for one of those)... what?!? The reason you can't get every app on a BlackBerry that's on other devices has nothing to do with any of this. It's because the market for BlackBerry devices is tiny, so developers don't develop for the BlackBerry ecosystem (and, of course, some BlackBerries now use Android anyway, so...). Galati's comment makes no sense at all: the fact that fewer developers build for BlackBerry says nothing about blocking foreign encryption apps from the Android or iOS ecosystems.
Why are these people testifying before Congress when they don't appear to know what they're talking about?
Later in the hearing, when questioned by Rep. Paul Tonko about how other countries (especially authoritarian regimes) might view a US law demanding backdoors as an opportunity to demand the same levels of access, Cohen speculated ridiculously, wildly and falsely that he'd heard that Apple gave China its source code:
Here's what Cohen says:
In preparing for the testimony, I saw several news stories that said that Apple provided the source code for iOS to China, as an example. I don't know whether those stories are true or not.
Yeah, because they're not. He then goes on to say that Apple has never said under oath whether or not that's true -- except, just a little while later, on the second panel, Apple's General Counsel Bruce Sewell made it quite clear that Apple has never given China its source code. Either way, Cohen follows it up by saying that Apple won't give US law enforcement its source code, as if to imply that Apple is somehow more willing to help the Chinese government hack into phones than the US government. Again, this is just blatant false propaganda. And yet here is someone testifying before Congress and claiming that it might be true.
Thankfully, at the end of the hearing, Rep. Anna Eshoo -- who isn't even a member of the subcommittee holding the hearing (though she is a top member of the larger committee) -- joined in and quizzed Cohen about his bizarre claims:
She notes that it's a huge allegation to make without any factual evidence, and asks if he has anything to go on beyond just general "news reports." Not surprisingly, he does not.
Elsewhere in the hearing, Cohen also insists that a dual-key solution would work. He says this with 100% confidence -- that if Apple and law enforcement had a shared key it would be "just like a safety deposit box." Of course, this is also just wrong. As has been shown for decades, when you set up a two-key solution, you're introducing vulnerabilities into the system that almost certainly let in others as well.
Then Rep. Jerry McNerney raises the point -- highlighted by many others in the past -- that rather than "going dark," law enforcement is in the golden age of surveillance and investigation thanks to more and newer information, including that provided by mobile phones (such as location data, metadata on contacts and more). Cohen, somewhat astoundingly, claims he can't think of any new information that's now available thanks to mobile phones:
Sir, I'm having problems thinking of an example of information that's available now that was not before. From my perspective, thinking through investigations that we previously had information for, when you combine the encryption issue along with shorter and shorter retention periods, in a service provider, meaning they're keeping their records, for both data and metadata, for a shorter period of time, available to legal process. I'm having difficulty finding an example of an avenue that was not available before.
Huh?!? He can't think of things like location info from mobile phones? He can't think of things like metadata and data around unencrypted texts? He can't think of things like unencrypted and available information from apps? Then why is he on this panel? And the issue of data retention? Was he just told before the hearing to make a point to push for mandatory data retention and decided to throw in a nod to it here?
At least Galati, who spoke after him, was willing to admit that tech has provided a lot more information than in the past -- but then claimed that encryption was "eliminating those gains."
Cohen is really the clown at the show here. He also claims that Apple somehow decided to throw away its key and that it was "solving a problem that doesn't exist" in adding encryption:
There he's being asked by Rep. Yvette Clarke if he sees any technical solutions to the encryption issue, and he says:
The solution that we had in place previously, in which Apple did hold a key. And as Chief Galati mentioned, that was never compromised. So they could comply with a proper service of legal process. Essentially, what happened is that Apple solved a problem that does not exist.
Again, this is astoundingly ignorant. The problem before was that there was no key. It wasn't that Apple had the key, it's that the data was readily available to anyone who had access to the phone. That put everyone's information at risk. It's why there was so much concern about stolen phones and why stolen phones were so valuable. For a law enforcement official to not realize that and not think it was a real problem is... astounding. And, again, raises the question of why this guy is testifying before Congress.
It also raises the question of why Congress put him on a panel with no experts around to correct his many, many errors. At the very least, towards the beginning of the second panel, Apple GC Sewell explained how Cohen was just flat out wrong on these points:
If you can't see that, after his prepared remarks, Sewell directly addresses Cohen's claims:
That's where I was going to conclude my comments. But I think I owe it to this committee to add one additional thought. And I want to be very clear on this: We have not provided source code to the Chinese government. We did not have a key 19 months ago that we threw away. We have not announced that we are going to apply passcode encryption to the next generation iCloud. I just want to be very clear on that because we heard three allegations. Those allegations have no merit.
A few minutes later, he's asked directly about this and whether or not the Chinese had asked for the source code, and Sewell says that, yes, the Chinese have asked, and Apple has refused to give it to them:
Seems like Congress could have spared everyone three hours of ignorant arguments if it simply hadn't allowed such ignorance to be spewed unchallenged earlier on.
As we've discussed at length, there are multiple cases going on right now in which the US Justice Department is looking to compel Apple to help access encrypted information on iPhones. There was lots of attention paid to the one in San Bernardino, around Syed Farook's work iPhone, but that case is now over. The one getting almost but not quite as much attention is the one happening across the country in NY, where magistrate judge James Orenstein ruled against the DOJ a little over a month ago, with a very detailed explanation for why the All Writs Act clearly did not apply. The DOJ, not surprisingly, appealed that ruling (technically made a "renewed application" rather than an appeal) to an Article III judge and the case was assigned to judge Margo Brodie.
Apple has now filed its argument against the DOJ, making a variety of points, but hitting hard on the idea that the DOJ is flat out lying in now claiming that Apple's assistance in unlocking this phone is "necessary." As we've noted, the end result of the San Bernardino case, where the FBI eventually "figured out" how to get into the phone, raises questions about whether it truly exhausted all possibilities in this case -- which involves a newer phone, but an older operating system.
... the record is devoid of evidence that Apple’s assistance is necessary—and remains so even after a similar claim of necessity was proven untrue in a recent proceeding in California. Indeed, in its original application to Judge Orenstein, the government acknowledged that it sought Apple’s help to spare the government from having to expend...

The government has made no showing that it has exhausted alternative means for extracting data from the iPhone at issue here, either by making a serious attempt to obtain the passcode from the individual defendant who set it in the first place—nor to obtain passcode hints or other helpful information from the defendant—or by consulting other government agencies and third parties known to the government. Indeed, the government has gone so far as to claim that it has no obligation to do so... notwithstanding media reports that suggest that companies already offer commercial solutions capable of accessing data from phones running iOS 7, which is nearly three years old.
And, of course, Apple suggests (as it has all along) that the DOJ is totally misreading and/or misrepresenting the All Writs Act:
The government would have this Court believe that the All Writs Act, first enacted in 1789, is a boundless grant of authority that permits courts to enter any order the government seeks—including orders conscripting private third parties into providing whatever assistance law enforcement deems appropriate—as long as Congress has not expressly prohibited its issuance. DE 30 at 18. But that characterization of the All Writs Act turns our system of limited government on its head. It simply is not the case that federal courts can issue any order the executive branch dreams up unless and until Congress expressly prohibits it. That construction of the All Writs Act has it exactly backwards. If the government’s view is correct, Congress would never need to pass permissive legislation in the law enforcement context because everything would be on the table until explicitly prohibited. That may be what the government prefers, but it is not the legal system in which it operates.
The company also questions whether or not it's really necessary for the government to get into this phone, given that the defendant in the case, Jun Feng, has already pled guilty and the phone hasn't been used in years. Also, the government didn't even seek a warrant to get into the phone for over a year after seizing it.
Apple also raises some procedural concerns. As noted above, the government just asked for a new judge to review, rather than doing an official appeal, and Apple points out that it's doing this to try to avoid certain standards:
In its papers, the government takes great pains to characterize its brief as a renewed application rather than an appeal from Judge Orenstein’s order, presumably to bolster its contention that Judge Orenstein’s order should be reviewed de novo.... In doing so, the government attempts to obscure the fact that this matter was extensively briefed, a hearing was held, supplemental briefing was provided, and Judge Orenstein issued a 50-page order. Moreover, the government’s insistence that it is entitled to a do-over is belied by Federal Rule of Criminal Procedure 59 and Section 636 of the Federal Magistrates Act.
One of the key points made by the DOJ in its filing in this case was that Apple had been fine with previous such All Writs Act orders on phones running iOS 7, where it does have more access to information. But Apple notes that the details of this case are different in important ways: this is the first case where the judge specifically brought Apple into court, rather than ruling without Apple being involved at all (i.e. "ex parte").
To be sure, courts have previously issued ex parte orders directing Apple to “assist in extracting data from an Apple device through bypassing the passcode in order to execute a search warrant.” But the government’s cited orders were issued ex parte, without Apple’s participation, without the benefit of adversarial briefing on the scope of the All Writs Act, and with no supporting analysis. Apple also was not a party in United States v. Blake, No. 13-CR-80054 (S.D. Fl. July 14, 2014), in which the court denied the defendant’s motion to suppress evidence gathered from an iPhone that Apple helped unlock. Accordingly, such cases are not even persuasive authority on the scope of the All Writs Act, let alone precedential; certainly such ex parte orders issued with little analysis should carry less weight than Judge Orenstein’s lengthy and reasoned opinion.
Most of the other arguments cover things discussed earlier, around why the All Writs Act doesn't apply and why CALEA covers this situation and does not require Apple to assist.
So, while the San Bernardino case may be over, the NY case is still raging. I imagine the DOJ's next filing will be... interesting as well.
Bergstein glosses over the security implications of requiring phone manufacturers to hold the decryption keys for devices and services and instead presents his argument as an appeal to emotion. Those on Apple's side -- including Apple CEO Tim Cook -- are given only the briefest of nods before alarmists like Manhattan District Attorney Cy Vance are given the stage.
Bergstein does at least ask an interesting question: what if exonerating evidence is locked up in a phone? But his test case for "What if Apple is wrong?" doesn't apply as well as he seems to hope it does.
Devon Godfrey was killed in his apartment in 2010 -- and police arrested the wrong person. Somehow, Bergstein wants to blame the police's screw-up on Apple. Investigators had only a week to pull evidence together to present to a grand jury. Some of that evidence happened to be located on a passcode-locked iPhone. But the evidence ultimately compiled and used has nearly nothing to do with that locked phone.
Cell phones had been found in Godfrey’s apartment, including an iPhone that was locked by its passcode. Arnold recalls doing what he always did in homicides back then: he obtained a search warrant for the phone and put a detective on a plane to Cupertino, California. The detective would wait in Apple’s headquarters and return with the data Arnold needed. Meanwhile, investigators looked more closely at the apartment building’s surveillance video, and Arnold examined records sent by Godfrey’s wireless carrier of when calls and texts were last made on the phones.
With this new evidence in hand, the case suddenly looked quite different. From the wireless carrier, Arnold saw that someone—presumably Godfrey—had sent a text from the iPhone at a certain time. But the recipient of that text had used a disposable “burner” phone not registered under a true name. So who was it? The iPhone itself had the crucial clue. Arnold could see that Godfrey referred to the person by a nickname. People who knew Godfrey helped police identify the man who went by that nickname. It was not the man who was originally arrested. It was Rafael Rosario—who also appeared in the apartment surveillance footage. Rosario confessed and later pleaded guilty.
“Today, Apple encrypts the iCloud but decrypts it in response to court orders,” he said. “So are they materially insecure because of that?”
Comey later reiterated this point, saying, “I see Apple today encrypting the iCloud and decrypting it in response to court orders. Is there a hole in their code?”
The frequency of the backups will vary from person to person, but this still gives investigators access to plenty of information supposedly "stored" in an uncrackable phone.
From there, the argument against Apple only gets worse, as the arguments themselves are sourced from the sort of people who'd rather see insecure devices than face obstacles when prosecuting suspects. Cy Vance, of course, has argued for outright encryption bans.
Vance also loves a good appeal to emotion.
Vance makes no dramatic claims about “going dark,” preferring a measured, lawyerly form of argument. When I tell him that his statistics on inaccessible iPhones don’t yet impress many computer scientists, he makes a facial expression equivalent to a shrug. “Some people have made the determination that not being able to do the kinds of work we do is an acceptable collateral damage,” he says. “I’m not sure how the individual would respond if someone close to him or her were the victim of a crime and the case might depend on the ability to access a phone. Easy to say, unless it’s you. We deal with a lot of victims. We talk to the people it’s actually happened to.”
The assumption is that everyone loves locking cops out of phones until they're a crime victim. But this assertion is just as false as Comey's exaggerated laments about "going dark." Even in the most famous case involving a locked iPhone -- one that involved an apparent act of terrorism manifesting itself as a mass shooting -- the relatives of victims were far from unanimous in their support of the FBI's efforts. Two people who lost close relatives in the shooting -- including a mother who lost her son -- spoke out against the FBI's efforts to undermine cell phone security.
Her son was killed in the San Bernardino, Calif., massacre — but Carole Adams agrees with Apple that personal privacy trumps the feds’ demands for new software to break into iPhones, including the phone of her son’s killer.
The mom of Robert Adams — a 40-year-old environmental health specialist who was shot dead by Syed Rizwan Farook and his wife — told The Post on Thursday that the constitutional right to privacy “is what makes America great to begin with.”
Then there's the belief -- offered by Vance, Comey and others -- that law enforcement should have access to communications simply because they have a warrant. But what isn't acknowledged is that this is unprecedented access. Texting/messaging has largely replaced telephone calls and face-to-face conversations.
Prior to the advent of texting, these conversations could not have been recorded without a wiretap warrant, which is a last-resort effort that has to be carried out in real time. What law enforcement has access to now -- if not walled off by encryption -- are hundreds or thousands of conversations it never would have had access to before, even with a search warrant, which does not cover the interception of communications. And wiretapping would be almost completely useless to investigators after a criminal act like a murder has been committed. The fact that a murder victim had a phone in the house would have prompted detectives to look at call records -- something they can still do without breaking a phone's encryption. What was said during those phone calls would still remain a mystery, warrant or no. So, law enforcement isn't as far behind technology as it likes to pretend it is.
Bergstein and Lawfare's Susan Hennessey (whom he quotes) both claim a corporation can't possibly decide what's best for Americans.
So is Apple ultimately fighting to uphold personal privacy and civil liberties? Or is it fighting for the right to sell any kind of phone it thinks its customers want while other people deal with the negative consequences? If it’s the latter, that’s understandable; like any public company, Apple is obligated to maximize its value to its shareholders. But society is not necessarily best served by letting Apple make whatever phones are optimal for its chosen business strategy, which is to create a shiny mobile vault that people will trust with every aspect of their lives.
But somehow they both feel it's perfectly acceptable for another party with a vested interest in total access to make that same decision for Americans.
from the shame-that-'one-size-fits-all-writs'-thing-didn't-work-out... dept
Some of the other iPhones the FBI tried to pretend weren't going to be the beneficiaries of a precedential All Writs order are apparently not even the beneficiaries of the agency's Break Into an iPhone Using This One Simple Trick! anticlimax in the San Bernardino case.
The Massachusetts case is unique because it's the first of its kind involving a newer model iPhone—an iPhone 6 Plus running iOS 9.1—that likely can not be unlocked using the mysterious method the government wound up using on the older iPhone 5c of Syed Farook, one of the San Bernardino shooters. In addition to security features that automatically wipe the device after 10 passcode attempts, newer models including the iPhone 6 and up have a hardware-backed security feature called Secure Enclave, which makes breaking into the devices significantly harder.
Thus, the case appears to have entered legal limbo, both because the government has failed to respond to Apple’s refusal and because Apple has no way of accessing the phone’s data anyway.
The order set forth by the magistrate judge is unique in that it compels Apple to turn over whatever data it recovers from the phone but does not demand the data be decrypted. Nor has Apple been ordered to assist in the decryption process. All of that ultimately doesn't matter if Apple can't access the data in the first place, hence the stalemate and apparent abandonment.
The drug dealer had an iPhone 5C running iOS 7 software, while the San Bernardino shooter was using an iPhone 5C running iOS 9, a later version of Apple's operating system.
"The government continues to require Apple's assistance in accessing the data that it is authorized to search by warrant," wrote Capers.
Whatever the exploit is that works with this narrow band of phones, Apple has yet to learn the details. The FBI has shared it with the Senate Intelligence Committee, which means privacy champions like Dianne Feinstein possibly have more info on this security flaw than Apple does. Apple, however, has stated it will not seek to legally compel the FBI to turn over details on the exploit -- which is incredibly gentlemanly considering the FBI has done little else lately but seek to compel Apple to perform all sorts of work for it.
What has been made painfully apparent to me for nearly the past decade in this field is that keeping an exploit secret is not possible, no matter how good an agency or corporation may be at keeping secrets – because an exploit is merely a dotted line on a blueprint. Mere knowledge of the general parameters of a vulnerability – even just the details of the device’s condition in this case – has been enough for security researchers to know exactly what security boundaries to start looking at, and they can do so now with the confidence that there is a known, exploitable vulnerability. One does not need to steal any exploit code in order to take advantage of a vulnerability; they only need to find the vulnerability; the way in already exists until it is closed.
Given that it’s only a matter of time before a criminal finds the blueprint to this vulnerability, I urge you to consider briefing Apple of the tool and techniques used to access Syed Farook’s device. While the part of the tool that brute forces a PIN does not seem to work on newer devices, the locks that it picks in order to get past the front door most certainly can be vulnerabilities that carry over into newer devices. Depending on the nature of these components of the solution, criminals or nation states could take advantage of them to install malware, spyware, ransomware, or to infect a target by other means. Individual components of this tool may be very dangerous to millions of Americans, even if the solution as a whole is not viable.
Not that the FBI will be swayed by the words of a highly respected iPhone forensics expert. It tuned out security researchers during its quest for alternate unlocking methods, and it likely couldn't care less who else gets in as long as law enforcement agencies get in first.
Ever since the FBI announced that it had found its own way into Syed Farook's iPhone, people have been wondering exactly how it managed to do so, and how many people the exploit puts at risk. Unsurprisingly, the agency declined to share any details with Apple and tried to downplay the possibility that it'd be breaking into phones left and right — despite pretty quickly entertaining the idea of doing exactly that. Now, following a discussion with Director James Comey last night, we have some more... well... I don't think you can exactly call them "details", but:
"We're having discussions within the government about, okay, so should we tell Apple what the flaw is that was found?" Comey said. "That’s an interesting conversation because you tell Apple and they’re going to fix it and then we’re back where we started from."
Comey said that it is possible that authorities will tell Apple, but "we just haven’t decided yet."
That's an interesting way of putting it. It seems Comey has forgotten "where we started from", because not that long ago he was still insisting that this had nothing to do with setting a precedent or getting into other phones in the future and was all about pursuing every lead in this one case. Well, that lead has now been pursued and the phone in question cracked, so Comey's "back where we started" comment only makes sense if (shocker) this really was about a lot more than one phone.
Comey went on to downplay the applicability of whatever exploit they are using:
While Comey did not disclose the outside group’s method in his remarks Wednesday, he said it would only be useful on a select type of devices — specifically, the iPhone 5C, an older model released more than two years ago.
"The world has moved on to [iPhone] 6’s," Comey said. "This doesn’t work in 6S, this doesn’t work in a 5S. So we have a tool that works on a narrow slice of phones. … I can never be completely confident, but I’m pretty confident about that."
Of course, the 5C still accounts for around 5% of iPhones, which may be a "narrow slice", but that's likely of little comfort to the many people using them who now know their device contains a potential security exploit which the FBI is refusing to protect them from. Because that's the point: if the 5C is hackable, that means a bunch of people are at risk and not just from law enforcement overreach. The right thing to do when you've discovered such a vulnerability is report it so it can be fixed — that's pretty much the dividing line between white hat and black hat hacking. By keeping mum on the details, the FBI is leaving a known security vulnerability in the wild. Oh, but Comey's not worried about that:
Comey did not seem concerned that the method for accessing Farook’s iPhone would be revealed by the outside group that helped them.
"The FBI is very good at keeping secrets, and the people we bought this from, I know a fair amount about them, and I have a high degree of confidence that they are very good at protecting them," he said.
He only identified this group as "someone outside the government" and said "their motivations align with ours."
Firstly, this presupposes that the exploit will never be found by anyone else (and hasn't been already). Secondly, isn't his allusion to the FBI's mysterious assistants a bit unnerving? Yes, there are security researchers who focus on selling what they find to governments and law enforcement agencies when they need to hack something, instead of revealing the vulnerabilities they discover and helping to close them — which many would already see as a problem. But I guess we are supposed to be comforted that the FBI knows a "fair amount" about these non-governmental hackers, and that their "motivations" align (and don't include doing everything possible to help the public secure their devices and keep their data safe). To protect and serve indeed.
Remember how the FBI insisted over and over again that the case in San Bernardino was not about setting a precedent and was totally about getting into "just that one phone?" Of course, no one believed it, but pay close attention to what's happening now that the FBI has hacked into Syed Farook's work iPhone. The DOJ has also said that the crack was limited to just that type of phone and probably wasn't widely applicable. At the same time, however, the Justice Department apparently has no interest in sharing the details of the vulnerability with Apple:
The FBI may be allowed to withhold information about how it broke into an iPhone belonging to a gunman in the December San Bernardino shootings, despite a U.S. government policy of disclosing technology security flaws discovered by federal agencies.
Under the U.S. vulnerabilities equities process, the government is supposed to err in favor of disclosing security issues so companies can devise fixes to protect data. The policy has exceptions for law enforcement, and there are no hard rules about when and how it must be applied.
Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the Federal Bureau of Investigation, which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used to gain access to gunman Syed Farook's phone.
FBI: You should do it, it's just one phone
Apple: No it isn't
FBI: We got in
Apple: You should say how, it's just one phone
FBI: No it isn't
Meanwhile, the DOJ may not be interested in helping Apple patch that hole, but it is apparently at least willing to look into other cases where it can help law enforcement break into locked iPhones. There are some (somewhat conflicting) reports saying that the FBI has agreed to help prosecutors in Arkansas try to get into a couple of iOS devices in a murder case there. Of course, it may not be the same technique or situation (and the FBI might not be able to get in, either).
However, this does show just how eager law enforcement is to get into lots of phones, and how important it is that Apple actually be able to protect its users from those who do not have legitimate reasons to hack into phones. It's too bad that the FBI is apparently choosing to hold onto the info that helps it in a few cases while failing to protect the rest of the public who may use Apple devices.
The questions raised by the DOJ announcing that it was, in fact, able to get in to Syed Farook's work iPhone continue to grow. The latest is that, if it could get into that phone, running the fairly secure iOS 9, why is it still fighting the case in NY where it's trying to get into a drug dealer's phone running iOS 7? As you may recall, the case in NY has been going on for longer than the San Bernardino one. It started back in October when the DOJ demanded Apple's help in getting into the iPhone of Jun Feng (a drug dealer who admitted guilt, but who claims he forgot the passcode) and magistrate judge James Orenstein stepped in to ask Apple if this was a reasonable request.
Then, earlier this month, Orenstein wrote a pretty thorough dismantling of the Justice Department's position over the All Writs Act. The DOJ then appealed that ruling, which is now sitting in front of a non-magistrate judge, Margo Brodie. Part of the DOJ's argument made in the appeal was that this case is very different from the San Bernardino case, because in this case the phone is using iOS 7, which means that Apple already has the key to get in, and it doesn't require any further "burden" in terms of writing new code. As we noted at the time, this seemed to make the DOJ's case a little stronger.
However, now that the FBI has broken into the iOS 9 device (even if it claims the hack only works on some phones), it seems to argue in exactly the opposite direction. According to a number of security researchers, getting into an iOS 7 device is comparatively easy. Apple only really ratcheted up iOS security with default encryption in iOS 8 -- meaning that any decent forensics team should be able to get into the iOS 7 device without much difficulty. Remember, a key part of the (somewhat made up) test that the DOJ insists must be used in All Writs Act cases like this is whether or not the third party's help is "necessary."
And in this case, the DOJ insisted that it had no way into this phone without Apple's help:
If you can't read that, it's from the appeal, and it directly claims that the government "has explored the possibility of using third-party technologies" but says that it can't do so safely. That's difficult to believe in the wake of the DOJ's claims that it successfully got into Syed Farook's much more secure iPhone.
And yet... the NY case moves forward, with the DOJ making a new filing today giving Apple more time to respond to the government's application. It is, of course, possible that the DOJ will eventually drop this case as well, but its argument that it absolutely needs Apple's help here seems to be yet another misleading, or simply false, statement from the Justice Department.
Update: We've now added to the story that the DOJ is saying that CNN got the quote wrong, and the vulnerability applies to any iPhone 5C, which is more believable, but still raises questions. Original story, with a note appended, is below.
So late yesterday the Justice Department told magistrate judge Sheri Pym that it had successfully broken into Syed Farook's work iPhone and therefore no longer needed to continue with the court's order compelling Apple to write a new version of its iOS with security features removed. And then, in talking to the press, the DOJ apparently claimed the method only works for Farook's iPhone:
On Monday, the Department of Justice said the method only works on this particular phone, which is an iPhone 5C running a version of iOS 9 software.
Perhaps the CNN reporter who wrote this really meant "this particular type of phone," in which case the statement would be only marginally more believable, but the idea that it only applies to "this particular phone" makes absolutely no sense, and suggests the DOJ is flat out lying again. The only method that would work on just this one phone would be somehow finding Farook's passcode (perhaps he left it on a post-it somewhere?). But if that were the case, the DOJ wouldn't have asked for two weeks to "test" the method (even if it only took one week). Finding the passcode and testing it doesn't take that long. Update: A DOJ spokesperson says that CNN got the quote wrong and that the actual statement was that the crack only applied to iPhone 5C devices.
And if it's any other method, it must have wider applicability to other iPhones. It's possible, if unlikely, that the method in question only works on iPhone 5Cs running iOS 9, but if it's a true vulnerability, it likely affects many more devices. It is true that later hardware includes a chip called the Secure Enclave that might block certain vulnerabilities, but claiming that any such crack is limited to a single specific phone is ludicrous.
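To put in perspective why Apple's retry limits and auto-erase protections (the ones the court order originally asked Apple to disable) matter so much: the passcode space itself is tiny. As a rough back-of-envelope illustration — the ~80 ms per-attempt figure is taken from Apple's own security documentation, and this is not a claim about whatever method the FBI actually used:

```python
# Illustrative only: worst-case time to brute-force an all-digit iOS
# passcode, assuming Apple's published figure of roughly 80 ms of
# key-derivation time per on-device attempt, with no retry delays or
# auto-erase in the way.
ATTEMPT_SECONDS = 0.08  # assumed per-guess cost (Apple iOS security docs)

def brute_force_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

print(f"4-digit passcode: {brute_force_hours(4) * 60:.0f} minutes")
print(f"6-digit passcode: {brute_force_hours(6):.0f} hours")
```

A 4-digit passcode falls in under a quarter of an hour at that rate, which is exactly why the escalating delays and the ten-tries-and-erase setting — not the passcode itself — are the real barrier the FBI wanted removed.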
And, of course, as we mentioned in the original post, if the DOJ really did find a vulnerability and refuses to share it with Apple, then the Justice Department is making us all less safe by refusing to reveal a potential security flaw that may impact tons of people. And then it's also lying about it publicly. Not a good look, but an all too typical one, unfortunately.
So it appears that the mainstage event over the DOJ's ability to force Apple to help it get around the security features of an iPhone is ending with a whimper, rather than a bang. The DOJ has just filed an early status report saying basically that it got into Syed Farook's work iPhone and it no longer needs the court to order Apple to help it comply by writing a modified version of iOS that disables security features.
The government has now successfully accessed the data stored on Farook's iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court's Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016.
There's also an associated one-line proposed order that magistrate judge Sheri Pym will almost certainly sign off on shortly.
And thus... the big showdown between the tech industry and the Justice Department goes nowhere. Just a little over a month after the DOJ swore to a court that it had exhausted all possibilities that didn't involve co-opting Apple to hack its own phones, the DOJ is admitting that the FBI has found a way in. Still, this was just one fight in a war that is still ongoing. It seems fairly clear that the DOJ and FBI expected their side of things to get a lot more support, which is why they chose the Syed Farook case to make a big public stand, rather than one of the many other cases where similar issues are at stake.
However, the overall issue is not over. There are still plenty of questions: What method did the DOJ use to get into Farook's iPhone? What will happen in the other cases involving iPhones, or involving other companies such as WhatsApp? And what will happen as Apple and other companies increasingly strengthen their encryption and security, making it more and more difficult for the FBI to get in?
In short, this is far from over. However, in the short term, the DOJ has learned that it isn't easy to win over public opinion on this issue, which suggests that future battles may play out under the cover of a bit more darkness, as the DOJ seeks to seal various filings and orders off from the public. My guess is that perhaps the next big fight will be in revealing what kinds of orders come through under the cover of darkness.