Remember, this was the same iPhone that the DOJ and the FBI said was critical in their investigation. This is the same iPhone that the San Bernardino District Attorney, Michael Ramos, insisted could be hiding evidence of a "dormant cyber pathogen" destined to destroy San Bernardino County's computer network.
And, in the end, it appears that (as everyone expected) there was nothing on it. At all.
This isn't surprising. As the FBI freely admitted all along, Farook and his wife had destroyed their own personal iPhones, and the only remaining one was this one, which was Farook's work phone. If there had been anything that sensitive on it, you'd have figured they would have destroyed it too. And, of course, others had noted that the FBI's choice of words after getting in pretty clearly suggested there was nothing useful on the phone.
But... in the meantime, the FBI has now made it clear that at least certain iPhone models have a massive vulnerability in them that potentially puts millions of iPhone users at risk. And the FBI has shown no interest in telling Apple where this vulnerability exists.
In short, the FBI, who is supposed to keep us safe from crime, broke into an iPhone which helped it solve no crime, but almost certainly has put millions of people at risk for potential criminal attacks in the future. Why does anyone think this is a good result?
Ever since the FBI announced that it had found its own way into Syed Farook's iPhone, people have been wondering exactly how it managed to do so, and how many people the exploit puts at risk. Unsurprisingly, the agency declined to share any details with Apple and tried to downplay the possibility that they'd be breaking into phones left and right — despite pretty quickly entertaining the idea of doing exactly that. Now, following a discussion with Director James Comey last night, we have some more... well... I don't think you can exactly call them "details", but:
"We're having discussions within the government about, okay, so should we tell Apple what the flaw is that was found?" Comey said. "That’s an interesting conversation because you tell Apple and they’re going to fix it and then we’re back where we started from."
Comey said that it is possible that authorities will tell Apple, but "we just haven’t decided yet."
That's an interesting way of putting it. It seems Comey has forgotten "where we started from", because not that long ago he was still insisting that this had nothing to do with setting a precedent or getting into other phones in the future and was all about pursuing every lead in this one case. Well, that lead has now been pursued and the phone in question cracked, so Comey's "back where we started" comment only makes sense if (shocker) this really was about a lot more than one phone.
Comey went on to downplay the applicability of whatever exploit they are using:
While Comey did not disclose the outside group’s method in his remarks Wednesday, he said it would only be useful on a select type of devices — specifically, the iPhone 5C, an older model released more than two years ago.
"The world has moved on to [iPhone] 6’s," Comey said. "This doesn’t work in 6S, this doesn’t work in a 5S. So we have a tool that works on a narrow slice of phones. … I can never be completely confident, but I’m pretty confident about that."
Of course, the 5C still accounts for around 5% of iPhones, which may be a "narrow slice", but that's likely of little comfort to the many people using them who now know their device contains a potential security exploit which the FBI is refusing to protect them from. Because that's the point: if the 5C is hackable, that means a bunch of people are at risk and not just from law enforcement overreach. The right thing to do when you've discovered such a vulnerability is report it so it can be fixed — that's pretty much the dividing line between white hat and black hat hacking. By keeping mum on the details, the FBI is leaving a known security vulnerability in the wild. Oh, but Comey's not worried about that:
Comey did not seem concerned that the method for accessing Farook’s iPhone would be revealed by the outside group that helped them.
"The FBI is very good at keeping secrets, and the people we bought this from, I know a fair amount about them, and I have a high degree of confidence that they are very good at protecting them," he said.
He only identified this group as "someone outside the government" and said "their motivations align with ours."
Firstly, this presupposes that the exploit will never be found by anyone else (and hasn't been already). Secondly, isn't his allusion to the FBI's mysterious assistants a bit unnerving? Yes, there are security researchers who focus on selling what they find to governments and law enforcement agencies when they need to hack something, instead of revealing the vulnerabilities they discover and helping to close them — which many would already see as a problem. But I guess we are supposed to be comforted that the FBI knows a "fair amount" about these non-governmental hackers, and that their "motivations" align (and don't include doing everything possible to help the public secure their devices and keep their data safe). To protect and serve indeed.
Remember how the FBI insisted over and over again that the case in San Bernardino was not about setting a precedent and was totally about getting into "just that one phone?" Of course, no one believed it, but pay close attention to what's happening now that the FBI was able to hack into Syed Farook's work iPhone. The DOJ has also said that the crack was limited to just that type of phone and probably wasn't widely applicable. However, at the same time, the Justice Department probably has no interest in sharing the details of the vulnerability with Apple:
The FBI may be allowed to withhold information about how it broke into an iPhone belonging to a gunman in the December San Bernardino shootings, despite a U.S. government policy of disclosing technology security flaws discovered by federal agencies.
Under the U.S. vulnerabilities equities process, the government is supposed to err in favor of disclosing security issues so companies can devise fixes to protect data. The policy has exceptions for law enforcement, and there are no hard rules about when and how it must be applied.
Apple Inc has said it would like the government to share how it cracked the iPhone security protections. But the Federal Bureau of Investigation, which has been frustrated by its inability to access data on encrypted phones belonging to criminal suspects, might prefer to keep secret the technique it used to gain access to gunman Syed Farook's phone.
FBI: You should do it, it's just one phone
Apple: No it isn't
FBI: We got in
Apple: You should say how, it's just one phone
FBI: No it isn't
Meanwhile, the DOJ may not be interested in helping Apple patch that hole, but it is apparently at least willing to look into other cases where it can help law enforcement break into locked iPhones. There are some (somewhat conflicting) reports saying that the FBI has agreed to help prosecutors in Arkansas try to get into a couple of iOS devices in a murder case there. Of course, it may not be the same technique or situation (and the FBI might not be able to get in, either).
However, this does show just how eager law enforcement is to get into lots of phones, and how important it is that Apple actually be able to protect its users from those who do not have legitimate reasons to hack into phones. It's too bad that the FBI is apparently choosing to hold onto the info that helps it in a few cases while failing to protect the rest of the public who may use Apple devices.
The questions raised by the DOJ announcing that it was, in fact, able to get into Syed Farook's work iPhone continue to grow. The latest is that, if it could get into that phone, running the fairly secure iOS 9, why is it still fighting the case in NY where it's trying to get into a drug dealer's phone running iOS 7? As you may recall, the case in NY has been going on for longer than the San Bernardino one. It started back in October when the DOJ demanded Apple's help in getting into the iPhone of Jun Feng (a drug dealer who admitted guilt, but who claims he forgot the passcode) and magistrate judge James Orenstein stepped in to ask Apple if this was a reasonable request.
Then, earlier this month, Orenstein wrote a pretty thorough dismantling of the Justice Department's position over the All Writs Act. The DOJ then appealed that ruling, which is now sitting in front of a non-magistrate judge, Margo Brodie. Part of the DOJ's argument made in the appeal was that this case is very different from the San Bernardino case, because in this case the phone is using iOS 7, which means that Apple already has the key to get in, and it doesn't require any further "burden" in terms of writing new code. As we noted at the time, this seemed to make the DOJ's case a little stronger.
However, now that the FBI has broken into the iOS 9 device (even if it claims the hack only works on some phones), it seems to argue in the exact opposite direction. Getting into an iOS 7 device is comparatively quite easy, according to a bunch of folks. Apple really only ratcheted up the security in iOS with default encryption in iOS 8 -- meaning that any decent forensics team should be able to get into the iOS 7 device without much difficulty. Remember, a key part of the (somewhat made up) test that the DOJ insists must be used in All Writs Act cases like this is whether or not the third party's help is "necessary."
And in this case, the DOJ insisted that it had no way into this phone without Apple's help:
If you can't read that, it's from the appeal, and it directly claims that the government "has explored the possibility of using third-party technologies" but says that it can't do so safely. That's difficult to believe in the wake of the DOJ's claims that it successfully got into Syed Farook's much more secure iPhone.
And yet... the NY case moves forward, with the DOJ making a new filing today giving Apple more time to respond to the government's application. It is, of course, possible, that the DOJ will eventually drop this case as well, but its argument that it absolutely needs Apple's help here seems to be yet another misleading, or simply false, statement from the Justice Department.
Update: We've now added to the story that the DOJ is saying that CNN got the quote wrong, and the vulnerability applies to any iPhone 5C, which is more believable, but still raises questions. Original story, with a note appended is below.
So late yesterday the Justice Department told magistrate judge Sheri Pym that it had successfully broken into Syed Farook's work iPhone and therefore no longer needed to continue with the court's order compelling Apple to write a new version of its iOS with security features removed. And then, in talking to the press, the DOJ apparently claimed the method only works for Farook's iPhone:
On Monday, the Department of Justice said the method only works on this particular phone, which is an iPhone 5C running a version of iOS 9 software.
Perhaps the CNN reporter who wrote this really meant "this particular type of phone," in which case the statement would be only marginally more believable, but the idea that it only applies to "this particular phone" makes absolutely no sense, and suggests the DOJ is flat out lying again. The only way in that works with just this phone would be magically finding Farook's passcode (perhaps he left a post-it somewhere?). But if that was the case, the DOJ wouldn't have asked for two weeks to "test" the method (even if they only took one week). Finding the passcode and testing it doesn't take that long. Update: A DOJ spokesperson says that CNN got the quote wrong and that the actual statement was that the crack only applied to iPhone 5C devices.
And if it's any other method, it must have wider applicability to other iPhones. It's possible, if unlikely, that the method in question only works on iPhone 5Cs running iOS 9, but if it's a true vulnerability, it's likely that it impacts much more. It is true that later versions of the hardware include a chip called the Secure Enclave that might get in the way of certain vulnerabilities, but claiming that any such crack is limited to a specific phone is ludicrous.
And, of course, as we mentioned in the original post, if the DOJ really did find a vulnerability and refuses to share it with Apple, then the Justice Department is making us all less safe by refusing to reveal a potential security flaw that may impact tons of people. And then it's also lying about it publicly. Not a good look, but an all too typical one, unfortunately.
So it appears that the mainstage event over the DOJ's ability to force Apple to help it get around the security features of an iPhone is ending with a whimper, rather than a bang. The DOJ has just filed an early status report saying basically that it got into Syed Farook's work iPhone and it no longer needs the court to order Apple to help it comply by writing a modified version of iOS that disables security features.
The government has now successfully accessed the data stored on Farook's iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court's Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016.
There's also an associated one-line proposed order that magistrate judge Sheri Pym will almost certainly sign off on shortly.
And thus... the big showdown between the tech industry and the Justice Department goes nowhere. Just a little over a month after the DOJ swore to a court that it had exhausted all possibilities that didn't involve co-opting Apple to hack its own phones, the DOJ is admitting that the FBI has found a way in. Still, this was just one fight in a war that is still ongoing. It seems fairly clear that the DOJ and FBI expected their side of things to get a lot more support, which is why they chose the Syed Farook case to make a big public stand, rather than one of the many other cases where similar issues are at stake.
However, the overall issue is not over. There are still plenty of questions: What method did the DOJ use to get into Farook's iPhone? And what will happen in the other cases involving iPhones or involving other companies such as Whatsapp? And what will happen as Apple and other companies increasingly strengthen their encryption and security, making it more and more difficult for the FBI to get in?
In short, this is far from over. However, in the short term, the DOJ has learned that it isn't easy to win over public opinion on this issue, which suggests that future battles may play out under the cover of a bit more darkness, as the DOJ seeks to seal various filings and orders off from the public. My guess is that perhaps the next big fight will be in revealing what kinds of orders come through under the cover of darkness.
When the DOJ announced that the FBI may have miraculously found a way into Syed Farook's work iPhone after swearing to a court that such a thing was impossible, many people zeroed in on the possibility of "NAND Mirroring" as the technique in question. After all, during a Congressional hearing, Rep. Darrell Issa had gone fairly deep technically (for a Congressperson, at least) in asking FBI Director James Comey if the FBI had tested such a method. Well-known iPhone forensics guru Jonathan Zdziarski wrote up a good blog post explaining why such a technique was the most likely. While recognizing that there are other possibilities, he does a good job breaking down why none of the other possibilities are all that likely, given a variety of facts related to the case (I won't go through all of that -- just go read his post). It's worth a read. It also has a nice quick explanation of NAND mirroring:
This is where the NAND chip is typically desoldered, dumped into a file (likely by a chip reader/programmer, which is like a cd burner for chips), and then copied so that if the device begins to wipe or delay after five or ten tries, they can just re-write the original image back to the chip. This technique is kind of like cheating at Super Mario Bros. with a save-game, allowing you to play the same level over and over after you keep dying. Only instead of playing a game, they’re trying different pin combinations.
However, on Friday, we noted that FBI Director James Comey was already denying this was the method, saying that it "doesn't work." The FBI also "classified" the method in question which raised some additional eyebrows. Either way, Zdziarski was pretty sure that Comey's claim that NAND mirroring doesn't work was bogus:
FBI Director Comey, in a press conference, claims the NAND technique “doesn’t work”; this says more about the credibility of this information than anything. Every expert I’ve consulted (including three hardware forensics firms) believe it works, and multiple firms are still in the process of validating the technique. The amount of time to prep and test this technique alone is proving greater than the month that we’ve been discussing it – it’s very unlikely that any reputable source could have already discredited this method, given how much time and effort it is taking everyone else to fully flesh out and test it. When asked directly if the FBI tried this technique, Comey dodged the question and replied (on the topic of “chip copying”), “I don’t want to say beyond that”, indicating the FBI hadn’t tried it. This speaks volumes about how flippantly the FBI is willing to discount viable methods endorsed by numerous researchers.
This is a simple “concept” demonstration / simulation of a NAND mirroring attack on an iOS 9.0 device. I wanted to demonstrate how copying back disk content could allow for unlimited passcode attempts. Here, instead of using a chip programmer to copy certain contents of the NAND, I demonstrate it by copying the data using a jailbreak. For Farook’s phone, the FBI would remove the NAND chip, copy the contents into an image file, try passcodes, and then copy the original content back over onto the chip.
I did this here, only with a jailbreak: I made a copy of two property lists stored on the device, then copied them back and rebooted after five attempts. When doing this on a NAND level, actual blocks of encrypted disk content would be copied back and forth, whereas I’m working with files here. The concept is the same, and serves only to demonstrate that unlimited passcode attempts can be achieved by back-copying disk content. Again, NO JAILBREAK IS NEEDED to do this to Farook’s device, as the FBI would be physically removing the NAND to copy this data.
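The core idea Zdziarski demonstrates -- snapshot the storage, burn through passcode guesses, and restore the snapshot before the wipe threshold is hit -- can be sketched as a toy simulation. To be clear, this is my own illustrative model, not real iPhone internals: the `ToyDevice` class and its attempt counter are hypothetical stand-ins for the phone's NAND state.

```python
import copy

# Toy model of NAND mirroring (hypothetical, NOT real iPhone internals):
# the "device" wipes itself after 10 failed passcode attempts, but the
# failed-attempt counter lives on the same storage being mirrored, so an
# attacker who can snapshot and re-flash that storage never hits the wipe.

WIPE_AFTER = 10  # iOS can be set to wipe after 10 failed attempts


class ToyDevice:
    """Stand-in for the phone: a secret passcode plus NAND-resident state."""

    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0  # stored on "NAND", so it gets mirrored too
        self.wiped = False

    def try_passcode(self, guess):
        if self.wiped:
            raise RuntimeError("device has been wiped")
        if guess == self._passcode:
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= WIPE_AFTER:
            self.wiped = True  # the security feature the FBI feared
        return False


def brute_force_with_mirroring(device):
    """Snapshot the 'chip', try every 4-digit code, re-flash before the wipe."""
    snapshot = copy.deepcopy(device.__dict__)  # the NAND image on the bench
    for guess in (f"{n:04d}" for n in range(10000)):
        if device.try_passcode(guess):
            return guess
        if device.failed_attempts == WIPE_AFTER - 1:
            # One failure away from a wipe: restore the original image,
            # resetting the counter, and keep going with fresh guesses.
            device.__dict__ = copy.deepcopy(snapshot)
    return None
```

The point of the sketch is the save-game analogy from the quote above: because the attempt counter is part of the copied state, restoring the image resets it, and a 4-digit passcode falls to at most 10,000 guesses no matter what the wipe setting is.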
Elsewhere Zdziarski also points out that, despite the FBI insisting that it was reaching out to everyone who might be able to help, none of the top researchers in the space have been approached by the FBI (and apparently a few who reached out the other way were rebuffed). Once again, it looks like whatever the FBI is doing with the phone, it's not being particularly upfront with the public (or, potentially, the courts).
As we pointed out earlier this week, it's pretty obvious that the Justice Department lied to a federal magistrate judge in saying that it had exhausted all possible opportunities to get into the work iPhone of Syed Farook, given that it has now put the case about it on hold to test out a "new way" to get into the phone. The DOJ had made a filing claiming that Apple's help was the only way to get into the phone, yet now is saying that's probably not true. However, the FBI is insisting that the DOJ wasn't lying. In a letter to the Editor at the Wall Street Journal, FBI Director James Comey reacts angrily to a similar opinion piece at the WSJ suggesting the DOJ lied:
You are simply wrong to assert that the FBI and the Justice Department lied about our ability to access the San Bernardino killer’s phone. I would have thought that you, as advocates of market forces, would realize the impact of the San Bernardino litigation. It stimulated creative people around the world to see what they might be able to do. And I’m not embarrassed to admit that all technical creativity does not reside in government. Lots of folks came to us with ideas. It looks like one of those ideas may work and that is a very good thing, because the San Bernardino case was not about trying to send a message or set a precedent; it was and is about fully investigating a terrorist attack.
James B. Comey
It's difficult to take much of that at face value -- especially as the government continues to push for similar court orders in other cases. And especially as Comey has been whining on and on about "going dark" for well over a year and a half now. At the very least, it does seem clear that the FBI failed to truly explore all possible options. As some iPhone forensics folks have noted, if this were truly a brand new solution, the FBI would need a hell of a lot more than two weeks of testing to make sure it really worked.
In the meantime, I'd heard from a few folks, and now others are reporting as well, that the assumptions that many had made about the Israeli company Cellebrite providing the solution are simply not true -- along with the idea that the solution involves reflashing the chip. The FBI itself now says it's a "software-based" solution.
FBI Director James Comey, in response to a reporter's question at a briefing, said making a copy of the iPhone’s chip in an effort to circumvent the password lockout “doesn’t work.” Comey wouldn't identify the company that's helping it or discuss details of the technique.
Law enforcement officials speaking on background debunked another report that had named Israeli forensics firm Cellebrite as the mystery firm helping it break into the phone.
Of course, this is after Cellebrite got a ton of free publicity from press reports claiming that it was the company (all of which was based on a few rumors from within the forensics world):
At this point it's not clear that you can trust the FBI or DOJ on anything about these issues, as they've managed the messaging very, very carefully, and at times have made statements that are somewhere in that gray zone between "misleading" and "outright lies." But Comey's actions over the last year and a half make it quite clear that this is not just about this one iPhone and he very, very much wants a precedent that will effectively stop the possibility of encryption that the FBI can't easily circumvent.
So now that there's been a little time to process the Justice Department's last minute decision to bail out on the hearing in the San Bernardino case, claiming it was because some mysterious third party had demonstrated a way to hack into Syed Farook's iPhone, it's becoming increasingly clear that (1) the DOJ almost certainly lied at some point in this case and (2) this move was almost entirely about running away from a public relations battle that it was almost certainly losing (while also recognizing that it had a half-decent chance of also losing the court case). Just replace "Sir Robin" with "the DOJ" in the following video.
That said, there are still some things to clear up. First, did the DOJ lie? It seems pretty obvious that it must have. After all, it insisted earlier in the case, multiple times, that it had "exhausted" all other possibilities and "the only" way to get into the phone was with Apple's help. That's certainly raised some eyebrows:
The DOJ and its supporters, of course, will argue that "new shit has come to light, man," but that seems... doubtful. My first thought was that when the FBI said that it had been alerted to a way in over the weekend, it potentially was using the announcement from researchers at Johns Hopkins about a flaw in iMessage encryption. If so, that would be particularly bogus, since everyone admits that the vulnerability found would not apply to this case.
However, there's now a ton of speculation going around about the likely method (and the likely third party) that the FBI is probably using, involving copying the storage off the chip and then copying it back to brute force the passcode without setting off the security features or deleting the data. But, again, this possible solution isn't really new. Just a few weeks ago, during a Congressional hearing, Rep. Darrell Issa quizzed FBI Director James Comey about this very technique (which was so deep in the technical weeds, that many reporters and other policy folks were left scratching their heads):
That video is worth watching, because Director Comey insists, pretty clearly, that there is no way to get into the phone:
Comey: We wouldn't be litigating it if we could [get in ourselves]. We've engaged all parts of the US government to see 'does anyone have a way -- short of asking Apple to do it -- with a 5c running iOS 9 to do this?' and we do not.
At that point Issa starts asking really technical questions about whether the FBI could remove the memory from the phone, make copies of the storage, pair it with the encryption chip, try passcodes, and then reflash the memory before the 10 chances are used up -- thus brute forcing the passcode without setting off the security features. As Issa notes:
If you haven't asked that question, how can you come before this committee and before a federal judge and demand that somebody else invent something if you can't answer the question that your people have tried this? ... I'm asking who did you go to? Have you asked these questions? Because you're expecting to get an order and have somebody obey something they don't want to do and you haven't even figured out if you can do it yourself.
Comey is clearly befuddled by the questions and basically says that he's sure that his people must have thought about this, but he assumes that they're watching and if they haven't thought of this then they'll test it out. But, really, a few people had suggested similar things early on, so if that is the solution then it only adds weight to the idea that the FBI didn't do everything it could possibly do before running to the judge.
Others have questioned the "two week" timeframe for the DOJ to issue a status report to the court, noting that a brand new solution would almost certainly take much longer to test thoroughly before using it on the iPhone in question.
And then there's the other question: if the FBI really has tracked down a new "vulnerability" in Apple's encryption... will it tell Apple about it so that Apple can patch it? Remember, the White House has told the various parts of the federal government that they should have a "bias" towards revealing the flaws so they can be patched... but leaving a "broad exception for 'a clear national security or law enforcement need.'" It's pretty clear from how the DOJ has acted that it believes this kind of hole is a "law enforcement need."
So, if the FBI really did figure out a vulnerability in Apple's encryption, it probably won't actually reveal it -- but I'd imagine that Apple's security engineers are scrambling just the same to see if they can patch whatever flaws there may be here, because that's their job. And, again, that gets back to the point here: there are always some vulnerabilities in encryption schemes, and part of the job of security folks is to keep patching them. And one of the worries with the demand for backdoors is that they introduce a whole bunch of vulnerabilities that they're then not allowed to patch.
Either way, the DOJ's actions here are highly questionable, and it seems pretty clearly an attempt to save face in this round. But the overall fight is far from over.
For many years, if you mentioned the term "cybersecurity czar" in the federal government, it only meant Richard Clarke. He was one of the earliest people to focus on computer security as an issue, and as such, became an advisor to multiple presidents on the issue. I haven't always agreed with him -- there have been points in the past where he's appeared as a leading voice in support of greater surveillance and exaggerated claims about a coming "cyberwar". However, in the past few years, Clarke has become much better on these issues, warning (just prior to Snowden's revelations) that the US's focus on surveillance has actually made the public less safe by leaving vulnerabilities open, rather than closing them.
And now he's weighed in, quite vocally, on the whole Apple v. the FBI thing, strongly in support of Apple. To some extent, this isn't a huge surprise as he was among a large group of smart folks who signed onto a letter a year ago opposing encryption backdoors, but his NPR interview gave him a chance to be quite explicit in just how dumb the FBI/DOJ's requests are. It's worth listening to the whole thing, or at least reading the transcript, but here are a few key highlights. Specifically, he argues that the FBI is lying in saying that it can't get access to the content on the phone, and just wants to set a precedent:
If I were in the job now, I would have simply told the FBI to call Fort Meade, the headquarters of the National Security Agency, and NSA would have solved this problem for them. They're not as interested in solving the problem as they are in getting a legal precedent.... Every expert I know believes that NSA could crack this phone. They want the precedent that the government can compel a computer device manufacturer to allow the government in.
Earlier in the interview, he totally dismisses the idea that there's a big dispute in the administration about this, saying that it's just the FBI and DOJ exaggerating:
Well, I don't think it's a fierce debate. I think the Justice Department and the FBI are on their own here. You know, the secretary of defense has said how important encryption is when asked about this case. The National Security Agency director and three past National Security Agency directors, a former CIA director, a former Homeland Security secretary have all said that they're much more sympathetic with Apple in this case. You really have to understand that the FBI director is exaggerating the need for this and is trying to build it up as an emotional case, organizing the families of the victims and all of that. And it's Jim Comey and the attorney general is letting him get away with it.
It's good to see more officials speaking out and calling bullshit on the FBI/DOJ claims on all of this.