The question is: at what point is the lock on the door overkill?
For software security, the answer is "when it requires so much processor power that it significantly slows down other functionality". The issue doesn't really arise for the modern generation of smartphones.
My first thought was that when the FBI said it had been alerted to a way in over the weekend, it was potentially referring to the announcement from researchers at Johns Hopkins about a flaw in iMessage encryption. If so, that would be particularly bogus, since everyone admits that the vulnerability found would not apply to this case.
It still provides them with a smokescreen to cover their retreat -- most people are only going to remember "somebody found a flaw in iPhone security and then the FBI said they don't need Apple to unlock the phone".
Wrong again. Tying up the staff that is capable of writing a new OS would obviously have yuuuge (to quote your intellectual peer among the current crop of candidates) implications (e.g. delaying the next iteration of the legitimate iOS because of the diversion of skilled expertise to FBiOS).
Well, then, you need to start reading stuff that would educate you about the case. Apple has repeatedly pointed out that several highly trained engineers would have to put in a few weeks of work:
According to the declaration of Apple manager of user privacy Erik Neuenschwander, it could take six to ten engineers anywhere from two to four weeks to design, create and deploy the requested software.
The doctrine of force majeure (overwhelming circumstances beyond our control make compliance impossible) is a well-established legal doctrine. For example, if the airport is shut down by a blizzard, the airlines can cancel their flights. Under Whatever's understanding (if one may dignify it with that word) of the law, he can demand that they get him to his destination on time like it says on his ticket.
Nope. Given the mood in Silicon Valley, the more likely scenario is "I quit." followed by a tough decision between a half-dozen different employment offers each sending the message "I like the cut of your jib".
Even somebody who learned everything they know about the law from CSI: Wherever and Ace Attorney can see through the obvious absurdity of the "Apple can keep the backdoor code secure so it won't get out" argument. If the Feds actually allowed that (or pretended to allow that) it would open the door to scenarios like:
Feds: We've got a search warrant for this phone, but we can't get in. Can you help?

Me: (Sees owner's name engraved on phone back, and recognizes it as that of the asshole who stole my girlfriend, ran over my cat, and keyed my car) Sure! There's just one caveat, though -- I can't let you look at the code I'll be using because it might release a dangerous cyber pathogen.

Feds: Well... OK.

(later)

Feds: Geezus Q. Christ! We thought this guy was just stealing credit card numbers, and it turns out that he's the world's biggest kiddie-porn meth-lab jihadist ringleader!
If they had destroyed this phone like the others, we wouldn't be having this discussion.
Obviously, they filled this phone with phony evidence designed to lead the authorities on wild goose chases and discredit them (well, discredit them worse than they already are). The clear conclusion is that the phone should be left locked and the case should be dismissed. There is no evidence of this, but I wouldn't call it a total hypothetical.