FBI's Scorched Earth Approach To Apple Means That Tech Companies Now Have Even Less Incentive To Help Feds

from the stupid-and-shortsighted dept

On Friday, we debunked a key FBI talking point, which the press has been parroting, that Apple had helped the FBI in 70 previous cases, and only changed its mind now for “marketing” or “business model” reasons. As we explained, that’s not even remotely true. In the past, Apple helped out because it had access to the content, and so it got it and turned it over following a lawful search warrant/court order. In this case, the situation is entirely different. Apple does not have access to the content that the FBI wants, and is now being forced to create a backdoor — build an entirely revamped operating system — that undermines some key security features found on iPhones today. That’s quite different.

But here’s something we didn’t point out — something that was highlighted by Chris Soghoian. The FBI’s scorched earth policy in pushing that talking point is really going to backfire in a big, bad way. The lesson the entire tech industry is going to take from this is: if you ever help the FBI and then push back later, they’ll use your earlier cooperation against you.

Indeed, the FBI is going on and on in the press and in court filings pointing to those “previous” times, as if it’s proof that Apple has suddenly “changed its mind” for “marketing” reasons. From the motion to compel on Friday:
Given all that, why would any tech company voluntarily help the FBI going forward? It’s only going to be used against them if they ever protest an overbroad request. We keep hearing government folks say that Silicon Valley has to stop treating the government as an adversary — and I think the only proper response to that is: you first.

Comments on “FBI's Scorched Earth Approach To Apple Means That Tech Companies Now Have Even Less Incentive To Help Feds”

That One Guy (profile) says:

Gratitude, FBI style

‘Thanks for going above and beyond what you legally had to do to help us before. As thanks for that help, we’re going to use it against you, both in court and in public, should you ever decline one of our ‘requests’ for assistance in the future.’

Yeah, with how the FBI has used the fact that Apple has helped them in the past against the company, tech companies would have to be incredibly stupid to do anything similar in the future.

Demand a legal order from a judge before offering any assistance, and assuming the order is acceptable do exactly what it says and nothing more.

Even if the FBI/DOJ ‘win’ this case, they have ensured that the level of willing cooperation they’re likely to receive from companies in the future is going to be dramatically less. If they thought the reception they received from companies they requested help from before was chilly, it’s going to be downright arctic now, and they have only their own actions to blame.

WDS (profile) says:

Re: Gratitude, FBI style

I don’t think in the previous cases Apple went above and beyond what they legally had to do. They received either a subpoena, a warrant, or a court-ordered “All Writs” request, had the information, and provided it. Those were legal, reasonable requests.

What they didn’t do was write a back door to their phones. There is a vast difference between providing information you have, and creating something that doesn’t exist.

nasch (profile) says:

Re: Re: Gratitude, FBI style

I don’t think in the previous cases Apple went above and beyond what they legally had to do. They received either a subpoena, a warrant, or a court-ordered “All Writs” request, had the information, and provided it.

Then that’s even worse. That means to avoid having cooperation thrown back in their faces later, companies have to actively resist court orders from the FBI. Every. Single. Time. Is this really what they wanted to accomplish?

Anonymous Anonymous Coward says:

Re: Gratitude, FBI style

From their point of view they did nothing wrong, so they have nothing to hide. Interesting how mesmerized they get by shiny stuff. Even still, those things they’re not hiding… you’re not allowed to see.

If your behavior doesn’t help us, then expect us to help ourselves to whatever you have, whether we need it or not. Hmm, what are you hiding there?

Liquid nitrogen isn’t cold enough to chill a government in full tyranny mode, and it appears ours is, or is approaching that rapidly. Dismantle and re-boot is the way to go, and all the heat of the sun may be necessary. Next time, make some rules about corruption.

Anonymous Coward says:

Re: Re: Gratitude, FBI style

Next time, make some rules about corruption.

The current government has these rules that it is meant to follow, called the Constitution and the Bill of Rights. The way they are following those rules suggests that you are an optimist in thinking that rules will limit tyrants.

Anonymous Anonymous Coward says:

Re: Re: Re: Gratitude, FBI style

I am optimistic that a system can be devised that will make it harder for tyranny to grow, which assumes that we start with some non-tyrannical government types, elected, appointed, or just plain hired.

I am pessimistic that a perfect system does or can exist.

Maybe the best way would be to have a more accessible re-set button that the populace can execute, and/or an automated system whereby lying to the government, or by the government, or by politicians (campaign promises not followed) causes a personal automatic reset (proof of the lie by anyone, and your position and right to hold any office or work for any government is lost), or something.

Anonymous Coward says:

Re: Re: Re:2 Gratitude, FBI style

And the halls of government would be filled with crickets, tumbleweeds, and those peculiar types of passive-aggressive clerks who delight in looking up obscure rules with which to deny your existence.

…although the crickets would be amusing, if we could teach them to chirp in harmony.

Anonymous Anonymous Coward says:

Re: Re: Re:3 Gratitude, FBI style

Alternatively we could ban political parties. Not a new subject, several of the Framers as well as George Washington warned against those.

We could also remove money from politics. If representatives didn’t need to fundraise, we would remove a significant vector for corruption. The money needed to run all campaigns would be chicken feed compared to other budget ‘priorities’. Professional lobbyists are another concern.

We could get rid of PACs and corporate influences in government. Why should they get a seat at trade treaty negotiations and not the rest of us? Why should their money be able to speak louder than anyone else’s? Removing money from politics would go a long way in this area, but lots of lunches could sway weak elected people. Remove temptation.

We could make officials, elected, appointed, or otherwise, criminally liable for violating their oaths of office. Criminal penalties for lying to Congress, prosecutable immediately upon failure to comply or upon proof of a lie. Give those people term limits on government work, not just on their current jobs. Institutional knowledge is both good and very bad.

We could give Inspectors General the power they need to do their jobs.

We could remove politics from the Senate’s ‘advise and consent’ roles.

We could force Congress to have one issue and one issue only on each bill. With that, some legislatures require a reading of the entire bill, in some cases several times, prior to the vote, with a quorum in place. We could, with that in mind, limit the length of bills and simplify the language used. Along this line we could sunset every law every seven years. This has the dual impact of keeping legislatures busy (few new laws because we need to look busy) as well as culling all the unnecessary crap already in laws. No more riders added to must pass budget bills, etc.

We could eliminate the ability of legislators to ‘revise and extend’ their comments in the public record, where they clean up their comments for political or historical (I always want to amend that to hysterical for some reason) purposes.

We could force Congress to change their rules so that they cannot manipulate things at all, let alone easily. Committee passes it, bring it to a vote. Party head is Speaker? Come on. Get rid of the parties and elect a leader from the entire conclave, not from a group foisted upon some agenda. Committee chairs assigned by the majority party? How about an actual leader? This would cull a bunch of bickering that is agenda-based rather than constituent-based.

This list is not all inclusive nor is it meant to be final. Discuss it and other options.

There are lots of things that could be done if the political will existed and didn’t remove ‘power’ from those that have drunk the elixir.

I am not holding my breath, and still looking for that re-set button.

Anonymous Anonymous Coward says:

Re: Re: Re:5 Gratitude, FBI style

Then what is the justification for riders? They are one issue each and added to a bill that has one issue, but then automagically they become one passable bill. That has to stop.

The counter to that is that it becomes a de facto line-item veto. There are a lot of agenda-oriented folks that don’t want that.

My point is that Congress should not be able to make such a determination; there should be one subject per bill, period. Congress should not be able to make rules to game the game.

Anonymous Coward says:

Re: Re: Re:2 Gratitude, FBI style

You’re kidding, right?

This button already exists but “The People” have willfully grown ignorant of it. It’s called jury nullification.

The other one called the 2nd Amendment has been under assault by BOTH sides for quite some time.

The 2nd clearly states that “The People” should have the right to keep and bear firearms in case a Militia needs to be raised to fight enemies foreign or domestic.

The Declaration of Independence also states that it is the Duty of “The People” to throw off oppressive forms of government, and last time I checked there are still a lot of cowards that deserve NO SAY in the course of our nation because they have willfully given up their liberty for faux safety.

The US government as constructed by the founders is about as perfect as it gets. Any system with evil actors can be corrupted with sufficient amounts of time, control over the education system, and copious amounts of ignorant and cowardly people.

There is a reason for importing a lot of illegals into the USA, it is to water down the country so that it can be turned into even more of a joke of a country than it has already become.

nasch (profile) says:

Re: Re: Miranda Rights

Witnesses not so

Witnesses absolutely have 5th amendment rights, but as you say this isn’t a criminal proceeding which I think is the only time the 5th applies.

http://criminal.findlaw.com/criminal-rights/fifth-amendment-right-against-self-incrimination.html

The rights are not the “Miranda rights”; it’s a “Miranda warning” of your 5th Amendment rights. But that doesn’t mean only people who have received the warning have the rights. That could be the next thing on the agenda, though, after deciding you only have a right to silence if you explicitly invoke it.

Rattran (profile) says:

Just take a lesson from FOIA costs...

Apple just needs to apply FOIA cost math to the situation and inform the FBI that they can start working on the code needed as soon as the FBI covers estimated costs of $15,564,708,001.34. The code should be available to the FBI approximately 3 years after receipt of the cash.

Then send a hard copy of the firmware on A4 paper from a dot-matrix printer with an old ribbon.

Anonymous Coward says:

Re: Re: Re: Just take a lesson from FOIA costs...

But do they start printing illegibly when they get old?

Ever stuffed a cash register receipt into your pants pocket, and then pulled it back out a few hours or a day later? Tried to read it? Quite a few of those cash register receipts may still be printed on thermal paper.

DannyB (profile) says:

Re: Re: Just take a lesson from FOIA costs...

A printout is more appropriate than punch cards.

Punch cards are much easier to read by machine than a printout is, even if you have to build a new punch card reader from scratch.

Optically reading the printout, while possible, is a much higher hurdle. Enough so that it makes the FBI consider whether it is cheaper to employ a bunch of humans to hand key in the printed information.

Anonymous Coward says:

Here's the counter to Comey's "but it's a narrow targeted request" argument.

Smith v. Maryland was also a narrow, targeted request, and look how that decision has been used ever since. It really doesn’t matter if you are doing this TO set the legal precedent or not. What matters is that it WILL set a legal precedent, which WILL then be abused by the government regardless of whether that is why you are doing it.

Anonymous Coward says:

Re: Here's the counter to Comey's "but it's a narrow targeted request" argument.

In fact, just once I would like to see an interviewer ask Comey the tough question when he spouts this garbage…

“Ok, Mr. Comey, you say your intention is not to create precedent, what are you willing to do to ensure that the abuse of bad precedent is not allowed to occur? You say you don’t want to undermine the security, how are you going to keep this from undermining the security in the future?”

Whatever (profile) says:

“and is now being forced to create a backdoor”

You need to work on your definition of a backdoor. A backdoor would be “presto, here’s the data”. It would be “enter this 40 character string, and the phone spills its guts”. That is not what they are seeking. They are seeking the very limited concept of modifying the OS on a single phone to remove the 10-tries limit, as well as to remove an artificial processing delay of 5 seconds per try.

The phone will still be secure with those patches applied, and without further efforts to actually hack the phone (aka, pick the lock) they will not be able to read the data. If the user selected a long enough pincode in the encoding process, then it’s quite possible they will never get full access.

A backdoor would assure full access. There is no assurance here.
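
For a rough sense of what those two limits are worth against a numeric passcode, here is a back-of-the-envelope sketch in Python. The guess rate, the 5-second delay and the 10-try cap are illustrative assumptions used for the arithmetic, not Apple’s published figures.

# Back-of-the-envelope brute-force estimate for a numeric passcode.
# All parameters below are illustrative assumptions, not Apple's real numbers.

def brute_force_hours(digits, tries_per_second, per_try_delay=0.0):
    """Worst-case hours to exhaust every possible numeric passcode."""
    keyspace = 10 ** digits
    seconds_per_try = (1.0 / tries_per_second) + per_try_delay
    return keyspace * seconds_per_try / 3600

# With an OS-enforced 5-second delay per try (and ignoring the 10-try wipe,
# which would normally end the attack after ten guesses anyway):
print(brute_force_hours(6, tries_per_second=12.5, per_try_delay=5.0))  # ~1,411 hours (about two months)
# With the delay removed and passcodes fed in electronically:
print(brute_force_hours(6, tries_per_second=12.5))                     # ~22 hours
print(brute_force_hours(4, tries_per_second=12.5))                     # ~0.2 hours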

JD says:

Re: Backdoor definition

A backdoor is a mechanism — mathematical, hardware, or software — which bypasses or disables some or all security measures.

Are you seriously arguing that, because the court is only ordering Apple to remove all the security measures over which it currently has control, it’s not a backdoor?

If an ATM manufacturer updated their terminals so they only checked the last digit of a customer’s PIN, would you claim that’s not a backdoor because an attacker still has to guess one digit? Maybe the customer picked a less-popular number like ‘6’ or ‘0’, and statistically it’ll take an attacker longer to guess because it’s not ‘7’. Good to know you wouldn’t consider that to be a backdoor.

Uriel-238 (profile) says:

Re: Re: Thank you for addressing this.

It’s a nit I too wanted to pick.

Apple figured that iPhone users had poor password discipline and created a system to improve the security of those with poor passwords. Ergo, the system has a known vulnerability. One can argue that is a backdoor already in place.

The code that the FBI wants written is not a backdoor, but a hacking tool that exploits the vulnerability.

The backdoor / known vulnerability is already in place. To add a backdoor to encrypted data, you have to decrypt the data and re-encrypt the plaintext with the vulnerable crypto.

Chronno S. Trigger (profile) says:

Re: Re: Re: Thank you for addressing this.

Apple wouldn’t be building a backdoor into the encryption. It would be building a backdoor into the other security features in iOS. The vulnerability you speak of is only a way to install the backdoor. The hacking tool would be the software the FBI already has to brute force the password.

Tech lingo is a bitch, I hope that helped.

Uriel-238 (profile) says:

Re: Re: Re:2 Semantics

The other OS security feature, I would argue, is the vulnerability. It might have been less vulnerable for Apple simply to advise their end users to choose a 16+ digit AES password. Many wouldn’t do it, but neither Apple nor the Feds would know which ones.

The FBI’s brute-force software is also a hacking tool. Some of us want a vice-grip and a hammer with which to work.

Uriel-238 (profile) says:

Re: Re: Re:2 Ah, and yes, there are two vulnerabilities:

a) Using the PIN system to create the end-user password to be hashed with an internal TPM-held number, and trying to limit the guesses via software so that end users think they’re only responsible for a small, easy, numerical password…and,

b) Allowing for the phone to be updated with code digitally signed by Apple.

And this is not addressing vulnerabilities of the TPM itself, which I’d think the Feds would already have a tiger team tasked to discover.

Anonymous Coward says:

Re: Re:

Firstly, any time you deliberately create any mechanism to work around the security features of hardware or software, you are by definition creating a backdoor. This has been established repeatedly.

Secondly, this one phone has suddenly morphed into hundreds of phones, as law enforcement agencies around this country are now coming out of the woodwork, gleefully awaiting the precedent to be set so they can demand the same of Apple or other tech companies.

You are a blatantly dishonest person, Whatever. No ifs, ands or buts about it. Honestly, it’s the only thing I expect from authoritarian people such as yourself.

Whatever (profile) says:

Re: Re: Re:

“Secondly, this one phone has suddenly morphed into hundreds of phones, as law enforcement agencies around this country are now coming out of the woodwork, gleefully awaiting the precedent to be set so they can demand the same of Apple or other tech companies.”

That’s the narrative they would like you to believe. Notice the list Mike put up didn’t include any links or backing info. It doesn’t describe what they were looking for (or why). It does seem very reasonable, considering that almost everyone carries a phone these days, that law enforcement would want to look into a small number of devices relative to ongoing investigations.

“Firstly, any time you deliberately create any mechanism to work around the security features of hardware or software, you are by definition creating a backdoor. This has been established repeatedly.”

No, it’s been repeated loudly often enough, but it’s still crap. Removing the two “features” in the OS does not suddenly generate free access to the phone.

From Wiki:

“A backdoor is a method, often secret, of bypassing normal authentication in a product, computer system, cryptosystem or algorithm etc. Backdoors are often used for securing unauthorized remote access to a computer, or obtaining access to plaintext in cryptographic systems.”

Nothing in the FBI request makes a back door by this definition. The normal authentication on the phone would still be in place; only the artificial limits on the number of tries and the speed of tries would be disabled. A true backdoor would allow access without authentication, which is just NOT the case here.

So you can repeat “backdoor” a million times, but it’s just not what is being sought here.

Anonymous Coward says:

Re: Re: Re: Re:

Why not quote the second paragraph of the Wikipedia article?

Here, I’ll do it for you, and I’ll even highlight the relevant passage describing the similarities to what Apple is dealing with here.

A backdoor may take the form of a hidden part of a program,[1] a separate program (e.g. Back Orifice may subvert the system through a rootkit), or may be a hardware feature.[2] Although normally surreptitiously installed, in some cases backdoors are deliberate and widely known, and may have somewhat legitimate uses such as the manufacturer having a way to deal with users losing passwords.

I question the legitimacy of any use of a backdoor to assist with users forgetting passwords, as there are better ways to do such things. But make no mistake, what the FBI is asking for here is by definition a backdoor, in an effort to help them “remember” a password they have “forgotten” (read: never knew to begin with).

Anonymous Coward says:

Re: Re: Re: Re:

Nothing in the FBI request makes a back door by this definition. The normal authentication on the phone would still be in place; only the artificial limits on the number of tries and the speed of tries would be disabled.

Bull. Re-try limiting is part of the normal authentication process on the phone.

So you can repeat “backdoor” a million times, but it’s just not what is being sought here.

You can lie and deny it all you want, but a backdoor is exactly what’s being sought here.

Anonymous Coward says:

Re: Re: Re: Re:

“A backdoor is a method, often secret, of bypassing normal authentication in a product, computer system, cryptosystem or algorithm etc. Backdoors are often used for securing unauthorized remote access to a computer, or obtaining access to plaintext in cryptographic systems.”

Nothing in the FBI request makes a back door by this definition. The normal authentication on the phone would still be in place; only the artificial limits on the number of tries and the speed of tries would be disabled.

In other words, BYPASSED. Unless your definition of “disabled” means “working as designed.”

Whatever (profile) says:

Re: Re: Re:2 Re:

“In other words, BYPASSED. Unless your definition of “disabled” means “working as designed.””

The device is still secured and locked after these changes are made, which means there is no bypass.

Yes, they are lowering certain security “features”, but those features are not the actual pincode lock system, which would remain entirely intact.

Anonymous Coward says:

Re: Re: Re:3 Re:

The device is still secured and locked after these changes are made, which means there is no bypass.

Ah, I see – so it’s your definition of “secured” that’s all goofed up.

Yes, they are lowering certain security “features”, but those features are not the actual pincode lock system, which would remain entirely intact.

You’re assuming that the pincode, in and of itself, is the security.

It isn’t.
Wanna know why?
Because the ability to count from 0000-9999 is not security.
Apple knows this, as does the FBI.

WDS (profile) says:

Re: Re:

My definition of a “backdoor” is a way around the built-in security features (i.e. the front door). Obviously the definition you are using is the one the government is using when they say they are not asking for a back door.

I’m willing to admit that you could say they are asking for a backdoor with a less secure lock, rather than asking for an unlocked backdoor.

Ninja (profile) says:

Re: Re:

Ah, the bullshit again. If you are leaving a weak spot in the back wall, it is a backdoor that isn’t opened with a key but with a hammer. Let’s call it the Hammer Key ™ and make it shaped like a hammer. Maybe then it will get through your thick skull.

A backdoor would assure full access. There is no assurance here.

Yes there is. Given the hammer is just powerful enough.

Whatever (profile) says:

Re: Re: Re:

“Yes there is. Given the hammer is just powerful enough”

Incorrect. If the pincode was 10 or 20 or 50 digits, do you think there would be a powerful enough hammer? Remember, you are limited by the phone itself and its ability to process authentication requests.

Had Apple required a 10 digit pincode, this would be a moot discussion, as it would take something like 30 years to get in (assuming a 6 digit code is a single day, as Mike has claimed: 7 digits would be 10 days, 8 digits would be 100 days, 9 digits 1,000 days, and 10 digits 10,000 days… 27-plus years. Even allowing for the law of averages, it would be almost 14 years on average to access the device).
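
That scaling is easy to check with a few lines of Python; the one-day baseline for a 6 digit code is Mike’s figure carried over as an assumption, not a measured number.

# Brute-force time versus pincode length, assuming (per Mike's claim) that
# exhausting a 6-digit keyspace takes roughly one day; each extra digit
# multiplies the keyspace, and therefore the time, by 10.
BASE_DIGITS, BASE_DAYS = 6, 1.0

for digits in range(6, 11):
    worst_days = BASE_DAYS * 10 ** (digits - BASE_DIGITS)
    avg_days = worst_days / 2  # on average the right code turns up halfway through
    print(f"{digits} digits: worst {worst_days:,.0f} days, "
          f"average {avg_days:,.0f} days (~{avg_days / 365:.1f} years)")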

Apple has weak security, saved only by a couple of programming barriers against brute force tries. The weak spot is the entire wall.

Ninja (profile) says:

Re: Re: Re: Re:

Incorrect. If the pincode was 10 or 20 or 50 digits, do you think there would be a powerful enough hammer?

People wouldn’t use it. They use it because humans are fallible and can’t remember that much. Even 8 digits may be a little bit too much for most people. Remember we are not talking about the criminal but the freaking 99.9% that are not criminals and will be exposed to the Government if such a precedent is set.

Apple has weak security, saved only by a couple of programming barriers against brute force tries. The weak spot is the entire wall.

No, it does not. The fact that there are multiple layers to prevent multiple means of trying to get access does not mean their security is weak. Rather it means they went the extra mile to help make the wall as sturdy as possible. And weakening it is creating a backdoor, or a spot where one can be forced (which is basically the same). The user may have chosen to use a less complex key, but this is not Apple’s weakness. If by any means they can actually do what is being asked (possibly with enough financial resources they can), it does not mean they should. Which is the entire point of the outrage against the order.

Anonymous Coward says:

Re: Re: Re:2 Re:

They use it because humans are fallible and can’t remember that much.

Can you remember seven-character passwords for three different accounts? Can you remember three different seven-character passwords? Using a modified base64 alphabet?

If you can memorize three passwords for three accounts, then you can string three passwords together for one account. I use colons as separators. Here’s an example.

VPC80UG:FvbF12W:_PhFiXE

Generated by:

head -c 16 /dev/urandom | base64 -w 7 | sed -ne '1 h; $ { g; y,\n+/,:-_,; p; q }; 1! H'
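
For anyone who doesn’t want to decode the sed, here is a rough Python equivalent of what that one-liner appears to do; it is an approximation for illustration, not a byte-for-byte reproduction. The idea: take 16 random bytes, base64 them, swap + and / for - and _, and join the first three 7-character chunks with colons.

import base64, os

def chunked_passphrase(chunks=3, chunk_len=7):
    # 16 random bytes -> 24 base64 characters; keep the first three 7-char blocks.
    raw = base64.b64encode(os.urandom(16)).decode("ascii")
    raw = raw.translate(str.maketrans("+/", "-_"))  # modified (URL-safe style) alphabet
    return ":".join(raw[i * chunk_len:(i + 1) * chunk_len] for i in range(chunks))

print(chunked_passphrase())  # e.g. VPC80UG:FvbF12W:_PhFiXE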

Anonymous Coward says:

Re: Re: Re:3 Re:

“Can you remember seven-character passwords for three different accounts? Can you remember three different seven-character passwords? Using a modified base64 alphabet?”

As a network administrator, I can say this is an ideal but not realistic. The average staff member would be calling daily if password requirements had been set at that standard.

Whatever (profile) says:

Re: Re: Re:2 Re:

“People wouldn’t use it. They use it because humans are fallible and can’t remember that much. Even 8 digits may be a little bit too much for most people. Remember we are not talking about the criminal but the freaking 99.9% that are not criminals and will be exposed to the Government if such a precedent is set.”

Yet oddly, we use those same devices to make phone calls to numbers ranging from 7 to 10 digits all the time. Humans are remarkably good at remembering groups of three and four digits, and even longer ones (such as credit card numbers or membership card numbers for things we use frequently). So if we can remember areacode threedigits fourdigits then yes, you can remember threedigits threedigits threedigits for a 9 digit pincode.

” The fact that there are multiple layers to prevent multiple means of trying to get access does not mean their security is weak.”

It most surely is a clear indication that the core security is weak. The “door” is weak enough that they need to build walls over the top of it to keep people from being able to get to the door, because they know it’s a weak point.

“The user may have chosen to use a less complex key, but this is not Apple’s weakness.”

It’s a huge weakness to allow and not discourage overly short pincodes. Apple already upped the minimum from 4 to 6 digits to unlock the phone, knowing that 4 was just too simple. Had they forced 8, as an example, the brute force question would be all but moot (100 days on Mike’s base case of 1 day to hack 6 digits, more than a year on a more reasonable scale).

Apple’s legal action is a clear indication that what is being asked is not only possible, but potentially has already been done, or a subset of it has. If they didn’t think it possible, they could just shrug, bill the government for millions of hours, and after a long enough period say “see, we told you!”. The aggressive nature of Apple’s response here tells me that the court order has hit a major weak spot, and Apple is very concerned that someone has let out a really bad secret.

Ninja (profile) says:

Re: Re: Re:3 Re:

So if we can remember areacode threedigits fourdigits then yes, you can remember threedigits threedigits threedigits for a 9 digit pincode.

Multiply that 9 digit code by 87. That was the last count of passwords I use (the average person may use a bit fewer). I’ve checked, and I absolutely must remember at least 7 different passwords that cannot be managed by services like LastPass. So it’s not that simple. Some of us go the extra mile into paranoia, but we are the minority and it’s very understandable why.

It most surely is a clear indication that the core security is weak. The “door” is weak enough that they need to build walls over the top of it to keep people from being able to get to the door, because they know it’s a weak point.

No, it is not. They don’t build walls over the top; it’s still a door, but it is reinforced. The reinforcements are part of it being a strong door. You don’t put on an extra lock because your door is weak, but rather to make it harder in the event some criminal can open one of the locks.

It’s a huge weakness to allow and not discourage overly short pincodes.

Yes, which is why they made the 6 digit change which seems reasonable and not too long (remember credit card pins are 6 digits). Still, technology can make this moot quite fast which is why you add enforced delays and other layers to avoid brute forcing. I do agree that it is a best practice to have longer passwords but it is NOT practical for daily use. I use an alphanumeric password on my mobile devices that’s a bit longer than that and it is inconvenient for daily use. If this can be solved by adding such layers against brute force then it is a GOOD thing. It provides stronger protection with shorter passwords. It’s not a weakness, it’s a strength.

If they didn’t think it possible, they could just shrug, bill the government for millions of hours, and after a long enough period say “see, we told you!”.

Which would still be bad. The problem is that the Government asked and it was granted, not if it is feasible or not. And you ignore the fact that a lot of other companies that even compete with Apple are supporting their fight. It seems you are unable to grasp what this is about. It’s useless to argue with dumb.

Whatever (profile) says:

Re: Re: Re:4 Re:

If you are really concerned and want safety, you require only a 4 digit pin, 5 chances, and a 1 hour delay between each try. It will be SO secure! The delay and the limited chances are not part of the security, they are the “security by obscurity” method that Techdirt has picked apart in the past.

“Which would still be bad. The problem is that the Government asked and it was granted, not if it is feasible or not. And you ignore the fact that a lot of other companies that even compete with Apple are supporting their fight. It seems you are unable to grasp what this is about. It’s useless to argue with dumb.”

I understand it very well. They all suffer from the same problem, which is claiming to have amazing security that in practice comes down to a short pin code that defeats the purpose. None of them (and I mean not a single one) is going to want to come out and say “our encryption technology is totally amazing but fails because you are an idiot who can’t remember more than 4 digits for a pincode”. They don’t want to have to admit that your personal fort knox of information is protected by the cheapest of padlocks.

I understand exactly what it’s about, and it’s not what is in their carefully crafted narrative.

“Multiply that 9 digit code by 87. That was the last count of passwords I use “

I would say you probably need to get a life, but you probably password protected it and can’t get it out anymore. 🙂 Seriously though, out of those 87 passwords, how many of them are absolutely key, that you enter 100 times a day? I can remember my social security, my bank card numbers, my credit card numbers, my government ID numbers, my passport number, and a whole bunch of other things that I only use occasionally. I have 6 digit pincodes for half a dozen or more bank cards. Those are all things I don’t use every hour of every day. A 9 digit pincode for your phone (330330331 example) would easily be remembered by almost everyone because they would use it all of the time.

For what it’s worth, if you have 87 passwords to deal with, I am hoping they are all longer than 4 to 6 digits, otherwise your “security” is all in your mind!

Anonymous Coward says:

Re: Re: Re:5 Re:

they are the “security by obscurity” method that Techdirt has picked apart in the past.

There is nothing obscure about limiting tries and delaying retries; indeed it is a standard technique whenever passwords and pins are used. If it were not a successful method, plastic money would require you to create and remember a long pin, which would reduce the security of the system because people would keep written copies of the pin with their cards.
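
A minimal sketch of that standard technique, with made-up thresholds (the free-try count, the backoff schedule and the lockout limit are invented for illustration, not taken from any particular product):

import time

# Illustrative retry throttling: a few free tries, then exponentially growing
# delays, then lockout. All thresholds here are invented for the example.
FREE_TRIES = 3
MAX_TRIES = 10

def verify_with_throttle(guesses, correct_pin):
    for attempt, guess in enumerate(guesses, start=1):
        if attempt > MAX_TRIES:
            return "locked out"                      # analogous to a wipe/lockout step
        if attempt > FREE_TRIES:
            time.sleep(2 ** (attempt - FREE_TRIES))  # 2, 4, 8, ... second delays
        if guess == correct_pin:
            return f"unlocked on attempt {attempt}"
    return "gave up"

# A brute-force attacker walking 0000, 0001, ... runs straight into the delays:
print(verify_with_throttle((f"{i:04d}" for i in range(10000)), correct_pin="0004"))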

Ninja (profile) says:

Re: Re: Re:5 Re:

One hour is not needed. A few seconds are enough to make brute force virtually impossible even with shorter passwords. You ignore that such a method is used in a wide array of security devices. Electronic locks used by the military itself use this system to prevent brute force. Go tell them this doesn’t work. But please record the interaction, I wanna laugh in your face.

And no, you don’t understand shit. Or you are being an ass (or a bit of ignorance and a lot of ass). Shorter pins do not defeat the purpose. They are practical. To allow practical pins with more security, there are the delay and the limit on tries. It is that simple.

They don’t want to have to admit that your personal fort knox of information is protected by the cheapest of padlocks.

I’ve said it before, it’s useless to argue with you. You are a total moron. But it’s worth pointing out to others how wrong you are. You are wrong. The padlock is neither cheap nor weak. It is made in a fashion such that you don’t need to use alien keys that wouldn’t fit in your pockets without giving up security. It is as simple as that.

I would say you probably need to get a life, but you probably password protected it and can’t get it out anymore.

Oh no, I’m just security conscious and don’t re-use passwords like the majority of the population. Nowadays you have to sign up for accounts to do even the most trivial tasks (I was forced to get a Microsoft account to download a goddamn copy of an Office disk that stopped working for me – as a side note I should have downloaded it from The Pirate Bay). Now you could listen to your own advice and get a life, but I think you threw your brain out the window at some point.

Seriously though, out of those 87 passwords, how many of them are absolutely key, that you enter 100 times a day?

I told you in my reply. That you fail at reading comprehension is not my fault. But it doesn’t matter; even if you only have to remember a few long passwords, it may prove to be a challenge. My company enforces 12+ character passwords that change every 3 months. Cool security practice, but most employees end up using variations of easy passwords in order to remember them, so it kind of defeats the purpose (then again, if they added a maximum number of attempts they could make the minimum length smaller and easier to remember, see, see???).

I can remember my social security, my bank card numbers, my credit card numbers, my government ID numbers, my passport number, and a whole bunch of other things that I only use occasionally.

Good for you. I can’t remember half of it. But I can remember the detailed specs of my computer down to the DDR timings, and my father remembers every single birthday or otherwise important date ever (I fail hard at that). Got my point? I suspect not.

A 9 digit pincode for your phone (330330331 example) would easily be remembered by almost everyone because they would use it all of the time.

True. But would it be practical? It isn’t, actually. So it’s awesome that Apple made shorter passwords more secure with a quite simple solution.

For what it’s worth, if you have 87 passwords to deal with, I am hoping they are all longer than 4 to 6 digits, otherwise your “security” is all in your mind!

Some sites don’t accept too much complexity (special characters or even capital letters). In fact I do have 6 digit numeric pins as passwords because the site operators are morons and won’t allow more than that. I try to change those passwords frequently when I must use the site. But you see, I don’t have to type most of them by hand, and that’s why I can use such complexity. It would not be practical in the real world. My mobile password is fairly short compared to those, because a long password is an incredible hassle, as I found out the hard way. And even so it is still not practical, because I mixed all types of characters. Android does have the same system Apple put in place (the delays), but I chose not to rely on this feature, sacrificing the practical aspect. I do have the option of using a shorter, numbers-only password, but paranoia is a problem. It would still be safe.

JonK (profile) says:

Re: Re: Re:6 People who Subsist

I am also required to memorize continuously changing complex (15 characters, mixed alpha, numeric, caps, special characters) passwords, and most are limited to 3 time-spaced tries. Our security department HATES the company year-end two week shutdown, because of the number of login failures (3 tries and you’re out!).

Typical large companies “settle” with the IRS three or more years after taxes are due. They aren’t following this Apple vs. FBI courtship, but they would be having hissy-fits if they were. Direct Government access into anybody’s database (in this case an iPhone) affords the government the ability to short-circuit any investigation by interpreting what you meant in a quick email as what they want you to have meant. In this example, going after the meaning of data of unrepresented dead killers.

That One Guy (profile) says:

Re: Re:

I like how you keep latching on to the word ‘backdoor’, as though if you simply call it something else it suddenly isn’t a problem.

Get rid of the term, call it whatever you want, and you’re left with the following question:

Is the FBI/DOJ ‘asking’ Apple to create a modified version of their OS, with the express purpose of removing/bypassing security features designed to protect the data on the device by making it effectively impossible to brute-force the device?

Yes or No?

If ‘Yes’, do you believe that should the FBI/DOJ succeed in their case, that it will create a precedent that it is now legal to force companies to undermine their own encryption, granting access that would otherwise not be possible?

Yes or No?

If No, why do you believe that the precedent from this case would not be used in other situations?

Semi-related, but do you believe that it should be legal for companies to implement or create encryption or other security systems that they themselves cannot bypass or defeat, such that no legal order would be able to compel them to hand over the information contained in the account/service it is stored on, as it would not be possible for them to comply?

Yes or No?

Whatever (profile) says:

Re: Re: Re:

“Is the FBI/DOJ ‘asking’ Apple to create a modified version of their OS, with the express purpose of removing/bypassing security features designed to protect the data on the device by making it effectively impossible to brute-force the device?

Yes or No?”

Yes, but it is not the encryption, nor does it magically permit access to the data by itself, nor does it provide a “golden key” to the phone.

“If ‘Yes’, do you believe that should the FBI/DOJ succeed in their case, that it will create a precedent that it is now legal to force companies to undermine their own encryption, granting access that would otherwise not be possible?

Yes or No?”

NO, ABSOLUTELY NOT. Nobody is asking Apple to break their own encryption. If Apple’s encryption was strong enough (and not basically attached to a simple pincode) then we wouldn’t have this discussion at all. But for two simple programming tricks (limit count and time delay), Apple’s entire protection scheme would be very very weak.

“If No, why do you believe that the precedent from this case would not be used in other situations?”

I think that it would be used in appropriate situations where weak encryption systems are in place which can generally be accessed through simple methods. I suspect it will be rendered moot by Apple rolling out a more significant / more secure system that will make brute force hacking, no matter the delays or try counters, meaningless. The ruling has value only as long as companies like Apple make weak security (and try to pass it off as super strong).

“Semi-related, but do you believe that it should be legal for companies to implement or create encryption or other security systems that they themselves cannot bypass or defeat,”

Yes I do. But I think that almost any system that has a human interaction point (passcode) is likely to be susceptible to brute force methods because the public generally wants to use the smallest / shortest password possible to make things easy to live with.

On that basis, unless they enforce good and long passwords, they are very likely to ALWAYS have a system that can be compromised. The real “fix” will be there, and nowhere. Until then, I suspect the FBI will use the results of this case as an effective tool for accessing locked phones IN THEIR POSSESSION AS PART OF A CRIMINAL CASE (uppercase to make a significant point) and not as some sort of random back door to read your email during a traffic stop.

Ninja (profile) says:

Re: Re: Re: Re:

Yes, but it is not the encryption, nor does it magically permit access to the data by itself, nor does it provide a “golden key” to the phone.

Dude. The security measures against brute force are a goddamn part of their encryption system. Stripping them (if they can) IS UNDERMINING THEIR SECURITY SYSTEM. It may not be a full door but it is a path inside. Call it a door, a path, a unicorn or a golden key; it’s the same.

NO, ABSOLUTELY NOT. Nobody is asking Apple to break their own encryption. If Apple’s encryption was strong enough (and not basically attached to a simple pincode) then we wouldn’t have this discussion at all. But for two simple programming tricks (limit count and time delay), Apple’s entire protection scheme would be very very weak.

It is asking them to weaken their system, which may not be breaking it altogether but is almost that. Security systems are made of a combination of smaller components. There’s an elliptic curve, for instance, but it alone does not account for the entirety. There are other components, and some of them happen to be designed to prevent brute forcing. Just pick Gmail, Facebook or whoever and try entering many wrong passwords. They will react, because this is part of their security system. Can you see the implications? The more components you add, the more secure it is.

I think that it would be used in appropriate situations where weak encryption systems are in place which can generally be accessed through simple methods.

Simple methods? Writing a modified system, building a copy of the physical hardware down to the almost atomic structure, etc., knowing this may not work because Apple deletes the hardware keys: is that simple? And who says what situations are appropriate? There are PLENTY of examples where the Government is abusing precedents like there was no tomorrow. I know you think there are only Saints in the current Government, but think ahead. Countries don’t become dictatorships overnight.

Yes I do. But I think that almost any system that has a human interaction point (passcode) is likely to be susceptible to brute force methods because the public generally wants to use the smallest / shortest password possible to make things easy to live with.

Simple: enforce a delay in your algorithm. It does not mean the system will be weak or that the Govt may force them to magically remove that delay.

On that basis, unless they enforce good and long passwords, they are very likely to ALWAYS have a system that can be compromised.

If there is an enforced, long enough and unstoppable delay between attempts, then even smaller passwords would be secure enough. Are you saying people should be forced to use longer passwords because you and the FBI say so? Sorry, but that’s very despotic of you.

IN THEIR POSSESSION AS PART OF A CRIMINAL CASE

And? You are focusing on this specific case, the question is much broader. You fail.

Whatever (profile) says:

Re: Re: Re:2 Re:

“And? You are focusing on this specific case, the question is much broader. You fail.”

Nope, that’s called “narrative building”, where Apple and others are trying to make you worried about your personal privacy by concocting nightmare scenarios that just ain’t real. This isn’t randomly selecting phones for scanning or remote exploits, it’s making it reasonable to unlock a phone’s pincode when it’s already in police possession, and a warrant is issued by the court to open it.

The nightmare stories being told are whole cloth material. Don’t fall for it.

“If there is an enforced, long enough and unstoppable delay between attempts, then even smaller passwords would be secure enough. Are you saying people should be forced to use longer passwords because you and the FBI say so? Sorry, but that’s very despotic of you.”

No, rather that the method chosen by Apple to be secure is the type of thing that could be hacked and thus put the device at risk. Short passwords are the “cheap lock”, the proverbial weakest link. Fixing the weakest link rather than trying to hide its existence is always a better choice. Apple phones right now are ripe for a hack, but 8 or 9 character pin codes would all but negate that – without having to worry about delays or other OS based protections that can be changed in firmware.

“Countries don’t become dictatorships overnight.”

And you haven’t become a paranoid person overnight either. It takes a whole lot of reading and re-reading the Alex Jones-like banter going on to think that the government really gives a crap about your specific personal phone.

Ninja (profile) says:

Re: Re: Re:3 Re:

where Apple and others are trying to make you worried about your personal privacy by concocting nightmare scenarios that just ain’t real

They may not be real now. And I know you are a moron, but it’s very easy to find examples of Govt abuse of things that were not supposed to be a problem at the time. The very law that the FBI/DOJ are relying on is a shining example.

This isn’t randomly selecting phones for scanning or remote exploits

THIS one. What about the rest? As repeatedly said before (and ignored by you) there’s plenty of examples of the Govt being an ass and generally ignoring civil rights.

it’s making it reasonable to unlock a phone’s pincode when it’s already in police possession, and a warrant is issued by the court to open it.

This shows how confused you are. Apple isn’t even being asked to unlock the phone. And as repeatedly said before, it is not reasonable. You keep focusing on the goddamn crime. If we follow your logic it’s ok for law enforcement to do whatever they want, because CRIME!

The nightmare stories being told are whole cloth material. Don’t fall for it.

That’s what we were told back in 2001. That no Constitutional Rights would be eroded. I don’t know how old you are but I’m sure you fell for it and probably still think it’s all ok even with the wealth of evidence proving this wrong.

No, rather that the method chosen by Apple to be secure is the type of thing that could be hacked and thus put the device at risk.

And you are painfully wrong. It cannot be hacked unless Apple tampers with the security system directly (if they can, of course; we don’t know yet).

Short passwords are the “cheap lock”

No, they are cheap keys. But you can use cheap keys if you prevent others from getting a copy of it.

without having to worry about delays or other OS based protections that can be changed in firmware.

That would be the weak link: if it can be deactivated at Apple’s will then it doesn’t matter how long the pin is because at some point technology will be fast enough to brute force it.

And you haven’t become a paranoid person overnight either. It takes a whole lot of reading and re-reading the Alex Jones-like banter going on to think that the government really gives a crap about your specific personal phone.

I don’t know who this guy is. And besides you only have to read history books (up to the very recent history) to see that MAYBE this Government gives a crap (HAHAHAHAHAHA) but what about the next? It’s not like the intelligence services are collecting everything you do electronically and keeping it against the Constitution. No, they’d never do it, right?

JonK (profile) says:

Re: Re: Re:3 Whatever

DILLWAD…DILLWAD…DILLWAD

I live the OPM nightmare. Having held a security clearance prior to the break in, OPM “no longer has any data” on me.

“…Don’t fall for it”?: our government can’t protect any data. …and you want to give them access to what? Go live in a glass house with no curtains, and post your life on poster boards in your front yard. Though that’s the access asked for, know that it is illegal for you to provide it, “We must protect the children” from you.

That One Guy (profile) says:

Re: Re: Re: Re:

Yes, but it is not the encryption, nor does it magically permit access to the data by itself, nor does it provide a “golden key” to the phone.

They are being ‘asked’ to disable the security protecting the device; the difference between that and a ‘golden key’ is effectively nothing in practice. A ‘golden key’ allows them to have access any time they feel like it, bypassing any protections on the device/system. This does the same thing, it just requires that they go to the company and demand that they remove the security protecting the device first.

NO, ABSOLUTELY NOT. Nobody is asking Apple to break their own encryption. If Apple’s encryption was strong enough (and not basically attached to a simple pincode) then we wouldn’t have this discussion at all. But for two simple programming tricks (limit count and time delay), Apple’s entire protection scheme would be very very weak.

That’s like saying that without a lock any door is easy to open.

You keep treating the two systems, pin-code and delay/device wipe as though only the first was part of the security, when both of them are. It’s kinda like saying that the only security a bank really has is the bank vault door, so taking away all the guards and disconnecting the electronic surveillance wouldn’t be negatively impacting the security.

Yes, the pin-code on its own would be weak, given there are devices that are specifically designed to run through all the possible numbers as fast as possible; that’s why the other features were added. It doesn’t matter if someone can simply run through the numbers until they get the right one, if they only have ten tries to do so, with increasingly large delays between attempts.

With the ability to brute-force a password with minimal effort, the pin effectively isn’t the security protecting the device; the delay/wipe functions are, and those are what the FBI/DOJ, through the courts, are ordering removed, via, and I’m repeating this yet again because it’s so freakin’ important, custom code that Apple is being forced to create to undermine their own security features.

A company is being ‘asked’ not to hand over something they already have, but to create something with the express purpose of removing security features that they implemented in the first place. How do you not see that as a problem, and how is it that you don’t think that if the FBI/DOJ wins this they won’t demand that it be done by other companies?

I think that it would be used in appropriate situations where weak encryption systems are in place which can generally be accessed through simple methods. I suspect it will be rendered moot by Apple rolling out a more significant / more secure system that will make brute force hacking, no matter the delays or try counters, meaningless. The ruling has value only as long as companies like Apple make weak security (and try to pass it off as super strong).

That ‘weak’ security is in this case more than enough to stop the FBI/DOJ cold. That doesn’t strike me as very ‘weak’, but ultimately it doesn’t matter how strong it is if a court can simply order it removed to allow access to what it was protecting.

Yes I do. But I think that almost any system that has a human interaction point (passcode) is likely to be susceptible to brute force methods because the public generally wants to use the smallest / shortest password possible to make things easy to live with.

Well yes, if you want the general public to use security you need to make it easy enough that the majority will be willing to use it. That part of security is always going to be the weakest, what companies can do to make up for that weakness is what Apple has done here, making it so that someone can’t just keep guessing until they get the right combination.

If a company is dealing with a small, specialized group that is willing to jump through extra hoops to keep their stuff secure, they can make said security more difficult to get around at the user level, but a company that is trying to sell to the general public basically has two options, no security or ‘weak’ security at the user level, and compensate for that weakness.

On that basis, unless they enforce good and long passwords, they are very likely to ALWAYS have a system that can be compromised. The real “fix” will be there, and nowhere.

Right, which would you say is more secure:

A longer password that can be guessed an infinite amount of times.

Or a shorter password that can be guessed ten times before the content it’s protecting is lost.

A longer password on its own is more secure than a shorter one, no argument there, but it’s possible to make the difference moot by adding additional security, such that what makes the longer password stronger (more required guesses) isn’t a factor. It doesn’t matter if a longer password took one thousand guesses to get right and a shorter password only took one hundred if you only have ten guesses available; at that point it’s simply a matter of luck and hoping the one who set the password was a lazy idiot and went with all zeros.
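
To put a number on that, here is a tiny sketch assuming the attacker gets exactly ten guesses against a uniformly random numeric code; both assumptions are idealizations, since real people don’t pick codes uniformly at random.

# Chance of hitting a uniformly random numeric code within a fixed guess budget.
# The 10-guess budget mirrors a wipe-after-ten-failures policy; purely illustrative.

def success_probability(digits, guesses=10):
    keyspace = 10 ** digits
    return min(guesses, keyspace) / keyspace

for digits in (4, 6, 8):
    print(f"{digits}-digit code, 10 guesses: {success_probability(digits):.6%} chance of getting in")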

Until then, I suspect the FBI will use the results of this case as an effective tool for accessing locked phones IN THEIR POSSESSION AS PART OF A CRIMINAL CASE (uppercase to make a significant point) and not as some sort of random back door to read your email during a traffic stop.

And again you miss what people are objecting to. It’s not a matter of ‘police might check our phones at a whim’ (that’s what passwords in general are for); it’s the idea of forcing a company to create custom code that allows someone else, in this case the FBI, to bypass security on a device.

Anonymous Coward says:

Re: Re: Re:5 Re:

Whatever in a nutshell, really. He thinks he’s so above everyone else, nobody should be allowed to complain when the government wants in on their data, because everyone else is a low-life criminal and/or pirate. Won’t happen to him, because he’s apparently a world-class model citizen.

If Google or Apple or Facebook wants a little of that information, though, watch him froth at the mouth and lose his shit. Playing with anonymity, the same anonymity that he demands to be stripped from everyone else, is his only advantage. He knows that any attempt to establish his credibility would instantly ruin it. How else could you argue for things like reduced police oversight?

Trails (profile) says:

Re: Re:

The phone will still be secure with those patches applied

Bullshit. Password attempt limits and delays are security measures meant to counter brute-force attacks. Removing them makes the phone less secure.

To demonstrate your belief in your own nonsense, I’m sure you’d apply this patch to your phone should it ever become available, yes?

Dkone says:

Re: Re:

It is a shame Techdirt doesn’t have a button to ‘mark this post as stupid’.

If this tool to weaken security is created, it can be used on any phone with the same version of iOS. If you don’t understand what this tool would do, I will try to explain it in a simple manner.

It will allow the government unlimited access to rapidly input (i.e. electronically) random pass codes. This is known as a brute force attack. If the phone has a 4-digit code, it will be cracked in a trivial amount of time. Without knowing how fast the interface is between the phone and the device sending the codes it is hard to say how long, but it will be quick.
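
To put rough numbers on “quick” (the guess rates here are assumptions for illustration, not measurements of any real interface):

    # Worst-case time to walk the whole 4-digit keyspace at a few assumed rates.
    keyspace = 10 ** 4  # 10,000 possible 4-digit codes

    for guesses_per_second in (1, 12.5, 100):  # 12.5/s is roughly one try every 80 ms
        worst_case_minutes = keyspace / guesses_per_second / 60
        print(f"{guesses_per_second} guesses/s -> about {worst_case_minutes:.0f} minutes worst case")
    # Even at a single guess per second, the entire 4-digit space takes under three hours.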

Suffice it to say, this tool IS the back door.

Whatever (profile) says:

Re: Re: Re:

“It will allow the government unlimited access to rapidly input (i.e. electronically) random pass codes. This is known as a brute force attack”

DUH. No, they won’t put in random passwords; they will start at one end of the possible password list (000000) and move towards the other end (999999) a step at a time until they hit the code that unlocks it.

“If this tool to weaken security is created, it can be used on any phone with the same version of iOS.”

Since Apple uses version signing and other (quite effective) methods to block unauthorized changes to their phones (you know, the ones you “own”), it’s not something that can be applied to any random phone at any time. Moreover, it is of limited use to hackers and others, as it requires long-term physical access to the device in order to profit from it. It’s not a back door in the sense of providing access without fulfilling the normal security requirements (entering a valid pincode).

If it were a backdoor, you could put it on any iPhone and have instant access. You will NOT have access even if you apply this update. As a human entering one pincode every 5 seconds (because there is a limit on how fast you can accurately enter them), it’s going to take you a year or more of full-time (8 hours per day) work to brute force a device. A backdoor would give you this access directly.
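
For what it’s worth, the hand-entry arithmetic works out roughly like this, assuming a 6-digit code, one guess every 5 seconds, and 8-hour days (all assumptions for illustration):

    # Sanity check on the manual brute-force estimate.
    seconds_per_guess = 5
    hours_per_day = 8
    keyspace = 10 ** 6  # assuming a 6-digit passcode

    guesses_per_day = hours_per_day * 3600 // seconds_per_guess  # 5,760 guesses per working day
    worst_case_days = keyspace / guesses_per_day
    print(f"{guesses_per_day} guesses/day -> about {worst_case_days:.0f} working days worst case")
    # Roughly 170+ eight-hour days worst case (about half that on average):
    # many months of full-time typing to cover the whole keyspace.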

nasch (profile) says:

Re: Re: Re: Re:

No, they won’t put in random passwords; they will start at one end of the possible password list (000000) and move towards the other end (999999) a step at a time until they hit the code that unlocks it.

Often when someone says “random” it’s more accurate to substitute “arbitrary”. I think that is the case here. I don’t think he meant to suggest the FBI would use a pseudorandom number generator to generate passcodes.

Whatever (profile) says:

Re: Re: Re:2 Re:

“I think your issue is you insist on defining a backdoor as immediate access, and almost no one else does. You can argue with people forever if you redefine the terms to suit yourself.”

I think it’s very important, though. The mistake many are making here is assuming that Apple is being asked for a golden key, or to completely and instantly defeat all encryption and passwords. That’s just not the case. They will certainly make it easier to get into the individual phone in question, but they will not be magically rendering encryption or security moot on the hundreds of millions of other devices they have sold.

Anonymous Coward says:

Re: Re: Re:3 Re:

They will certainly make it easier to get into the individual phone in question, but they will not be magically rendering encryption or security moot on the hundreds of millions of other devices they have sold.

Rather, the FBI will be able to go to Apple and have Apple break the security, upon demand…for potentially hundreds of millions of devices.

So is it simple laziness by the FBI, then?

Wyrm (profile) says:

Re: Re: Re:3 Re:

“The mistake many are making here is assuming that Apple is being asked for a golden key, or to completely and instantly defeat all encryption and passwords. That’s just not the case.”
Isn’t it your mistake to assume we do? Particularly here on Techdirt, after a number of articles pointing out clearly that Apple is “only” being asked to remove the retry limit and delay, I think that’s quite a wrong assumption to make.

So calling it a backdoor might not suit your definition of it (no “instant access” here), but it does ours. And it does weaken security all around, even if YOU think this feature is just “bubble wrap” around the lock. (Must be titanium-grade bubble wrap, though, given how annoying it is to the FBI.)

You’re the only one here who doesn’t want to call it a backdoor, and you’re not getting your point across. At least you focused on this until TOG asked you very direct questions. (And it was nice of you to reply as straightforwardly as you did.)

Now if we could drop this nonsense about “backdoor/non-backdoor”, maybe we could move on to the actual debate.

Ninja (profile) says:

Re: Re: Re: Re:

Dude. If Apple can do this to this phone, they can do it to whichever phone. Are you that dumb? And law enforcement will have whatever time they need to physically rape the phone. Except that if there is a decent security system, it will take much longer, effectively making it unfeasible.

If it were a backdoor, you could put it on any iPhone and have instant access.

As opposed to using it on any iPhone and getting access in a few minutes to a few hours. Oh, the huge, unbearable difference. Do you really believe that if someone wants to get access to a phone they will type manually?

Seriously.

Whatever (profile) says:

Re: Re: Re:2 Re:

“If Apple can do this to this phone, they can do it to whichever phone”

Yes, but only if (a) the phone is in physical possession, because making this change to a remote phone wouldn’t do anything (the phone would still be secure and encrypted), and (b) Apple never makes any other changes to the security system of their phones.

I am starting to think that part of Apple’s legal objections here are to create a delay while they roll out an iOS update that will enforce stronger passwords or that will otherwise render the court’s order moot for any other devices except this one.

“As opposed to using it on any iPhone and getting access in a few minutes to a few hours.”

Not a few minutes, sorry – and unlikely even a few hours. There is a limit to how many pincode attempts the phone can process. While Mike optimistically claims less than a day on the current system, the reality is much more like a week to a month.

“Do you really believe that if someone wants to get access to a phone they will type manually?”

Nope, but again, there is a limit as to how many requests the phone itself can handle. You can’t go massively parallel (because you can’t duplicate the phone), so you are stuck dealing with a single device. It’s likely to be no more than 1 or 2 a second from what I can figure.

Ninja (profile) says:

Re: Re: Re:3 Re:

because making this change to a remote phone wouldn’t do anything (the phone would still be secure and encrypted)

No, it would be vulnerable to anyone who can get physical access to it. Get that into your head: the Government is not the only one that may get access. And more importantly, the Government cannot be trusted to always act within reasonable boundaries.

While Mike optimistically claims less than a day on the current system, the reality is much more like a week to a month.

And it doesn’t matter. They’d still have access. Remember, it doesn’t matter if China gets access in a few hours or a week. Once the content of a dissident is decrypted it’s all over. And that’s the point here. The criminals in question should be tried and punished yes. But them alone, not the entirety of the population.

It’s likely to be no more than 1 or 2 a second from what I can figure.

Actually it has been pointed out before. The actual hardware delay is around 80 milliseconds. And again, it doesn’t matter.

Whatever (profile) says:

Re: Re: Re:4 Re:

“Actually it has been pointed out before. The actual hardware delay is around 80 milliseconds. And again, it doesn’t matter.”

Still 22 hours… even at that high speed. Add a 7th digit and it’s roughly 220 hours; an 8th and it’s 2,200 hours; a 9th and it’s 22,000 hours… or two and a half years.
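
The arithmetic behind those figures, assuming the 80 ms per attempt mentioned above (rounded, so the results differ slightly from the numbers quoted):

    # Worst-case brute-force time at an assumed 80 ms per attempt.
    ATTEMPT_SECONDS = 0.08

    for digits in (6, 7, 8, 9):
        seconds = (10 ** digits) * ATTEMPT_SECONDS
        print(f"{digits} digits: about {seconds / 3600:,.0f} hours")
    # 6 digits: ~22 hours; 7: ~222; 8: ~2,222; 9: ~22,222 hours (roughly two and a half years).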

“And it doesn’t matter. They’d still have access. Remember, it doesn’t matter if China gets access in a few hours or a week. Once the content of a dissident is decrypted it’s all over. And that’s the point here. The criminals in question should be tried and punished yes. But them alone, not the entirety of the population.”

…and you don’t think China already has access? That’s another part of the deal here: Apple may not want to admit or expose that they already made a deal with the devil a long time ago, and all the FBI is asking for amounts to having the “China version OS” applied to an American phone.

Anonymous Coward says:

Re: Re:

You need to work on definitions of a backdoor.

Actually, it’s YOU who need to work on the definition.

They are seeking the very limited concept of modifying the OS on a single phone to make it so there is no 10 tries limit, as well as to remove an artificial processing delay of 5 seconds per try.

This is something that defeats the security inherent in the device. As I told you in the other post (where you also spewed out similar bullshit), per Wikipedia:

A backdoor is a method, often secret, of bypassing normal authentication in a product, computer system, cryptosystem or algorithm etc.

So, to make it clear to you, for the 5th time: the method they’re asking for bypasses the normal authentication method. Otherwise, they wouldn’t need code to do it. As such, it qualifies as a backdoor.

No amount of law-enforcement-induced-delusion spin will change that. Perhaps you’d also like to argue about what the definition of “is” is while you’re at it…

A backdoor would assure full access. There is no assurance here.

And you assume that there’s no assurance? You seriously want to argue that the FBI might not have the ability to count in sequence?

Whatever (profile) says:

Re: Re: Re:

“So, to make it clear to you, for the 5th time: the method they’re asking for bypasses the normal authentication method. Otherwise, they wouldn’t need code to do it. As such, it qualifies as a backdoor.”

Once again, it doesn’t bypass the normal authentication method; it only exposes that method without the other restrictions. The authentication method is “enter a pincode”, and nothing else. Everything else is a barrier to using the authentication system.

Remember, even with those changes applied the phone isn’t unlocked, because the authentication method is still in place and still intact.

Anonymous Coward says:

Re: Re: Re: Re:

Once again, it doesn’t bypass the normal authentication method

And once again, you’re wrong, and continuing to look like an ass.

Why not stick to things you’re better versed in, like complaining about starving artists and piracy?
Because frankly, your understanding of how security works just plain sucks.

Anonymous Coward says:

Re: Re: Re:2 A simple typo

And if you fuck up, bye-bye to what’s inside your phone. Not sure if it also bricks it.

It’s certainly a deterrent to stealing a phone, or at least to using it or seeing what’s inside.

Now maybe it’s hard, but computers get better. If I can crack the password of a mobile I stole in a day or two, I can do a lot of things with it, or with the data inside: like, for starters, selling it.

And considering that, as time passes, you are forced more and more to use mobiles and devices to do stuff (like e-wallets; wait until your bank starts making you pay for using ATMs, and they will), losing your mobile or getting it stolen will be more of an issue than it is now.

As if having it stolen weren’t enough, now you’ll have to worry about whether some kid who knows too much about computers has cracked it open or not.

Anonymous Coward says:

I would imagine one of the big hitters for Apple is what happened to Cisco after the Snowden releases. Cisco took a major hit on the global market because of the backdoors the NSA put in its routers while they were in transit to customers.

Should Apple do this of its own free will to assist, they can kiss their global market for cell phones goodbye, because the selling point of your data being your data goes out the window.

Before this is over, much of US technology will not be wanted by the rest of the world, because buyers can’t trust that what they put on their phones is really private.

Anonymous Coward says:

Every American tech company that has any sense at all is looking hard right now at how it is going to move offshore. Either that, or they are sitting around speculating on which foreign companies to try to buy.

Any not engaged in either of those activities is well on the way toward becoming the answer to tomorrow’s trivia question.

Ehud Gavron (profile) says:

First Amendment issue

I see a strong First Amendment issue here.

Apple is being ordered to use their private signing certificate to sign code that the government has specified to be written.

By forcing Apple to sign something, the government would be taking away Apple’s sole right to speak (or not speak) on its own behalf by virtue of signing (or not signing) code.

E

Ehud Gavron (profile) says:

Re: First Amendment issue

Looks like Apple’s counsel agrees with my comment from two days ago:

[quote]The code must contain a unique identifier “so that [it] would only load and execute on the SUBJECT DEVICE,” and it must be “‘signed’ cryptographically by Apple using its own proprietary encryption methods.” Ex Parte App. at 5, 7.

This amounts to compelled speech and viewpoint discrimination in violation of the First Amendment.[/quote]

Apple Inc’s motion to vacate order compelling…
p.43
