Law Enforcement Freaks Out Over Apple & Google's Decision To Encrypt Phone Info By Default

from the we're-all-gonna-die dept

Last week, we noted that it was good news to see both Apple and Google highlight plans to encrypt certain phone information by default on new versions of their mobile operating systems, making that information no longer obtainable by those companies and, by extension, governments and law enforcement showing up with warrants and court orders. Having giant tech companies competing on how well they protect your privacy? That’s new… and awesome. Except, of course, if you’re law enforcement. In those cases, these announcements are apparently cause for a general freakout about how we’re all going to die. From the Wall Street Journal:

One Justice Department official said that if the new systems work as advertised, they will make it harder, if not impossible, to solve some cases. Another said the companies have promised customers “the equivalent of a house that can’t be searched, or a car trunk that could never be opened.”

Andrew Weissmann, a former Federal Bureau of Investigation general counsel, called Apple’s announcement outrageous, because even a judge’s decision that there is probable cause to suspect a crime has been committed won’t get Apple to help retrieve potential evidence. Apple is “announcing to criminals, ‘use this,’ ” he said. “You could have people who are defrauded, threatened, or even at the extreme, terrorists using it.”

The level of privacy described by Apple and Google is “wonderful until it’s your kid who is kidnapped and being abused, and because of the technology, we can’t get to them,” said Ronald Hosko, who left the FBI earlier this year as the head of its criminal-investigations division. “Who’s going to get lost because of this, and we’re not going to crack the case?”

That Hosko guy apparently gets around. Here he is freaking out in the Washington Post as well:

Ronald T. Hosko, the former head of the FBI’s criminal investigative division, called the move by Apple “problematic,” saying it will contribute to the steady decrease of law enforcement’s ability to collect key evidence to solve crimes and prevent them. The agency long has publicly worried about the “going dark” problem, in which the rising use of encryption across a range of services has undermined government’s ability to conduct surveillance, even when it is legally authorized.

“Our ability to act on data that does exist ... is critical to our success,” Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.

Think of the children! And the children killed by terrorists! And just be afraid! Of course, this is the usual refrain any time there’s more privacy added to products, or when laws are changed to better protect privacy. And it’s almost always bogus. I’m reminded of all the fretting and worries by law enforcement types about how “free WiFi” and Tor would mean that criminals could get away with all sorts of stuff. Except, as we’ve seen, good old fashioned police/detective work can still let them track down criminals. The information on the phone is not the only evidence, and criminals almost always leave other trails of information.

No one has any proactive obligation to make life easier for law enforcement.

Orin Kerr, who regularly writes on privacy, technology and “cybercrime” issues, announced that he was troubled by this move, though he later downgraded his concerns to “more information needed.” His initial argument was that since the only thing these moves appeared to do was keep out law enforcement, he couldn’t see how it was helpful:

If I understand how it works, the only time the new design matters is when the government has a search warrant, signed by a judge, based on a finding of probable cause. Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. Under the new operating system, that warrant is a nullity. It’s just a nice piece of paper with a judge’s signature. Because Apple demands a warrant to decrypt a phone when it is capable of doing so, the only time Apple’s inability to do that makes a difference is when the government has a valid warrant. The policy switch doesn’t stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.

Apple’s design change is one it is legally authorized to make, to be clear. Apple can’t intentionally obstruct justice in a specific case, but it is generally up to Apple to design its operating system as it pleases. So it’s lawful on Apple’s part. But here’s the question to consider: How is the public interest served by a policy that only thwarts lawful search warrants?

His “downgraded” concern comes after many people pointed out that by leaving backdoors in its technology, Apple (and others) are also leaving open security vulnerabilities for others to exploit. He says he was under the impression that the backdoors required physical access to the phones in question, but if there were remote capabilities, perhaps Apple’s move is more reasonable.

Perhaps the best response (which covers everything I was going to say before I spotted this) comes from Mark Draughn, who details “the dangerous thinking” by those like Kerr who are concerned about this. He covers the issue above about how any vulnerability left by Apple or Google is a vulnerability open to being exploited, but then makes a further (and more important) point: this isn’t about them, it’s about us and protecting our privacy:

You know what? I don’t give a damn what Apple thinks. Or their general counsel. The data stored on my phone isn’t encrypted because Apple wants it encrypted. It’s encrypted because I want it encrypted. I chose this phone, and I chose to use an operating system that encrypts my data. The reason Apple can’t decrypt my data is because I installed an operating system that doesn’t allow them to.

I’m writing this post on a couple of my computers that run versions of Microsoft Windows. Unsurprisingly, Apple can’t decrypt the data on these computers either. That this operating system software is from Microsoft rather than Apple is beside the point. The fact is that Apple can’t decrypt the data on these computers is because I’ve chosen to use software that doesn’t allow them to. The same would be true if I was posting from my iPhone. That Apple wrote the software doesn’t change my decision to encrypt.

Furthermore, he notes that nothing Apple and Google are doing now on phones is any different than tons of software for desktop/laptop computers:

I’ve been using the encryption features in Microsoft Windows for years, and Microsoft makes it very clear that if I lose the pass code for my data, not even Microsoft can recover it. I created the encryption key, which is only stored on my computer, and I created the password that protects the key, which is only stored in my brain. Anyone that needs data on my computer has to go through me. (Actually, the practical implementation of this system has a few cracks, so it’s not quite that secure, but I don’t think that affects my argument. Neither does the possibility that the NSA has secretly compromised the algorithm.)

Microsoft is not the only player in Windows encryption. Symantec offers various encryption products, and there are off-brand tools like DiskCryptor and TrueCrypt (if it ever really comes back to life). You could also switch to Linux, which has several distributions that include whole-disk encryption. You can also find software to encrypt individual documents and databases.
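
To make the split Draughn describes concrete, here is a minimal sketch in Python using the cryptography library. It is not Microsoft’s or Apple’s actual implementation, and every name and parameter in it is an illustrative assumption: a random data key encrypts the data, and only a key derived from the user’s passphrase can unwrap it.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def wrap_data_key(passphrase: bytes) -> dict:
    """Generate a random data key and encrypt ("wrap") it under a key derived from the passphrase."""
    data_key = AESGCM.generate_key(bit_length=256)  # encrypts the actual data; never stored in the clear
    salt = os.urandom(16)
    kek = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000).derive(passphrase)
    nonce = os.urandom(12)
    wrapped = AESGCM(kek).encrypt(nonce, data_key, None)
    # Only the salt, nonce, and wrapped key are written to disk; the passphrase stays in the user's head.
    return {"salt": salt, "nonce": nonce, "wrapped_key": wrapped}

def unwrap_data_key(passphrase: bytes, blob: dict) -> bytes:
    """Re-derive the key-encryption key from the passphrase and unwrap the data key."""
    kek = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=blob["salt"], iterations=600_000).derive(passphrase)
    return AESGCM(kek).decrypt(blob["nonce"], blob["wrapped_key"], None)

The point of that split is exactly the one Draughn makes: whoever wrote the software holds neither the passphrase nor the unwrapped data key, so there is nothing for them to hand over.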

In short, he points out, the choice of encrypting our data is ours to make. Apple or Google offering us yet another set of tools to do that sort of encryption is them offering a service that many users value. And shouldn’t that be the primary reason why they’re doing stuff, rather than benefiting the desires of FUD-spewing law enforcement folks?

Companies: apple, google


Comments on “Law Enforcement Freaks Out Over Apple & Google's Decision To Encrypt Phone Info By Default”

Uriel-238 on a mobile device (profile) says:

Re: Re: The dark and gritty ages.

The Middle Ages were also when the Holy Inquisition reasoned that it could torture witnesses as well.

And they would do so until a given witness implicated another person who may have seen or heard something.

Sometimes the inquisitor would torture an entire village to get to the enemies of the Church.

Let’s call this one a cautionary tale, yes?

M. F says:

Re: Have we all forgotten those dark ages?

Smart phones add a new element to everything. They are integrated into almost everyone’s daily lives.
They can be used as very effective tools for criminals just as they are useful for regular folks.
Cyber stalking, etc.
Some people post every moment of their lives, and that makes them very useful for kidnapping and everything that follows.

Anonymous Coward says:

Three things…

Mr. Kerr mentions that he had fewer concerns if physical access was required. How many phones are stolen every day?

Second, he neglects to mention that police have been taking people’s phones and reading the contents without a warrant for years. They even have law enforcement approved tools to do so.

And lastly, CALEA would still allow a lawful intercept of communications through the provider.

Anonymous Coward says:

There is a reason these are being made available as standard features. They’ve always been out there for you to use without the permission of the hardware makers. You can do this yourself. So where is all the hubbub about these programs existing to begin with? That part is strangely silent and goes unmentioned, as if it were not a concern, unlike encrypted phones coming from the OEM.

What that point above means is that the increased use will return privacy back to the individual. None of this would be necessary had it not been abused, had the public not had its nose rubbed in this, had there been any sort of check, balance, or method to rein in these spying yokels.

People understand it has been a violation of their rights. No matter how it is dressed, it still comes out looking like a skunk and they want something done about it, whether the government/authorities/congress/whatever agree or not.

They’ve already had a prime example of how Congress views being spied on, and the public doesn’t have that avenue to make its displeasure strongly known.

Robert says:

Missing...

One relevant issue…

“Apple Still Has Plenty of Your Data for the Feds”

“If law enforcement confiscated your phone and wanted to snoop at its data, all they would have to do is serve Apple a warrant and to get a copy of the plaintext data. A version of Apple’s Legal Process Guidelines for U.S. Law Enforcement dated May 7th, 2014 explains:”

From The Intercept

Mason Wheeler (profile) says:

I’m writing this post on a couple of my computers that run versions of Microsoft Windows. Unsurprisingly, Apple can’t decrypt the data on these computers either. That this operating system software is from Microsoft rather than Apple is beside the point. The fact is that Apple can’t decrypt the data on these computers is because I’ve chosen to use software that doesn’t allow them to. The same would be true if I was posting from my iPhone. That Apple wrote the software doesn’t change my decision to encrypt.

He really should have picked a better example. This argument basically boils down to “I own my device and therefore I have the right to use it as I wish.” And as sensible a position as that is, and as much as I agree with it, Apple, specifically, has made it painfully clear from Day 1 that that is not the case. You may have purchased it, but you do not have anything resembling traditional rights of control over your own property; Apple does. That’s what their “walled garden” is all about: your property is not your property, you pay for it but Apple still controls it and dictates what you can and cannot do with it.

If you choose to encrypt your iPhone, you do so at Apple’s sufferance. Do you really believe that they don’t have a way in, their claims notwithstanding?

Mark Draughn (profile) says:

Re: Yup

Mason, I agree, Apple’s approach to control of the iPhone platform has not been helpful, and I think their attitude is part of the reason why people like Hosko think they should be able to get our data by going through Apple without involving us. But as you say, “I own my device and therefore I have the right to use it as I wish” is more sensible, even if Apple and the FBI would disagree.

I think that if Apple does have a way in, but they are responding to warrants by saying they don’t, they would get into a lot of trouble. Maybe even criminal indictments for obstructing justice. So I kind of doubt it.

John Fenderson (profile) says:

Orin Kerr's comment

In Orin Kerr’s WP article, he makes this statement:

The civil libertarian tradition of American privacy law, enshrined in the Fourth Amendment, has been to see the warrant protection as the Gold Standard of privacy protections.

I found this interesting, and telling. He’s talking about the law, and in that context I don’t think this statement is incorrect — that is the tradition. However being the tradition doesn’t make it actually true.

Warrant protection is a far cry from being a “Gold Standard”. I would argue that it’s the minimum standard, instead. It’s on the weak end of privacy protection scale — the weakest that anyone in their right mind would accept.

Anonymous Coward says:

The level of privacy described by Apple and Google is “wonderful until it’s your kid who is kidnapped and being abused, and because of the technology, we can’t get to them,” said Ronald Hosko

Given how many police have made the news in recent years for raping little girls (and getting away with it because police chiefs laugh off any and all public complaints), “because of the technology, we can’t get to them” is sending the opposite message he intended.

Anonymous Coward says:

This is going to trigger congress to pass a law requiring law enforcement backdoors in all devices, while making it a criminal offense to use or create encryption schemes without said backdoors.

There was a time when I would have thought that the above idea was ludicrous for all the obvious reasons. Now, with all the liberty erosion I’ve seen since 9/11, I’m convinced those reasons won’t matter.

Once the DOJ really starts barking loudly about this, congress will act and it will all be over but the crying.

John Fenderson (profile) says:

Re: Re:

“There was a time when I would have thought that the above idea was ludicrous for all the obvious reasons.”

Given that there was a strong effort to do exactly this back in 1993 (search for “key escrow” and “clipper chip”), the idea certainly isn’t ludicrous. It was (and, I hope, still is) politically infeasible, though.

Anonymous Coward says:

Re: Re: Re:

Good luck with that. This nation is doomed short of blood in the streets.

We have more illegals to deal with than makes it possible for the electorate to restore our liberties.

The current influx of illegals will have more freedom and liberty here than in their former lives, even if we lost 50% of what we have.

JMT says:

Re: Re: Re: Re:

“This nation is doomed short of blood in the streets.”

But did you read what Hosko claimed?

“He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.”

That sounds like the nation is doomed if there is blood in the streets. Because you know any violent push-back by the citizenry will immediately be labeled terrorism and responded to accordingly.

nasch (profile) says:

Re: Re: Re:2 Re:

“He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.”

If the pendulum is currently on the side of not giving authorities access to information, what would the other side look like? Never mind, I don’t think I want to know.

Niall (profile) says:

Re: Re: Re: Re:

Somehow, I don’t think your limited numbers of illegals are your problem. I think the larger number of government (local and federal) officials and corporate oligarchs are much more your problem.

How an illegal immigrant affects your liberties the way the NSA, ICE or Disney do, I don’t know.

But you know, easy RW target…

Anonymous Coward says:

Re: The usual excuses

NO, because you live in a peaceful society that has only sparingly used the guise of “fighting a foreign enemy” to take your rights away. Now they just happen to be on an accelerated pace.

Assuredly I say to you, we all have ridiculed those speaking against the encroachment of government on our liberties.

There is an easy way to tell if you are one such individual at odds with the constitution.

Do you support laws that remove a felon’s right to keep and bear firearms?

If you do, then you have no standing… if that constitutional right can be removed just because you now have a label affixed to your status, then so can your very right to life.

Rich Kulawiec (profile) says:

Last week, we noted that it was good news to see both Apple and Google highlight plans to encrypt certain phone information by default on new versions of their mobile operating systems, making that information no longer obtainable by those companies […]

Why should we believe this? Yes, I know that’s what they said, but why should we believe that this is actually true? What evidence — what independently-verifiable evidence, actually — is on the table to prove this claim?

I’m echoing/paraphrasing John Gilmore’s insightful comments here — which I STRONGLY recommend to everyone:

http://www.metzdowd.com/pipermail/cryptography/2014-September/022919.html

I think it’s a little too early to conclude that what these glowing press releases claim is really true. Or that if true, that it will remain true for long. It seems quite unlikely that rapacious data predators who have already proven over and over and over again that they will do absolutely anything to acquire data, including breaking any laws that get in their way, will simply sit back and quietly accept this as the new status quo. Why would they do such a thing when they have every motivation to do otherwise and when they can rest assured that no matter what they do, they will never face any consequences of any kind?

nasch (profile) says:

Re: Re:

It seems quite unlikely that rapacious data predators who have already proven over and over and over again that they will do absolutely anything to acquire data, including breaking any laws that get in their way, will simply sit back and quietly accept this as the new status quo.

I don’t see why not, they don’t need access to your data while it’s on your phone. They just need you to use their services and apps that send them your data. Anyone not using their services isn’t going to be particularly attractive to them anyway, and if they can get more people to use them by offering security tools, they benefit.

Stephen Hutcheson says:

Re: Policies, and laws

“The policy switch doesn’t stop hackers, trespassers, or rogue agents.”

WHAT PERPENDICULARLY INVERTED UNIVERSE IS THIS MOROID FROM?

No doubt, back on the planet Htrae (helically orbited by a glowing Nus), street gangs are being told by their thug chieftains to focus on mugging yuppies with Elppa 9 cellphones because “with these phones, there’s nothing to stop us from stealing all the customer’s addresses, bank accounts, and valuable information. It’s just encrypted so the Ecilops can’t see it with a warrant! And those tricks of calling Elppa and pretending to be the customer, or pretending to be an Ecilop, or presenting a fraudulent warrant–they don’t work, but we don’t need them any more because NOW THERE’S NOTHING STOPPING US!”

But here on Earth, where Boolean logic and the Peano axioms rigorously rule over the realm of possibility for police and child abusers alike, things are different.

Anonymous Coward says:

Re: Re:

“How is the public interest served by a policy that only thwarts lawful search warrants?”

Off topic … Didn’t the NSA use a lawful search warrant to pull a couple trillion individual records last year?

No. They used an unlawful search demand issued by a secret court pursuant to a secret misinterpretation of law to pull those records.

John Fenderson (profile) says:

Re: Nice, but...

They might, but this would be a major legislative effort that couldn’t happen quickly — and would stir up a major debate once the effort begins.

It’s one thing to require companies to give access to stuff they already have. It’s quite another to require companies to engineer their products in a particular way to make that possible.

John Fenderson (profile) says:

Re: encryption

Not actually that relevant here. What Google & Apple are doing is making it impossible for them to comply, even if they wanted to really, really badly (for instance, because their knees are getting whacked.)

The cartoon addresses being forced to hand over your own keys — nothing in Apple or Google’s actions affects that possibility one bit. It just means that you’ll get the subpoena instead of Apple or Google.

Anonymous Coward says:

probable cause

No, you can’t be forced to disclose your password or unlock a phone unless (and that’s the big caveat) the government can already prove you know it.

Entering a password or unlocking a safe is a testimonial act because it may reveal ownership, custody, and knowledge of the contents inside the box and may authenticate the person as its owner.

We are in Fifth Amendment territory, and you have an absolute right not to give the government incriminating information.

The government can only overcome the Fifth Amendment bar by granting act of production immunity or by proving your knowledge from an independent source.

Probable cause has nothing to do with the Fifth Amendment, and a valid Fourth Amendment search does not negate the Fifth Amendment protection against self incrimination.

John Fenderson (profile) says:

Re: probable cause

“We are in fifth Amendment territory, and you have an absolute right not to give the government incriminating information.”

This isn’t settled law at all. Some courts have ruled in the way you say, others have ruled the opposite. Generally, courts have ruled that if the police already know that there is evidence in the encrypted data, you can be compelled to reveal the key, but you can’t be compelled to reveal the key so they can go on a fishing expedition.

Anonymous Coward says:

Re: Re: probable cause

Generally, courts have ruled that if the police already know that there is evidence in the encrypted data, you can be compelled to reveal the key, but you can’t be compelled to reveal the key so they can go on a fishing expedition.

If the police can prove that there is evidence of wrongdoing in encrypted data, they do not need it decrypted. If it only requires them to claim to know such evidence is in the encrypted data, they are fishing.

John Fenderson (profile) says:

Re: Re: Re: probable cause

“If the police can prove that there is evidence of wrongdoing in encrypted data, they do not need it decrypted”

Not true at all. It’s completely possible to know that evidence of wrongdoing exists somewhere but not have that evidence in your possession.

Really, this isn’t much different than what is required for search warrants: cops can’t (technically) get a search warrant for a fishing expedition either. They have to demonstrate to a judge that they are looking for specific evidence that they already know exists in the place they’re searching.

Anonymous Coward says:

Re: Re: probable cause

John,

This is what really burned my toast in Orin Kerr’s writeup: he seems to think the Fifth is a foregone conclusion, because of Boucher. Now, I looked into Boucher, and it’s like you said: the gov’t already KNEW that there was evidence in Boucher’s computer, because they had seen it at the border.

My view is that the gov’t can’t compel you to reveal your password if they don’t know what’s on your phone. They can’t just go on fishing expeditions.

I don’t think they can just browbeat you with the threat of contempt until you give up your password if they don’t know what you have. Kerr seems to think otherwise, which is a shame given his credentials.

Uriel-238 on a mobile device (profile) says:

Re: Re: Fifth Amendment territory.

Since the police are so eager to get the bad guy, preferring false positives over false negatives, why don’t they just shoot everyone with an encrypted cell phone? After all, cell phones have been confused for weapons before.

A slightly nicer option is to sweat suspects into decrypting their phones while Fifth Amendment protections are in place: they don’t have to open the phone, but the tasings and seasonings with pepper spray don’t have to stop either.

And to cinch things well into legality, the interrogation doesn’t have to stop until the suspect signs a release saying he volunteered the contents of the phone and participation in the interrogation process. What a compliant citizen!

Anonymous Coward says:

probable cause

All the courts having compelled the person to decrypt the information have ruled so because of the foregone conclusion exception to the Fifth Amendment.

In the Boucher and Fricosu cases, the government already knew that the suspect was able to decrypt the computer and the self incrimination privilege could therefore not be invoked because the person by his own admission had made the testimonial aspects a foregone conclusion.

However, in the 11th Circuit case, the government could not prove that the suspect was able to decrypt, or the existence of an encrypted file system, and Professor Kerr noted that the outcome of the case was likely correct.

Anonymous Coward says:

probable cause

And that’s practically the same outcome even if Congress passes a law like UK’s RIPA S.49.

Under RIPA it’s a criminal offense knowingly to fail to disclose an encryption key to the police.

However, the government must still prove, beyond a reasonable doubt, that you know the key and knowingly failed to disclose it for the criminal sanction to be applied.

If several users share a computer or online account, or it can’t be proven who has the key, no one can be compelled to disclose it.

The sanctions under RIPA are rarely imposed; only if the suspect openly flouts the requests or is caught in the act of entering the password will he get convicted.

z! (profile) says:

Children???

Just like we have Godwin’s law (As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1), we need to name the effect for how fast a variant of “Think of the children!!” will appear when discussing civil liberties and related topics.

We also need to say that Yes, we thought of the children, and they’re irrelevant to the discussion.

Anonymous Coward says:

Re: Re:

I have seen (but don’t have time to link to) cases where criminals were not convicted because all the evidence was encrypted and the police could not access it. CP, terrorism, and similar activities now can’t be proven, and with services like VPNs and proxies on the rise, anonymity is increasing.

There is a risk.

I suspect, based on a hint from a little bird, that parent poster has engaged in some form of criminality, but I don’t know what and he encrypted all the evidence, so I can’t prove any of it. Can we convict him anyway? 🙂

Anonymous Coward says:

probable cause

Actually Kerr’s position is more nuanced.

He noted regarding the 11th circuit’s ruling:

“Based on a very quick skim, the analysis seems mostly right to me — in result, at least, although perhaps not as to all of the analysis. I hope to blog more on the case later on when I have a bit more time.

Also note that the court’s analysis isn’t inconsistent with Boucher and Fricosu, the two district court cases on 5th Amendment limits on decryption. In both of those prior cases, the district courts merely held on the facts of the case that the testimony was a foregone conclusion.”

http://www.volokh.com/2012/02/23/eleventh-circuit-finds-fifth-amendment-right-against-self-incrimination-not-to-decrypt-encyrpted-computer/

So Professor Kerr seems to agree that the Fifth Amendment can be invoked at least where the government can prove very little about the suspect’s custody and ownership of the data.

Paradoxically, it means that hardened child pornographers who are careful not to talk to the police and don’t use easily provable encryption can invoke the privilege, whereas stupid ordinary people who don’t know or care about criminal procedure will let the cat out of the bag.

The statement “yes, the phone is mine and I won’t cooperate” probably satisfies the foregone conclusion, but the statement “I plead the Fifth” or, even better, “I only talk to the police with counsel present” reveals nothing.

Anonymous Coward says:

Policies, and laws

I have seen (but don’t have time to link to) cases where criminals were not convicted because all the evidence was encrypted and the police could not access it.

You are assuming that there was a crime, and that the incriminating evidence was encrypted.

If the police could prove that there was a crime but could only find encrypted data, there is no evidence.

Encrypted data is no different from random data, unless you know which algorithm and software was used.

If you already know that encrypted data contains child pornography, you logically have sufficient evidence to convict.

If you only know that there is encrypted data, but don’t know what’s inside, you can’t logically argue that there is evidence of a crime.

And more likely, you don’t even know that the data is encrypted.

It may just be a large blob of random data.
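
To illustrate the point, here is a toy sketch in Python using the cryptography library (purely illustrative; the names, sizes, and cipher choice are my own assumptions). It compares byte-level Shannon entropy of structured plaintext, AES-GCM ciphertext, and raw os.urandom output; the usual caveat is that some real container formats (LUKS, for example) carry identifiable headers, which is the “unless you know which algorithm and software was used” part.

import math
import os
from collections import Counter
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte distribution; 8.0 means the bytes look uniformly random at this crude level."""
    counts = Counter(data)
    return -sum((n / len(data)) * math.log2(n / len(data)) for n in counts.values())

plaintext = b"a highly repetitive, very structured document " * 4096
ciphertext = AESGCM(AESGCM.generate_key(bit_length=256)).encrypt(os.urandom(12), plaintext, None)
random_blob = os.urandom(len(ciphertext))

print("plaintext :", round(bits_per_byte(plaintext), 3))   # low: the structure shows
print("ciphertext:", round(bits_per_byte(ciphertext), 3))  # ~8.0: indistinguishable from noise here
print("random    :", round(bits_per_byte(random_blob), 3)) # ~8.0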

Anonymous Coward says:

“The agency long has publicly worried about the “going dark” problem, in which the rising use of encryption across a range of services has undermined government’s ability to conduct surveillance, even when it is legally authorized.”

So they are admitting that they conduct surveillance when it is NOT legally authorized? That’s a pretty strong admission, considering they’ve denied it so many times before.

Anonymous Coward says:

probable cause

Unless they say the magic word “terrorism” and then all your rights vanish in a puff of smoke… 

No, the Fifth Amendment privilege against self incrimination is applicable to all criminal offenses.

The privilege is about compelling you to testify against yourself and using the coerced testimony to convict you of a crime.

If the police beat you in order to get you to tell them where you have hidden the murder weapon, or what password is locking your child pornography collection, any evidence resulting from the coercion is tainted and can’t be used against you.

This is an extreme form of pressure, but threatening someone with contempt sanction is also compulsion for purposes of the Fifth Amendment.

nasch (profile) says:

Re: probable cause


No, the Fifth Amendment privilege against self incrimination is applicable to all criminal offenses.

Just a thought experiment, what if the President declared you a terrorist, and then had you arrested and taken to Gitmo? I don’t think there are any 5th amendment protections there.

The 2013 NDAA allows this, with no judicial oversight.

Anonymous Coward says:

Re: Re: probable cause

Don’t forget the Enemy Expatriation Act, where the US government has decided it has the right to strip Americans of their citizenship if it classifies them as terrorists.

Considering millions of Americans have been deemed potential terrorists, it makes one wonder how many people have been arrested and vanished; since they would no longer be citizens, they could “legally” just be executed or whatever.

MrTroy (profile) says:

The information on the phone is not the only evidence, and criminals almost always leave other trails of information.

Not only that, but the kind of criminal who is smart enough to not leave other trails of information is already either doing their own encryption, or otherwise avoiding anything that will leave a trail.

In fact, if we ever find out that these smart criminals are using the factory encryption, yet being caught through some other means, that will give me a lot of confidence that the encryption works as advertised!

Anonymous Coward says:

So which one is it?

So sometimes it is “FEEEEAR the hacker and the upcoming CYBER-Pearl-harbor because we are most certainly doomed!”
But then they are all like “FEEEAR the terrorist and for your children because we can’t hack your shit!”. So I guess what they really want is a totally vulnerable system that is impervious to hackers?
Sense it makes not.

nasch (profile) says:

Re: So which one is it?

So I guess what they really want is a totally vulnerable system that is impervious to hackers?

I’m not sure if they’re clueless, or think everyone else is clueless, or both. It’s possible they actually think that if security vulnerabilities are left alone, they’ll be the only ones able to take advantage. It’s also possible they realize this isn’t true, don’t care about it, and assume the public will not understand. The latter seems more likely.

Mark Draughn (profile) says:

Thanks!

Thanks for the link, the quote, and the kind words.

Note that Hosko had to change his WaPo piece. The original title was something like “I helped save a kidnapped man from getting killed. With Apple’s new encryption rules, we never would have found him,” but the title has changed to something more generic and there’s now a disclaimer at the bottom that says, “This story incorrectly stated that Apple and Google’s new encryption rules would have hindered law enforcement’s ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.”

Anonymous Coward says:

Fifth Amendment territory.

The Fifth Amendment prohibition on compelled self incrimination is a part of the constitution, and computer technology does not change the fact that compelling a suspect to give testimony against himself is unconstitutional.

The only situations wherein the privilege can be overcome are: (1) the government can already prove from an independent source that you possess a piece of evidence; (2) the government grants you use and derivative use immunity; or (3) you are compelled to provide the government records kept incident to a valid noncriminal regulatory regime.

The third exception might actually provide a constitutional foundation for compelled key escrow but such a record keeping requirement has never been applied broadly outside tax and financial transactions.

Anonymous Coward says:

Yup

US law may force Apple to grant the US government a backdoor, but Apple can’t afford to be caught in a big lie while marketing its products overseas.

If Apple can’t guarantee security to European customers, there is going to be a very nasty fallout if it’s ever revealed that the company deceptively marketed its products as secure.

Also, I don’t think that having an exclusive backdoor for the US government will sit well with other nations’ law enforcement authorities.

Prior to the Snowden leaks, US corporations might well hope that their doubletalk would never be revealed, but Apple may fear that every friendly underhanded deal with the US government will eventually become public.

Other governments might demand that if there is any legally mandated backdoor, this must be on equal terms.

pmshah says:

Law Enforcement Freaks Out

Andrew Weissmann: Apple is “announcing to criminals, ‘use this,’ ” he said. “You could have people who are defrauded, threatened, or even at the extreme, terrorists using it.”

He is so full of sh**! When the FCC or the powers that be permitted “burner phones,” where was the outrage? Even today it says exactly what you are attributing to Apple!

In this regard we in India are doing a much better job. No burner phones. No mobile or wired phones without full identity information. No hiding of caller ID. We consider telephony a privilege and a birth right, and rightly so.
