Once the DOJ told the court in San Bernardino that it had succeeded in hacking into the iPhone of Syed Farook, the big question was whether the FBI would then tell Apple about the vulnerability. After all, the administration set up the so-called "Vulnerabilities Equities Policy" (VEP) with the idea of sharing most vulnerabilities it discovers with companies. The White House directly stated:
One thing is clear: This administration takes seriously its commitment to an open and interoperable, secure and reliable Internet, and in the majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest. This has been and continues to be the case.
This spring, we re-invigorated our efforts to implement existing policy with respect to disclosing vulnerabilities – so that everyone can have confidence in the integrity of the process we use to make these decisions. We rely on the Internet and connected systems for much of our daily lives. Our economy would not function without them. Our ability to project power abroad would be crippled if we could not depend on them. For these reasons, disclosing vulnerabilities usually makes sense. We need these systems to be secure as much as, if not more so, than everyone else.
Still, one could make a strong case that this vulnerability should be disclosed... even if almost no one expected it to be. Amusingly, just a few days ago, Apple revealed that the FBI used the VEP to disclose a vulnerability for the very first time, on April 14th, just as everyone was arguing about this. Of course, the flaw it revealed had nothing to do with hacking into the iPhone, and was actually a flaw that Apple had discovered and fixed... nine months ago. But, again, if this is the very first time the FBI has disclosed something to Apple, it certainly suggests that the VEP process generally means nothing gets disclosed. In fact, the timing really suggests that someone in the DOJ recently flipped out, realized that there's now going to be scrutiny on the VEP, and figured they might as well disclose something. Thus, they found an old bug that had already been patched and "revealed" it.
“The F.B.I. purchased the method from an outside party so that we could unlock the San Bernardino device,” Amy S. Hess, executive assistant director for science and technology, said in a statement.
“We did not, however, purchase the rights to technical details about how the method functions, or the nature and extent of any vulnerability upon which the method may rely in order to operate. As a result, currently we do not have enough technical information about any vulnerability that would permit any meaningful review” by the White House examiners, she said.
Now, some are arguing that this suggests absolutely terrible bargaining on the part of the DOJ/FBI. But another interpretation is that this is how the DOJ ensured it wouldn't have to reveal the flaw to Apple. Of course, it might also explain why the DOJ at one point appeared to claim that the hack in question only worked on Farook's phone. It later said that was a misstatement, and that it really meant the hack only applied to that iPhone configuration. But if the FBI never actually got the details, then in some sense it would be right that, for the FBI, the crack only worked for that one phone. And if it wanted to use it on another phone, it would have to shell out another ~$1 million or so...
A Philadelphia man suspected of possessing child pornography has been in jail for seven months and counting after being found in contempt of a court order demanding that he decrypt two password-protected hard drives.
The suspect, a former Philadelphia Police Department sergeant, has not been charged with any child porn crimes. Instead, he remains indefinitely imprisoned in Philadelphia's Federal Detention Center for refusing to unlock two drives encrypted with Apple's FileVault software, in a case that once again highlights the lengths to which authorities will go to crack encrypted devices. The man is to remain jailed "until such time that he fully complies" with the decryption order.
The Fifth Amendment should prevent the government from punishing a person for not testifying against themselves, which is what's being argued by the defendant's representation in its appeal to the Third Circuit. (Although it's actually indirect representation. The government's case is actually against Doe's devices ["United States of America v. Apple MacPro Computer, et al"] and his lawyer is hoping for a stay of the contempt order during the appeal process.)
Mr. Doe… has a strong likelihood of success on the second issue: whether compelling the target of a criminal investigation to recall and divulge an encryption passcode transgresses the Fifth Amendment privilege against self-incrimination. Supreme Court precedent already instructs that a suspect may not be compelled to disclose the sequence of numbers that will open a combination lock — clearly auguring the same rule for any compelled disclosure of the sequence of characters constituting an encryption passcode.
Doe's rep also argues that the court lacked jurisdiction to issue the All Writs Act order against Doe or his devices.
Mr. Doe’s first claim is that the district court lacked subject matter jurisdiction. The claim stems from the government’s apparently unprecedented use of an unusual procedural vehicle to attempt to compel a suspect to give evidence in advance of potential criminal charges. Specifically, the government took resort not to a grand jury, but to a magistrate judge pursuant to the All Writs Act, 28 U.S.C. § 1651. (Ex. F at 1).
It is black letter law that the All Writs Act never supplies “any federal subject-matter jurisdiction in its own right[.]” Syngenta Crop Protection, Inc. v. Henson, 537 U.S. 28, 31 (2002) (citation omitted). It is equally well-settled that the Act has no application where other provisions of law specifically address the subject matter concerned. Pennsylvania Bureau of Correction v. United States Marshals Service, 474 U.S. 34, 40-42 (1985). The compelled production of evidence in advance of criminal charges is specifically addressed by Rules 6 and 17 of the Federal Rules of Criminal Procedure, which authorize the issuance and enforcement of grand jury subpoenas; and by 28 U.S.C. § 1826(a), which specifies the authorized penalties for a witness who refuses without good cause to give the evidence demanded by the grand jury.
As it stands now, Doe is still being held in contempt of court for refusing to decrypt his devices for investigators. The district court that held him in contempt has refused direct appeal of that order, resulting in the labyrinthine legal strategy of using the government's case against Doe's devices as a vehicle for challenging the lower court's contempt order.
Doe has not been charged, yet he's in prison. Backing up the government's assertions for holding him in contempt are two dubious pieces of hearsay. One is from his estranged sister, who claims to have seen child porn on Doe's computer, but can't actually say whether it was located on the devices the government is seeking to have decrypted. The other is from some sort of law enforcement encryption whisperer, who can apparently see things in the scrambled bits.
The government’s second witness was Detective Christopher Tankelewicz, a forensic examiner with the Delaware County District Attorney’s Office. He testified only that it was his “best guess” child pornography would be found on the hard drives. (Ex. J at 346). According to Tankelewicz’s understanding of the Freenet online network (in which he admits having no training), there were signs on an Apple Mac Pro computer seized with the hard drives of a user accessing or trying to access message boards with names suggestive of child pornography. (Ex. J at 306, 311-312, 339-340). In rather ambiguous testimony, Tankelewicz did not appear to say this meant any image traded over these boards was on the hard drives. (See Ex. J at 303-317, 336-340, 345-350). Instead, he identified a single image he believed there to be a “possibility” was on the drives. (Ex. J at 308-309). As he described it, the image was of “a four or five-year-old girl with her dress lifted up, but the image itself was small so you really couldn’t see what was going on with the image.” (Ex. J at 308).
No one wants to see a sex offender walk away from charges, but at this point, Doe hasn't even been officially charged with anything more than contempt. The problem with that charge is it has no end date. He can either stay in jail or comply with the order, even when the order conjures jurisdiction out of nowhere and violates his Fifth Amendment rights. If the government doesn't have enough evidence to pursue a case against Doe, it should cut him loose until it does.
That's what makes the Wall Street Journal's recent editorial, The Encryption Farce (possibly behind a paywall, though the version I just opened showed up fine), so remarkable. It completely bashes the FBI over its attempts to force Apple to build a backdoor into its encryption -- though mainly because of the ridiculous fact that, in the two most high-profile cases, the DOJ magically got into the phones just as the cases got serious. The editorial doesn't pull any punches, asking what the hell is going on at the Justice Department:
If history repeats itself first as tragedy and then as farce, what does the FBI have in store next for its encryption war with Apple? After withdrawing its demands in San Bernardino and then reopening hostilities with a drug prosecution in Brooklyn, the G-men abruptly dumped the second case over the weekend too. Is anyone in charge at the Justice Department, or are junior prosecutors running the joint?
The editorial goes on to mock the FBI's claim that these cases are all about getting into just that one phone, and notes that constantly finding ways in at the last minute is destroying the FBI's credibility.
This second immaculate conception in as many months further undermines the FBI’s credibility about its technological capabilities. Judges ought to exercise far more scrutiny in future decryption cases even as Mr. Comey continues to pose as helpless.
It goes on to suggest that the FBI stop bringing these cases, and that the President and the DOJ should put an end to this ridiculous attack on encryption:
Yet forgive us if this “conversation” now seems more like a Jim Comey monologue. The debate might start to be productive if the FBI Director would stop trying to use the courts as an ad hoc policy tool and promised not to bring any more cases like the one in Brooklyn.
Meanwhile, the White House has taken the profile-in-courage stand of refusing to endorse or oppose any encryption bill that Congress may propose. If the Obama team won’t start adjusting to the technological realities of strong and legal encryption, they could at least exercise some adult supervision at Main Justice.
On its own, such an editorial might not seem like a huge deal, but coming from the Wall Street Journal -- a source that has previously championed much greater surveillance and even supported backdoors -- it's a surprising shift. And it shows just how badly the DOJ and FBI miscalculated in their attempts to use the courts to get their desired results in breaking encryption.
As part of our funding campaign for our coverage of encryption, we reached out to some companies that care about these issues to ask them to show their support. This post is sponsored by Golden Frog, a company dedicated to online privacy, security and freedom.
James Clapper, Director of National Intelligence, is claiming that, according to NSA estimates, the Snowden revelations sped up the adoption of encryption by seven years. Apparently, that's based on the NSA's projections of encryption's adoption curve. As reported by Jenna McLaughlin at The Intercept:
“As a result of the Snowden revelations, the onset of commercial encryption has accelerated by seven years,” James Clapper said during a breakfast for journalists hosted by the Christian Science Monitor.
The shortened timeline has had “a profound effect on our ability to collect, particularly against terrorists,” he said.
When pressed by The Intercept to explain his figure, Clapper said it came from the National Security Agency. “The projected growth maturation and installation of commercially available encryption — what they had forecasted for seven years ahead, three years ago, was accelerated to now, because of the revelation of the leaks.”
Of course, it's worth noting that, in the past few months, it seemed as if the NSA and the intelligence community were moving away from their kneejerk hatred of encryption, pushing back against the FBI's argument that we need to backdoor encryption. But, apparently, they're not willing to go quite this far. Basically, the NSA wants strong encryption out there, but it doesn't really want you to use it.
Asked if that was a good thing, leading to better protection for American consumers from the arms race of hackers constantly trying to penetrate software worldwide, Clapper answered no.
“From our standpoint, it’s not … it’s not a good thing,” he said.
Yup. James Clapper would prefer that the American public be less safe by not using encryption, rather than protecting their digital lives.
Of course, many other people do think it's a very, very good thing. Including Ed Snowden:
So, the guy in the US government is upset that the public is more safe, and the guy that people want to accuse of being a traitor is proud of helping Americans to better protect themselves. Maybe we ought to reverse their roles...
While so much of the attention had been focused on the case in San Bernardino, where the DOJ was looking to get into Syed Farook's iPhone, we've pointed out that perhaps the more interesting case was the parallel one in New York (which actually started last October). There, magistrate judge James Orenstein rejected the DOJ's use of the All Writs Act to try to force Apple to help unlock the iPhone of Jun Feng, a man who had already pleaded guilty to drug charges, but who insisted he did not recall his passcode.
There were some oddities in the case. Feng had pleaded guilty, so there was some question over whether there was still any need to get into the iPhone. The DOJ insisted there was, because Feng's iPhone might provide evidence needed to find others involved in the drug ring. The other oddity: Feng's iPhone was running iOS 7. While the device itself was a newer model than the one in the Farook case, it was running an older operating system, one that Apple (and others) were known to be able to get into easily. So it made no sense that the FBI couldn't get into this phone. In fact, Apple's latest filing in the case, just over a week ago, was basically along those lines, noting that the DOJ claimed Apple's assistance was "necessary," but that this seemed unlikely.
And... late on Friday, the DOJ did the exact same "run away!" move it did in the Farook case, telling the judge that it had suddenly been given the passcode, so there was no need to move forward with the case at all.
The government respectfully submits this letter to update the Court and the parties. Yesterday evening, an individual provided the passcode to the iPhone at issue in this case. Late last night, the government used that passcode by hand and gained access to the iPhone. Accordingly, the government no longer needs Apple’s assistance to unlock the iPhone, and withdraws its application.
According to a (paywalled) WSJ article, Feng, who has been awaiting sentencing and thought his case was otherwise over, only just found out that there was this big fuss around it... and told the DOJ he miraculously remembered the passcode. Hallelujah. A miracle... and the DOJ was magically saved from a precedent it didn't want.
The Wall Street Journal reported last week that Mr. Feng only recently learned his phone had become an issue in a high-stakes legal fight between prosecutors and Apple. Mr. Feng, who has pleaded guilty and is due to be sentenced in the coming weeks, is the one who provided the passcode to investigators, according to people familiar with the matter.
Of course, it's worth noting, however, that while this particular case may be effectively over, it's not that great for the DOJ, in that no court ever got to officially review magistrate judge James Orenstein's fairly epic smackdown of the DOJ earlier in the case. That ruling, of course, has no value as precedent, but that doesn't mean it won't be quoted or pointed to in other, similar cases.
On the flip side, of course, there's the argument that every time the case starts looking bad for the DOJ, they miraculously get into the phone in question. At the very least, this ought to raise questions about why the DOJ keeps insisting that it needs Apple's help... But the fact is these cases are going to keep coming.
It appears that more fully encrypting messaging and content is really catching on. Following WhatsApp's big move to roll out end-to-end encryption, the super popular communications app Viber has announced it intends to do the same for its 700 million (and growing) users. It's already testing encryption in a few markets before rolling it out globally. The company says the system will also use color coding to let you know whether your content is encrypted.
Unfortunately, Viber is not entirely clear on what encryption tools it's using. With WhatsApp, the company was upfront in saying that it was using the popular and well-tested open source encryption from Open Whisper Systems. Viber doesn't say what it's using, leading some to speculate that the company tried to roll its own (generally not a good idea, and one that likely means there are serious security flaws). The company says it's doing "open source plus," but has not yet named which open source tools it's pulling from:
“We built [our end-to-end encryption] based on the concept of an established open-source solution with an extra level of security developed in-house,” a Viber spokesperson says, refusing to be more specific.
There are some that will argue that an opaque/unknown encryption system can, in some ways, be worse than no encryption, in that users may think their communications are private, when they really are not. So, the lack of an open, audited encryption solution is definitely a concern here.
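The worry about home-rolled crypto isn't hypothetical: the flaws it introduces tend to be subtle ones that audited, open libraries have long since designed out. As a minimal sketch (not Viber's actual code, which remains unpublished), here is one classic mistake, verifying a message authentication tag with ordinary equality instead of a constant-time comparison, illustrated with Python's standard library:

```python
# Illustration only: a classic pitfall of home-rolled crypto.
# Verifying an HMAC tag with ordinary '==' leaks timing information;
# hmac.compare_digest exists precisely to avoid that.
import hashlib
import hmac

key = b"shared-secret-key"  # hypothetical shared key, for illustration
message = b"hello from a messaging app"
tag = hmac.new(key, message, hashlib.sha256).digest()

def verify_naive(msg: bytes, received_tag: bytes) -> bool:
    # BAD: '==' short-circuits at the first differing byte, so an
    # attacker who can measure response times can forge tags byte by byte.
    return hmac.new(key, msg, hashlib.sha256).digest() == received_tag

def verify_safe(msg: bytes, received_tag: bytes) -> bool:
    # GOOD: compare_digest runs in time independent of where bytes differ.
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    return hmac.compare_digest(expected, received_tag)

assert verify_safe(message, tag)
assert not verify_safe(b"tampered", tag)
```

It's exactly this class of subtle, invisible-in-testing bug that independent audits of open source protocols are meant to catch, and why an unnamed in-house "extra level of security" inspires less confidence than a published, reviewed design.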
However, what's encouraging is that we're seeing more and more apps embracing end-to-end encryption for communications, as well as strong disk encryption for data at rest. This is something cryptographers and security experts advocated for years without much actual support or adoption. It's finally starting to become a necessary piece of the puzzle for communications service providers, and that's a good thing.
from the we-can't-have-people-bad-mouthing-the-government-and-getting-away-with-it dept
Here comes the inevitable government backlash against WhatsApp rolling out end-to-end encryption for one billion users worldwide: if governments can no longer demand access to communications, the next best thing is to demand access to WhatsApp users.
According to India resident Prasanto K. Roy, local governments are demanding that administrators of WhatsApp groups (the latest beneficiaries of the encryption rollout) register with the local magistrate, and will apparently hold them accountable for any "irresponsible remarks" or "untoward actions" by members of the group.
The government's unsubtle man-in-the-middle approach to accessing WhatsApp communications also involves placing a literal government man in the middle, according to the Times of India.
The spokesperson also said that a government representative might also have to be added to the WhatsApp group as an admin. "If any government admin is present in a WhatsApp group, it will immediately prevent any sort of rumour-mongering," he said.
Whenever a government agency develops an overweening urge to curb "rumor-mongering," one can be sure that particular government is fucking something up somewhere. And, indeed, that is the case here.
The government had imposed a blackout on mobile internet in the troubled area after clashes between security forces and protestors claimed the lives of five people. The area had seen protests after the alleged molestation of a teenager by security personnel. The mobile internet blackout had been aimed at curbing the spread of potentially inflammatory messages that could spark further tension in the area.
It would seem to me the tension was created by the alleged molestation, the government's lack of interest in investigating/punishing the wrongdoer and the killing of five people. The government appears to be more interested in saving itself from its constituency, so the obvious move is to shut down any communication platform that it can't monitor or control. It can't kill WhatsApp, so it's demanding to be inserted into these conversations -- either directly or by lurking just offscreen whispering legal threats.
Not only that, but the quelling of dissent extends to the government itself. The flier also notes that punishment awaits government employees who criticize government policies in these WhatsApp groups.
Govt. Employees serving in the district are directed to restrain from making any comments/remarks with regard to the policies and decisions of government on these WhatsApp groups running in the district and if anyone found involved in such activities, strict action will be initiated against them as required under rules.
Looking beyond this local dispute that has managed to drag in the world's most popular messaging service, one can see why it is essential that citizens have communication platforms that keep the government locked out. Encryption doesn't just "protect" criminals from law enforcement and innocent people from criminals. It also protects the innocent from their governments' self-serving overreach.
When you testify before Congress, it helps to actually have some knowledge of what you're talking about. On Tuesday, the House Energy & Commerce Committee held the latest congressional hearing on the whole silly encryption fight, entitled Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives. And, indeed, they did have witnesses presenting "industry" and "law enforcement" views, but for unclear reasons decided to separate them. First up were three "law enforcement" panelists, who were free to say whatever the hell they wanted with no one pointing out that they were spewing pure bullshit. You can watch the whole thing below (while it says it's 4 hours, it doesn't actually start until about 45 minutes in):
Lots of craziness was stated -- starting with the idea pushed by both chief of intelligence for the NYPD, Thomas Galati and the commander of the office of intelligence for the Indiana State Police, Charles Cohen -- that the way to deal with non-US or open source encryption was just to ban it from app stores. This is a real suggestion that was just made before Congress by two (?!?) separate law enforcement officials. Rep. Morgan Griffith rightly pointed out that so many encryption products couldn't possibly be regulated by US law, and asked the panelists what to do about it. You can watch the exchange here:
You see Cohen ridiculously claim that since Apple and Google are gatekeepers to apps, the government could just ban foreign encryption apps from their app stores:
Right now Google and Apple act as the gatekeepers for most of those encrypted apps, meaning if the app is not available on the App Store for an iOS device, if the app is not available on Google Play for an Android device, a customer of the United States cannot install it. So while some of the encrypted apps, like Telegram, are based outside the United States, US companies act as gatekeepers as to whether those apps are accessible here in the United States to be used.
This is just wrong. It's ignorant and clueless and for a law enforcement official -- let alone one who is apparently the "commander of the office of intelligence" -- to not know that this is wrong is just astounding. Yes, on Apple phones it's more difficult to get apps onto a phone, but it's not impossible. On Android, however, it's easy. There are tons of alternative app stores, and part of the promise of the Android ecosystem is that you're not locked into Google's own app store. And, really, is Cohen literally saying that Apple and Google should be told they cannot allow Telegram -- one of the most popular apps in the world -- in their app stores? Really?
Galati then agreed with him and piled on with more ignorance:
I agree with what the Captain said. Certain apps are not available on all devices. So if the companies that are outside the United States can't comply with same rules and regulations of the ones that are in the United States, then they shouldn't be available on the app stores. For example, you can't get every app on a Blackberry that you can on an Android or a Google.
Leaving aside the fact that he said "Android or a Google" (and just assuming he meant iPhone for one of those)... what?!? The reason you can't get every app on a BlackBerry that's on other devices has nothing to do with any of this. It's because the market for BlackBerry devices is tiny, so developers don't build for the BlackBerry ecosystem (and, of course, some BlackBerries now run Android anyway, so...). The fact that fewer developers develop for BlackBerry says nothing about blocking foreign encryption apps from the Android or iOS ecosystems. Galati's comment simply makes no sense.
Why are these people testifying before Congress when they don't appear to know what they're talking about?
Later in the hearing, when questioned by Rep. Paul Tonko about how other countries (especially authoritarian regimes) might view a US law demanding backdoors as an opportunity to demand the same levels of access, Cohen speculated ridiculously, wildly and falsely that he'd heard that Apple gave China its source code:
Here's what Cohen says:
In preparing for the testimony, I saw several news stories that said that Apple provided the source code for iOS to China, as an example. I don't know whether those stories are true or not.
Yeah, because they're not. He then goes on to say that Apple has never said under oath whether or not that's true -- except, just a little while later, on the second panel, Apple's General Counsel Bruce Sewell made it quite clear that they have never given China its source code. Either way, Cohen follows it up by saying that Apple won't give US law enforcement its source code, as if to imply that Apple is somehow more willing to help the Chinese government hack into phones than the US government. Again, this is just blatant false propaganda. And yet here is someone testifying before Congress and claiming that it might be true.
Thankfully, at the end of the hearing, Rep. Anna Eshoo -- who isn't even a member of the subcommittee holding the hearing (though she is a top member of the larger committee) -- joined in and quizzed Cohen about his bizarre claims:
She notes that it's a huge allegation to make without any factual evidence, and asks if he has anything to go on beyond just general "news reports." Not surprisingly, he does not.
Elsewhere in the hearing, Cohen also insists that a dual-key solution would work. He says this with 100% confidence -- that if Apple and law enforcement had a shared key, it would be "just like a safety deposit box." Of course, this is also just wrong. As has been shown for decades, when you set up a two-key solution, you introduce vulnerabilities into the system that almost certainly let in others as well.
And then, after that, Rep. Jerry McNerney raises the point -- highlighted by many others in the past -- that rather than "going dark," law enforcement is in a golden age of surveillance and investigation, thanks to vast amounts of new information, including that provided by mobile phones (such as location data, metadata on contacts and more). Cohen, somewhat astoundingly, claims he can't think of any new information that's now available thanks to mobile phones:
Sir, I'm having problems thinking of an example of information that's available now that was not before. From my perspective, thinking through investigations that we previously had information for, when you combine the encryption issue along with shorter and shorter retention periods, in a service provider, meaning they're keeping their records, for both data and metadata, for a shorter period of time, available to legal process. I'm having difficulty finding an example of an avenue that was not available before.
Huh?!? He can't think of things like location info from mobile phones? He can't think of things like metadata and data around unencrypted texts? He can't think of things like unencrypted and available information from apps? Then why is he on this panel? And the issue of data retention? Was he just told before the hearing to make a point to push for mandatory data retention and decided to throw in a nod to it here?
At least Galati, who spoke after him, was willing to admit that tech has provided a lot more information than in the past -- but then claimed that encryption was "eliminating those gains."
Cohen is really the clown at the show here. He also claims that Apple somehow decided to throw away its key and that it was "solving a problem that doesn't exist" in adding encryption:
There he's being asked by Rep. Yvette Clarke if he sees any technical solutions to the encryption issue, and he says:
The solution that we had in place previously, in which Apple did hold a key. And as Chief Galati mentioned, that was never compromised. So they could comply with a proper service of legal process. Essentially, what happened is that Apple solved a problem that does not exist.
Again, this is astoundingly ignorant. The problem before was that there was no key. It wasn't that Apple had the key; it was that the data was readily available to anyone who had access to the phone. That put everyone's information at risk. It's why there was so much concern about stolen phones and why stolen phones were so valuable. For a law enforcement official to not realize that, and to not think it was a real problem, is... astounding. And, again, it raises the question of why this guy is testifying before Congress.
It also raises the question of why Congress put him on a panel with no experts around to correct his many, many errors. At the very least, towards the beginning of the second panel, Apple GC Sewell explained how Cohen was just flat out wrong on these points:
If you can't see that, after his prepared remarks, Sewell directly addresses Cohen's claims:
That's where I was going to conclude my comments. But I think I owe it to this committee to add one additional thought. And I want to be very clear on this: We have not provided source code to the Chinese government. We did not have a key 19 months ago that we threw away. We have not announced that we are going to apply passcode encryption to the next generation iCloud. I just want to be very clear on that because we heard three allegations. Those allegations have no merit.
A few minutes later, he's asked directly about this and whether or not the Chinese had asked for the source code, and Sewell says that, yes, the Chinese have asked, and Apple has refused to give it to them:
It seems Congress could have spared itself three hours of ignorant arguments if it had simply not allowed such ignorance to be spewed unchallenged earlier on.
BlackBerry still has not commented directly to Motherboard or VICE News on the specifics of the investigation, but CEO John Chen published a blog post on Monday addressing the report in broad strokes… very broad strokes.
“Regarding BlackBerry’s assistance,” Chen wrote instead, “I can reaffirm that we stood by our lawful access principles. Furthermore, at no point was BlackBerry’s BES server involved.”
BES is BlackBerry Enterprise Server -- the only option that lets customers lock BlackBerry itself out of their communications. With BES, encryption keys are set by the customer, which means BlackBerry can no longer decrypt messages using its global PIN encryption key. Notably, this option is only available to corporate or government customers. Everyone else gets vanilla encryption, which BlackBerry can decrypt for law enforcement. Or, as appears to be the case in Canada, the key can simply be handed to law enforcement agencies, allowing them to decrypt at will... because there's only one encryption key for all non-BES users.
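To make the distinction concrete, here's a toy sketch of the two-tier setup described above. This is purely illustrative and not BlackBerry's actual protocol: the XOR "cipher," the `GLOBAL_PIN_KEY` value, and the `BESServer` class are all hypothetical stand-ins. The point it demonstrates is structural -- anyone holding the single global key can read every consumer message, while a message encrypted under a customer-set key is opaque to that same global key.

```python
from hashlib import sha256

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Trivial XOR keystream 'cipher' -- for illustration only, not secure."""
    stream = sha256(key).digest()
    ks = (stream * (len(data) // len(stream) + 1))[: len(data)]
    return bytes(a ^ b for a, b in zip(data, ks))

# One vendor-held key shared by ALL consumer ("non-BES") users (hypothetical value).
GLOBAL_PIN_KEY = b"vendor-held-global-key"

def consumer_encrypt(msg: bytes) -> bytes:
    """Consumer messaging: everyone is encrypted under the same global key."""
    return xor_cipher(msg, GLOBAL_PIN_KEY)

class BESServer:
    """Enterprise server whose key is set by the customer, not the vendor."""
    def __init__(self, customer_key: bytes):
        self._key = customer_key

    def encrypt(self, msg: bytes) -> bytes:
        return xor_cipher(msg, self._key)

# Whoever holds the global key (the vendor, or any agency it was handed to)
# can decrypt every consumer message:
ct = consumer_encrypt(b"hello")
assert xor_cipher(ct, GLOBAL_PIN_KEY) == b"hello"

# But a BES message under a customer-set key does not yield to the global key:
bes = BESServer(customer_key=b"corporate-secret")
ct2 = bes.encrypt(b"hello")
assert xor_cipher(ct2, GLOBAL_PIN_KEY) != b"hello"
```

The design lesson is the one in the paragraph above: with a single shared key, compromising (or voluntarily surrendering) one secret exposes every non-BES user at once, whereas per-customer keys confine the blast radius to one customer.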
According to BlackBerry CEO John Chen, the ends justify the means he pointedly won't be discussing in detail.
We have long been clear in our stance that tech companies as good corporate citizens should comply with reasonable lawful access requests.
This very belief was put to the test in an old case that recently resurfaced in the news, which speculated on and challenged BlackBerry’s corporate and ethical principles. In the end, the case resulted in a major criminal organization being dismantled.
BlackBerry continues to play both sides of the equation, providing "regular" users with less secure communications while claiming to be the "gold standard" in encrypted communications -- a privilege it only extends to some of its customers, unlike Apple or Google, which provide encryption to all of their customers.
The company has nothing to offer customers in the way of assurances, but it does seem to be going out of its way to soothe the nerves of law enforcement officials frustrated by smartphone encryption. It may make a big deal about its fight against Pakistan and its demands for access (Chen highlights this in his blog post), but it seems less than likely to go to bat for a majority of its users when faced with overreach by more "acceptable" governments.
Yesterday morning, things kicked off with a ridiculous tweet from the NY Police Department, announcing that it "stood with" the Manhattan DA in calling for "encryption" legislation. Of course, that's inaccurate. What it was really calling for was anti-encryption legislation.
But, suddenly we discovered that Manhattan District Attorney Cyrus Vance -- proudly technologically ignorant -- was not only continuing to push his dangerous anti-encryption views, but had somehow created a hashtag and a logo for the effort (I've sent in a FOIA request to see how many taxpayer dollars were spent on the logo, though I doubt I'll get a response). Vance held quite the grandstanding press conference over this, in which he repeated the same misleading claims as in the past about how horrible encryption is, then trotted out some sob stories of cases where law enforcement failed to do its job, and blamed it on encryption.
You can watch the half-hour press conference below if you have the stomach for it:
Of course, just about everything about this is ridiculous. It took place just a few days after Patrick O'Neill, over at the DailyDot, revealed some details of a FOIA request he'd made with Vance's office about all those cases he claimed he needed to get into phones for -- and found that all of the cases listed had resulted in convictions anyway, even without getting into the phones. And most didn't appear to involve particularly serious crimes.
Meanwhile, as is often the case, an attempt by law enforcement to co-opt whatever "the kids these days" are doing by setting up a hashtag failed spectacularly. First off, Vance's office just happened to pick a hashtag that was already in use. Even worse, it was in use by the Quakers to push for criminal justice reform that would "start to reverse the failed 40-year 'war on drugs.'" Oops.
Then, of course, the folks who actually understand technology took the hashtag and ran with it, explaining why Vance's campaign was idiotic.
Remember: encryption protects the families of police too. If you break it, you put them at risk. #unlockJustice
After going through lots and lots of tweets, I have to admit that I couldn't find any -- outside of those from the DA's office and various law enforcement people -- that were actually supportive of the campaign. It really makes you wonder: just who does Cyrus Vance think he's protecting?