Brazilian judges are apparently not very big fans of the popular messaging app WhatsApp, which is owned by Facebook (but run independently). Judge Marcel Montalvao has ordered the app blocked entirely across Brazil because WhatsApp has refused to provide data (which it likely does not have) to help with a drug investigation. Any phone companies that don't block WhatsApp will be fined about $143,000 per day.
If this sounds familiar, it's because we went through this back in December in another case with another judge. And, of course, in March a Facebook (not WhatsApp) exec was arrested over a similar issue in a different case. When WhatsApp again refused to turn over information, because it could not, the judge had the exec arrested (another judge freed the exec pretty quickly).
Once again, WhatsApp points out that it has cooperated as much as possible:
“After cooperating to the full extent of our ability with the local courts, we are disappointed a judge in Sergipe decided yet again to order the block of WhatsApp in Brazil,” WhatsApp said in a statement. “This decision punishes more than 100 million Brazilians who rely on our service to communicate, run their businesses, and more, in order to force us to turn over information we repeatedly said we don’t have.”
The order shuts down WhatsApp for 72 hours, but considering just how widely the app is used there (it is basically the way many Brazilians communicate), the impact is pretty massive. As Glenn Greenwald and Andrew Fishman over at The Intercept note, this is a ridiculous move that harms many people, but it's also a sign of what's to come as governments continue to freak out over encrypted communications:
It is stunning to watch a single judge instantly shut down a primary means of online communication for the world’s fifth-largest country. The two Brazilian communication experts in the NYT wrote of the first WhatsApp shutdown: “the judge’s action was reckless and represents a potentially longer-term threat to the freedoms of Brazilians.” But there is no question that is just a sign of what is to come for countries far from Brazil: there will undoubtedly be similar battles in numerous countries around the world over what rights companies have to offer privacy protections to their users.
Senators Richard Burr and Dianne Feinstein are not giving up that quickly on their ridiculous and technically ignorant plan to outlaw real encryption. The two have now penned an op-ed in the WSJ that lays out all the same talking points they've laid out before, without adding anything new. Instead, it just continues to make statements that show how incredibly ignorant they are. The piece is called Encryption Without Tears (and may be paywalled, though by now everyone knows how to get around that), which already doesn't make any sense. What they're pushing for is ending basic encryption, which will lead to many, many tears.
It starts out with their standard ridiculous line, pretending that because a company builds end-to-end encryption, it's acting "above the law."
In an increasingly digital world, strong encryption of devices is needed to prevent criminal misuse of data. But technological innovation must not mean placing individuals or companies above the law.
People have gone over this time and time again: this is not about anyone being "above the law." It's about whether or not companies can be forced to directly undermine the safety and security of their products (and the public). A paper shredder can destroy evidence. A paper shredder maker is not "above the law" when it decides not to build a system for piecing back together the shreds.
And speaking of "above the law," I still don't see Feinstein or Burr commenting on the FBI/DOJ announcing that it will ignore a court order to reveal how it hacked into computers over Tor. That is being above the law. That involves a situation where a court has asked for information that the FBI absolutely has. The FBI is just saying "nope." If Burr and Feinstein are really worried about people being "above the law," shouldn't they worry about this situation?
Over the past year the two of us have explored the challenges associated with criminal and terrorist use of encrypted communications. Two examples illustrate why the status quo is unacceptable.
I love this. They give two examples that have been rolled out a bunch in the last few weeks: the attack in Garland, Texas, where the attackers supposedly exchanged some messages with potential ISIS people, and the case of Brittney Mills, who was tragically murdered, and whose case hasn't been solved. Mills had her smartphone, but no one can get into it. Of course, it took nearly two years of fretting before law enforcement could dig up these two cases, and neither makes a very strong argument for why we need to undermine all encryption.
It's a simple fact that law enforcement never gets to have all of the evidence. In many, many, many criminal scenarios, that's just the reality. People destroy evidence, or law enforcement doesn't find it, or law enforcement just doesn't understand it. That's not the end of the world. This is why we have police detectives, who are supposed to piece together whatever evidence they do have and build a picture for a case. Burr and Feinstein are acting as if, in the past, law enforcement was immediately handed all the evidence. That's never been the way it works. Yes, law enforcement doesn't get access to some information. That's how it works.
You don't go and undermine the very basis of computer security just because law enforcement can't find a few pieces of evidence.
Our draft bill wouldn’t impose a one-size-fits-all solution on all covered entities, which include device manufacturers, software developers and electronic-communications services. The proposal doesn’t define the technological solutions or tell businesses how to solve the problem.
This is also misleading. The bill requires an end to real encryption. That's it. Real encryption means that only one person has the key. This is what Burr and Feinstein don't seem to get. They seem to think it's trivial to leave a key with Apple or whoever. But as basically every crypto expert has explained, it is not. Doing so creates a vulnerability... and worse, it's a vulnerability that cannot be patched. That's hellishly dangerous. Sure, the bill doesn't tell them exactly how to do this, but it does make it clear: you cannot offer real encryption, you can only offer something that can be hacked. That's a problem.
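To see why "who holds the key" is the entire fight, here's a toy sketch in Python (a one-time pad, purely illustrative; this is not production cryptography, and the message and function names are made up for the example). The point it demonstrates is simple: decryption works for anyone holding a copy of the key, so an escrowed copy, at Apple or anywhere else, is a second door into every message and a second place the key can be stolen from.

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Toy one-time pad: the key is random and as long as the message."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR-ing with the same key undoes the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ciphertext = encrypt(b"meet me at noon")
assert decrypt(key, ciphertext) == b"meet me at noon"

# An "escrowed" copy of the key works exactly as well as the original:
# whoever holds any copy of the key holds the message.
escrowed_copy = bytes(key)
assert decrypt(escrowed_copy, ciphertext) == b"meet me at noon"
```

There is no way to make the escrowed copy work for a court order but not for a thief; the math doesn't know who's asking.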
We want to provide businesses with full discretion to decide how best to design and build systems that maintain data security while at the same time complying with court orders.
We want to provide businesses with full discretion to decide how best to travel back in time, in order to prevent crimes.
Seriously: this is basically the same thing that Burr and Feinstein are saying here. They're asking for something that's impossible, and acting like it's a routine suggestion. If they need to comply with these All Writs Act style orders, they cannot build systems that maintain data security. That's a fact. It's mind-boggling that Burr and Feinstein still can't understand this.
Critics in the industry suggest that providing access to encrypted data will weaken their systems. But these same companies, for business purposes, already maintain and have access to vast amounts of encrypted personal information, such as credit-card numbers, bank-account information and purchase histories.
Argh. This paragraph shows that whatever poor staffer Burr and Feinstein assigned to write this drivel doesn't understand even the first thing about what he or she is talking about. Storing hashed passwords and encrypted credit card or bank account info is a totally different thing. Those are protected to keep them safe, and part of the point is that even the companies themselves cannot reveal them. This is making the opposite point of the one Burr and Feinstein think it makes. Companies hash passwords (and encrypt credit card info and the like) so that they're not storing the plaintext, and there's no easy way for anyone to get that info. This protects user data, and the companies cannot actually provide the plaintext. They're comparing hashes. That's what keeps it safe.
If we received a court order demanding our users' passwords, we couldn't provide them, because we don't have them: we store only hashes. When someone logs in to our website, we hash the password they enter and compare it to our stored hash, and if they match, we let them in. But we don't know what their password is. So this is a terrible example that actually goes against what Burr and Feinstein are saying. Those hashed and encrypted stores of information would be illegal under this bill!
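For anyone curious what that looks like in practice, here's a minimal sketch using Python's standard library (the password, salt size, and 100,000-iteration count are illustrative choices, not any particular company's actual configuration). The server stores only a random salt and a slow, one-way hash; there is simply no readable password to hand over.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store only a random salt and a one-way hash -- never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the hash from the login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

The hash can confirm a password without revealing it, which is exactly why this kind of storage cuts against, not for, the senators' argument.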
We are not asking companies to provide law enforcement with unfettered access to encrypted data. We aren’t even asking companies to tell the government how they gain access to this encrypted data. All we are doing is asking companies to find a way to keep their data secure while also cooperating with law enforcement in terrorism and criminal investigations.
Again, that last line is impossible. They're asking the impossible -- and in the process, making everyone less safe. The only way to provide such info to law enforcement is to no longer keep the data truly secure. And the big concern is not unfettered access for law enforcement, but rather whatever this backdoor means for those with malicious intent, who will be very, very, very focused on finding these vulnerabilities and exploiting them.
President Obama said earlier this year, “You cannot take an absolutist view on this.” We agree—and believe that strong data security and compliance with the justice system don’t have to be mutually exclusive.
Because you don't know what you're talking about.
American technology companies have done some amazing things that are the envy of the world. We think that finding a way to achieve both goals simultaneously is not beyond their capabilities.
So, in the end, despite basically every cryptography expert telling them this is impossible, Burr and Feinstein come back with "NERD HARDER, NERDS!"
Once the DOJ told the court in San Bernardino that it had succeeded in hacking into the iPhone of Syed Farook, the big question people asked was whether or not the FBI would then tell Apple about the vulnerability. After all, the administration set up the so-called "Vulnerabilities Equities Process" (VEP) with the idea of sharing most vulnerabilities it discovers with companies. The White House directly stated:
One thing is clear: This administration takes seriously its commitment to an open and interoperable, secure and reliable Internet, and in the majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest. This has been and continues to be the case.
This spring, we re-invigorated our efforts to implement existing policy with respect to disclosing vulnerabilities – so that everyone can have confidence in the integrity of the process we use to make these decisions. We rely on the Internet and connected systems for much of our daily lives. Our economy would not function without them. Our ability to project power abroad would be crippled if we could not depend on them. For these reasons, disclosing vulnerabilities usually makes sense. We need these systems to be secure as much as, if not more so, than everyone else.
Still, one could make a strong case that this vulnerability should be disclosed... even if almost no one expected it to be. Amusingly, just a few days ago, Apple revealed that the FBI used the VEP to disclose a vulnerability for the very first time, on April 14th, just as everyone was arguing about this. Of course, the flaw it revealed was not about hacking into the iPhone, and was actually about a flaw that Apple had discovered and fixed... nine months ago. But, again, if this is the very first time the FBI has disclosed something to Apple, it certainly suggests that the VEP process generally means nothing gets disclosed. In fact, the timing of this really suggests that someone in the DOJ recently flipped out and realized that there's now going to be scrutiny on the VEP, so they might as well disclose something. Thus, they found an old bug that had already been patched and "revealed" it.
“The F.B.I. purchased the method from an outside party so that we could unlock the San Bernardino device,” Amy S. Hess, executive assistant director for science and technology, said in a statement.
“We did not, however, purchase the rights to technical details about how the method functions, or the nature and extent of any vulnerability upon which the method may rely in order to operate. As a result, currently we do not have enough technical information about any vulnerability that would permit any meaningful review” by the White House examiners, she said.
Now, some are arguing that this suggests absolutely terrible bargaining on the side of the DOJ/FBI. But, another interpretation is that it's how the DOJ knew that it wouldn't have to reveal the flaw to Apple. Of course, this might also explain why the DOJ at one point appeared to claim that the hack in question only worked for Farook's phone. They later claimed that was a misstatement, and it really meant that it only applied to that iPhone configuration. But, if the FBI never actually got the details, then in some sense they'd be right that for the FBI the crack only worked for that one phone. And if they wanted to do it on another phone, they'd have to shell out another ~$1 million or so...
A Philadelphia man suspected of possessing child pornography has been in jail for seven months and counting after being found in contempt of a court order demanding that he decrypt two password-protected hard drives.
The suspect, a former Philadelphia Police Department sergeant, has not been charged with any child porn crimes. Instead, he remains indefinitely imprisoned in Philadelphia's Federal Detention Center for refusing to unlock two drives encrypted with Apple's FileVault software in a case that once again highlights the extent to which the authorities are going to crack encrypted devices. The man is to remain jailed "until such time that he fully complies" with the decryption order.
The Fifth Amendment should prevent the government from punishing a person for not testifying against themselves, which is what's being argued by the defendant's representation in its appeal to the Third Circuit. (Although it's actually indirect representation. The government's case is actually against Doe's devices ["United States of America v. Apple MacPro Computer, et al"] and his lawyer is hoping for a stay of the contempt order during the appeal process.)
Mr. Doe… has a strong likelihood of success on the second issue: whether compelling the target of a criminal investigation to recall and divulge an encryption passcode transgresses the Fifth Amendment privilege against self-incrimination. Supreme Court precedent already instructs that a suspect may not be compelled to disclose the sequence of numbers that will open a combination lock — clearly auguring the same rule for any compelled disclosure of the sequence of characters constituting an encryption passcode.
Doe's rep also argues that the All Writs order obtained by the government has no jurisdiction over Doe or his devices.
Mr. Doe’s first claim is that the district court lacked subject matter jurisdiction. The claim stems from the government’s apparently unprecedented use of an unusual procedural vehicle to attempt to compel a suspect to give evidence in advance of potential criminal charges. Specifically, the government took resort not to a grand jury, but to a magistrate judge pursuant to the All Writs Act, 28 U.S.C. § 1651. (Ex. F at 1).
It is black letter law that the All Writs Act never supplies “any federal subject-matter jurisdiction in its own right[.]” Syngenta Crop Protection, Inc. v. Henson, 537 U.S. 28, 31 (2002) (citation omitted). It is equally well-settled that the Act has no application where other provisions of law specifically address the subject matter concerned. Pennsylvania Bureau of Correction v. United States Marshals Service, 474 U.S. 34, 40-42 (1985). The compelled production of evidence in advance of criminal charges is specifically addressed by Rules 6 and 17 of the Federal Rules of Criminal Procedure, which authorize the issuance and enforcement of grand jury subpoenas; and by 28 U.S.C. § 1826(a), which specifies the authorized penalties for a witness who refuses without good cause to give the evidence demanded by the grand jury.
As it stands now, Doe is still being held in contempt of court for refusing to decrypt his devices for investigators. The district court that held him in contempt has refused direct appeal of that order, resulting in the labyrinthine legal strategy of using the government's case against Doe's devices as a vehicle for challenging the lower court's contempt order.
Doe has not been charged, yet he's in prison. Backing up the government's assertions for holding him in contempt are two dubious pieces of hearsay. One is from his estranged sister, who claims to have seen child porn on Doe's computer, but can't actually say whether it was located on the devices the government is seeking to have decrypted. The other is from some sort of law enforcement encryption whisperer, who can apparently see things in the scrambled bits.
The government’s second witness was Detective Christopher Tankelewicz, a forensic examiner with the Delaware County District Attorney’s Office. He testified only that it was his “best guess” child pornography would be found on the hard drives. (Ex. J at 346). According to Tankelewicz’s understanding of the Freenet online network (in which he admits having no training), there were signs on an Apple Mac Pro computer seized with the hard drives of a user accessing or trying to access message boards with names suggestive of child pornography. (Ex. J at 306, 311-312, 339-340). In rather ambiguous testimony, Tankelewicz did not appear to say this meant any image traded over these boards was on the hard drives. (See Ex. J at 303-317, 336-340, 345-350). Instead, he identified a single image he believed there to be a “possibility” was on the drives. (Ex. J at 308-309). As he described it, the image was of “a four or five-year-old girl with her dress lifted up, but the image itself was small so you really couldn’t see what was going on with the image.” (Ex. J at 308).
No one wants to see a sex offender walk away from charges, but at this point, Doe hasn't even been officially charged with anything more than contempt. The problem with that charge is it has no end date. He can either stay in jail or comply with the order, even when the order conjures jurisdiction out of nowhere and violates his Fifth Amendment rights. If the government doesn't have enough evidence to pursue a case against Doe, it should cut him loose until it does.
That's what makes the Wall Street Journal's recent editorial, The Encryption Farce (possibly behind a paywall, though the version I just opened showed up fine), so remarkable. It completely bashes the FBI over its attempts to force Apple to build a backdoor into its encryption -- though mainly because of the ridiculous fact that in the two most high-profile cases, the DOJ magically got into the phones just as the cases got serious. The WSJ editorial doesn't pull any punches, asking what the hell is going on at the Justice Department:
If history repeats itself first as tragedy and then as farce, what does the FBI have in store next for its encryption war with Apple? After withdrawing its demands in San Bernardino and then reopening hostilities with a drug prosecution in Brooklyn, the G-men abruptly dumped the second case over the weekend too. Is anyone in charge at the Justice Department, or are junior prosecutors running the joint?
The editorial goes on to mock the FBI's claim that these cases are all about getting into just that phone, and notes that constantly finding ways in at the last minute are destroying the FBI's credibility.
This second immaculate conception in as many months further undermines the FBI’s credibility about its technological capabilities. Judges ought to exercise far more scrutiny in future decryption cases even as Mr. Comey continues to pose as helpless.
It goes on to suggest that the FBI stop bringing these cases, and that the President and the DOJ should put an end to this ridiculous attack on encryption:
Yet forgive us if this “conversation” now seems more like a Jim Comey monologue. The debate might start to be productive if the FBI Director would stop trying to use the courts as an ad hoc policy tool and promised not to bring any more cases like the one in Brooklyn.
Meanwhile, the White House has taken the profile-in-courage stand of refusing to endorse or oppose any encryption bill that Congress may propose. If the Obama team won’t start adjusting to the technological realities of strong and legal encryption, they could at least exercise some adult supervision at Main Justice.
On its own, such an editorial might not seem like a huge deal, but coming from the Wall Street Journal -- a source that has previously championed much greater surveillance and even supported backdoors -- it's a surprising shift. And it shows just how badly the DOJ and FBI miscalculated in their attempts to use the courts to get their desired results in breaking encryption.
As part of our funding campaign for our coverage of encryption, we reached out to some companies that care about these issues to ask them to show their support. This post is sponsored by Golden Frog, a company dedicated to online privacy, security and freedom.
James Clapper, Director of National Intelligence, is claiming that the Snowden revelations sped up the adoption of encryption by seven years. Apparently, that's based on NSA estimates of the adoption curve of encryption. As reported by Jenna McLaughlin at The Intercept:
“As a result of the Snowden revelations, the onset of commercial encryption has accelerated by seven years,” James Clapper said during a breakfast for journalists hosted by the Christian Science Monitor.
The shortened timeline has had “a profound effect on our ability to collect, particularly against terrorists,” he said.
When pressed by The Intercept to explain his figure, Clapper said it came from the National Security Agency. “The projected growth maturation and installation of commercially available encryption — what they had forecasted for seven years ahead, three years ago, was accelerated to now, because of the revelation of the leaks.”
Of course, it's worth noting that, in the past few months, it seemed as if the NSA and the intelligence community were moving away from their kneejerk hatred of encryption, pushing back against the FBI's argument that we need to backdoor encryption. But, apparently, they're not willing to go quite this far. Basically, the NSA wants strong encryption to exist, but it doesn't really want you to use it.
Asked if that was a good thing, leading to better protection for American consumers from the arms race of hackers constantly trying to penetrate software worldwide, Clapper answered no.
“From our standpoint, it’s not … it’s not a good thing,” he said.
Yup. James Clapper would prefer that the American public be less safe by not using encryption, rather than protecting their digital lives.
Of course, many other people do think it's a very, very good thing. Including Ed Snowden:
So, the guy in the US government is upset that the public is more safe, and the guy that people want to accuse of being a traitor is proud of helping Americans to better protect themselves. Maybe we ought to reverse their roles...
While so much of the attention had been focused on the case in San Bernardino, where the DOJ was looking to get into Syed Farook's iPhone, we've pointed out that perhaps the more interesting case was the parallel one in NY (which actually started last October), where the magistrate judge James Orenstein rejected the DOJ's use of the All Writs Act to try to force Apple to help unlock the iPhone of Jun Feng, a guy who had already pled guilty on drug charges, but who insisted he did not recall his passcode.
There were some oddities in the case. Feng had pled guilty, and there was some question as to whether there was still a need to get into the iPhone at all. The DOJ insisted yes, because Feng's iPhone might provide evidence needed to find others involved in the drug ring. The other oddity: Feng's iPhone was running iOS 7. While the device itself was a newer model than the one in the Farook case, it was running an older operating system, one that Apple (and others) were known to be able to get into easily. So it made no sense that the FBI couldn't get into this phone. In fact, Apple's latest filing in the case, just over a week ago, was basically along those lines, noting that the DOJ claimed Apple's assistance was "necessary," but that this seemed unlikely.
And... late on Friday, the DOJ did the exact same "run away!" move it did in the Farook case, telling the judge that it had suddenly been given the passcode, so there was no need to move forward with the case at all.
The government respectfully submits this letter to update the Court and the parties. Yesterday evening, an individual provided the passcode to the iPhone at issue in this case. Late last night, the government used that passcode by hand and gained access to the iPhone. Accordingly, the government no longer needs Apple’s assistance to unlock the iPhone, and withdraws its application.
According to a (paywalled) WSJ article, Feng, who has been awaiting sentencing and thought his case was otherwise over, only just found out that there was this big fuss around his own case... and told the DOJ he miraculously remembered the passcode. Hallelujah. A miracle... and the DOJ was magically saved from a precedent it didn't want.
The Wall Street Journal reported last week that Mr. Feng only recently learned his phone had become an issue in a high-stakes legal fight between prosecutors and Apple. Mr. Feng, who has pleaded guilty and is due to be sentenced in the coming weeks, is the one who provided the passcode to investigators, according to people familiar with the matter.
It's worth noting, however, that while this particular case may be effectively over, it's not that great for the DOJ, in that no one got to officially review magistrate judge James Orenstein's fairly epic smackdown of the DOJ earlier in the case. That ruling has no value as precedent, but that doesn't mean it won't be quoted or pointed to in other, similar cases.
On the flip side, of course, there's the argument that every time the case starts looking bad for the DOJ, they miraculously get into the phone in question. At the very least, this ought to raise questions about why the DOJ keeps insisting that it needs Apple's help... But the fact is these cases are going to keep coming.
It appears that fully encrypting messaging and content is really catching on. Following WhatsApp's big move to roll out end-to-end encryption, the super popular communications app Viber has announced it intends to do the same for its 700 million (and growing) users. It's already testing encryption in a few markets before rolling it out globally. The company says the system will also use color coding to show you whether your content is encrypted.
Unfortunately, Viber is not entirely clear about which encryption tools it's using. WhatsApp was upfront in saying that it was using the popular and well-tested open source encryption from Open Whisper Systems. Viber doesn't say what it's using, leading some to speculate that the company tried to roll its own (generally not a good idea, and one that likely means there are serious security flaws). The company says it's doing "open source plus," but has not yet named which open source tools it's pulling from:
“We built [our end-to-end encryption] based on the concept of an established open-source solution with an extra level of security developed in-house,” a Viber spokesperson says, refusing to be more specific.
There are some who will argue that an opaque/unknown encryption system can, in some ways, be worse than no encryption at all, in that users may think their communications are private when they really are not. So the lack of an open, audited encryption solution is definitely a concern here.
However, what's encouraging is that we're seeing more and more apps embracing end-to-end encryption for communications, as well as strong disk encryption for data at rest. This is something that cryptographers and security experts pushed for for years without much actual support or adoption. However, it's finally starting to become a necessary piece of the puzzle for communications service providers, and that's a good thing.
from the we-can't-have-people-bad-mouthing-the-government-and-getting-away-with-it dept
Here comes the inevitable government backlash against WhatsApp rolling out end-to-end encryption for one billion users worldwide: if governments can no longer demand access to communications, the next best thing is to demand access to WhatsApp users.
According to India resident Prasanto K. Roy, local governments are demanding that administrators of WhatsApp groups (the latest beneficiaries of the encryption rollout) register with the local magistrate, and will apparently hold them accountable for any "irresponsible remarks" or "untoward actions" by members of the group.
The government's unsubtle man-in-the-middle approach to accessing WhatsApp communications also involves placing a literal government man in the middle, according to the Times of India.
The spokesperson also said that a government representative might also have to be added to the WhatsApp group as an admin. "If any government admin is present in a WhatsApp group, it will immediately prevent any sort of rumour-mongering," he said.
Whenever a government agency develops an overweening urge to curb "rumor-mongering," one can be sure that particular government is fucking something up somewhere. And, indeed, that is the case here.
The government had imposed a blackout on mobile internet in the troubled area after clashes between security forces and protestors claimed the lives of five people. The area had seen protests after the alleged molestation of a teenager by security personnel. The mobile internet blackout had been aimed at curbing the spread of potentially inflammatory messages that could spark further tension in the area.
It would seem to me the tension was created by the alleged molestation, the government's lack of interest in investigating/punishing the wrongdoer and the killing of five people. The government appears to be more interested in saving itself from its constituency, so the obvious move is to shut down any communication platform that it can't monitor or control. It can't kill WhatsApp, so it's demanding to be inserted into these conversations -- either directly or by lurking just offscreen whispering legal threats.
Not only that, but the quelling of dissent extends to the government itself. The flier also notes punishment awaits government employees who find the registration demand heavy-handed.
Govt. Employees serving in the district are directed to restrain from making any comments/remarks with regard to the policies and decisions of government on these WhatsApp groups running in the district and if anyone found involved in such activities, strict action will be initiated against them as required under rules.
Looking beyond this local dispute that has managed to drag in the world's most popular messaging service, one can see why it is essential that citizens have communication platforms that keep the government locked out. Encryption doesn't just "protect" criminals from law enforcement and innocent people from criminals. It also protects the innocent from their governments' self-serving overreach.
When you testify before Congress, it helps to actually have some knowledge of what you're talking about. On Tuesday, the House Energy & Commerce Committee held the latest congressional hearing on the whole silly encryption fight, entitled Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives. And, indeed, they did have witnesses presenting "industry" and "law enforcement" views, but for unclear reasons decided to separate them. First up were three "law enforcement" panelists, who were free to say whatever the hell they wanted with no one pointing out that they were spewing pure bullshit. You can watch the whole thing below (while it says it's 4 hours, it doesn't actually start until about 45 minutes in):
Lots of craziness was stated -- starting with the idea pushed by both the NYPD's chief of intelligence, Thomas Galati, and the commander of the Indiana State Police's office of intelligence, Charles Cohen -- that the way to deal with non-US or open source encryption was simply to ban it from app stores. This is a real suggestion that was just made before Congress by two (?!?) separate law enforcement officials. Rep. Morgan Griffith rightly pointed out that so many encryption products couldn't possibly be regulated by US law, and asked the panelists what to do about it. You can watch the exchange here:
You see Cohen ridiculously claim that, since Apple and Google are gatekeepers to apps, the government could just ban foreign encryption apps from the app stores:
Right now Google and Apple act as the gatekeepers for most of those encrypted apps, meaning if the app is not available on the App Store for an iOS device, if the app is not available on Google Play for an Android device, a customer of the United States cannot install it. So while some of the encrypted apps, like Telegram, are based outside the United States, US companies act as gatekeepers as to whether those apps are accessible here in the United States to be used.
This is just wrong. It's ignorant and clueless and for a law enforcement official -- let alone one who is apparently the "commander of the office of intelligence" -- to not know that this is wrong is just astounding. Yes, on Apple phones it's more difficult to get apps onto a phone, but it's not impossible. On Android, however, it's easy. There are tons of alternative app stores, and part of the promise of the Android ecosystem is that you're not locked into Google's own app store. And, really, is Cohen literally saying that Apple and Google should be told they cannot allow Telegram -- one of the most popular apps in the world -- in their app stores? Really?
Galati then agreed with him and piled on with more ignorance:
I agree with what the Captain said. Certain apps are not available on all devices. So if the companies that are outside the United States can't comply with same rules and regulations of the ones that are in the United States, then they shouldn't be available on the app stores. For example, you can't get every app on a Blackberry that you can on an Android or a Google.
Leaving aside the fact that he said "Android or a Google" (and just assuming he meant iPhone for one of those)... what?!? The reason you can't get every app on a BlackBerry that's available on other devices has nothing to do with any of this at all. It's because the market for BlackBerry devices is tiny, so developers don't build for the BlackBerry ecosystem (and, of course, some BlackBerries now run Android anyway, so...). Galati's comment makes no sense: the fact that fewer developers target BlackBerry says nothing about blocking foreign encryption apps from the Android or iOS ecosystems.
Why are these people testifying before Congress when they don't appear to know what they're talking about?
Later in the hearing, when questioned by Rep. Paul Tonko about how other countries (especially authoritarian regimes) might view a US law demanding backdoors as an opportunity to demand the same levels of access, Cohen speculated ridiculously, wildly and falsely that he'd heard that Apple gave China its source code:
Here's what Cohen says:
In preparing for the testimony, I saw several news stories that said that Apple provided the source code for iOS to China, as an example. I don't know whether those stories are true or not.
Yeah, because they're not. He then goes on to say that Apple has never said under oath whether or not that's true -- except, just a little while later, on the second panel, Apple's General Counsel Bruce Sewell made it quite clear that they have never given China its source code. Either way, Cohen follows it up by saying that Apple won't give US law enforcement its source code, as if to imply that Apple is somehow more willing to help the Chinese government hack into phones than the US government. Again, this is just blatant false propaganda. And yet here is someone testifying before Congress and claiming that it might be true.
Thankfully, at the end of the hearing, Rep. Anna Eshoo -- who isn't even a member of the subcommittee holding the hearing (though she is a top member of the larger committee) -- joined in and quizzed Cohen about his bizarre claims:
She notes that it's a huge allegation to make without any factual evidence, and asks if he has anything to go on beyond just general "news reports." Not surprisingly, he does not.
Elsewhere in the hearing, Cohen also insists that a dual key solution would work. He says this with 100% confidence -- that if Apple and law enforcement each held a key, it would be "just like a safety deposit box." Of course, this is also just wrong. As has been shown for decades, when you set up a dual key system, you introduce vulnerabilities that almost certainly let in others as well.
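To see why the "safety deposit box" analogy falls apart, here's a minimal toy sketch (not a real cipher, and not any vendor's actual design -- every name and construction below is hypothetical) showing how an escrowed "dual key" scheme concentrates risk: every user's message is also encrypted under one shared escrow key, so a single leak of that key unlocks everyone's traffic at once.

```python
# Toy sketch of a key-escrow ("dual key") design. Illustrative only:
# the XOR/hash stream cipher here is NOT secure and is just a stand-in.
import hashlib
import secrets


def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Derive a pseudorandom keystream from key + nonce (toy construction).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]


def encrypt(key: bytes, plaintext: bytes) -> tuple:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct


def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))


# The single escrow key shared between the vendor and law enforcement.
ESCROW_KEY = secrets.token_bytes(32)


def send_message(user_key: bytes, msg: bytes) -> tuple:
    # Under escrow, each message is encrypted twice: once for the
    # recipient, and once under the one shared escrow key.
    return encrypt(user_key, msg), encrypt(ESCROW_KEY, msg)


alice_key = secrets.token_bytes(32)
bob_key = secrets.token_bytes(32)

(_, _), (n1, c1) = send_message(alice_key, b"alice's message")
(_, _), (n2, c2) = send_message(bob_key, b"bob's message")

# One compromised escrow key recovers *every* user's traffic --
# a single point of failure that plain end-to-end encryption lacks.
assert decrypt(ESCROW_KEY, n1, c1) == b"alice's message"
assert decrypt(ESCROW_KEY, n2, c2) == b"bob's message"
```

The design flaw isn't in any one line of code: it's structural. A safety deposit box protects one box per key, but an escrow key is one key for every box, which makes it an irresistible target for theft, insider abuse, or compelled disclosure.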
And then, after that, Rep. Jerry McNerney raises the point -- highlighted by many others in the past -- that rather than "going dark," law enforcement is in a golden age of surveillance and investigation, thanks to a wealth of new information, including what mobile phones provide (such as location data, metadata on contacts and more). Cohen, somewhat astoundingly, claims he can't think of any new information that's now available thanks to mobile phones:
Sir, I'm having problems thinking of an example of information that's available now that was not before. From my perspective, thinking through investigations that we previously had information for, when you combine the encryption issue along with shorter and shorter retention periods, in a service provider, meaning they're keeping their records, for both data and metadata, for a shorter period of time, available to legal process. I'm having difficulty finding an example of an avenue that was not available before.
Huh?!? He can't think of things like location info from mobile phones? He can't think of things like metadata and data around unencrypted texts? He can't think of things like unencrypted and available information from apps? Then why is he on this panel? And the issue of data retention? Was he just told before the hearing to make a point to push for mandatory data retention and decided to throw in a nod to it here?
At least Galati, who spoke after him, was willing to admit that tech has provided a lot more information than in the past -- but then claimed that encryption was "eliminating those gains."
Cohen is really the clown at the show here. He also claims that Apple somehow decided to throw away its key and that it was "solving a problem that doesn't exist" in adding encryption:
There he's being asked by Rep. Yvette Clarke if he sees any technical solutions to the encryption issue, and he says:
The solution that we had in place previously, in which Apple did hold a key. And as Chief Galati mentioned, that was never compromised. So they could comply with a proper service of legal process. Essentially, what happened is that Apple solved a problem that does not exist.
Again, this is astoundingly ignorant. The problem before was that there was no key. It wasn't that Apple had the key; it was that the data was readily available to anyone who had access to the phone. That put everyone's information at risk. It's why there was so much concern about stolen phones and why stolen phones were so valuable. For a law enforcement official to not realize that and not think it was a real problem is... astounding. And, again, it raises the question of why this guy is testifying before Congress.
It also raises the question of why Congress put him on a panel with no experts around to correct his many, many errors. At the very least, towards the beginning of the second panel, Apple GC Sewell explained how Cohen was just flat out wrong on these points:
If you can't see that, after his prepared remarks, Sewell directly addresses Cohen's claims:
That's where I was going to conclude my comments. But I think I owe it to this committee to add one additional thought. And I want to be very clear on this: We have not provided source code to the Chinese government. We did not have a key 19 months ago that we threw away. We have not announced that we are going to apply passcode encryption to the next generation iCloud. I just want to be very clear on that because we heard three allegations. Those allegations have no merit.
A few minutes later, he's asked directly about this and whether or not the Chinese had asked for the source code, and Sewell says that, yes, the Chinese have asked, and Apple has refused to give it to them:
Seems like Congress could have spared itself three hours of ignorant arguments if it hadn't separated the panels, leaving no experts around to challenge such claims as they were made.