James Clapper Says Nerd Magic Can Solve Terrorist Content Filtering, Create Safe Encryption Backdoors
from the harry-potter-but-for-lawful-access dept
Former Director of National Intelligence James Clapper went from having a comfortable, shadowy job in a comfortable, shadowy office to being the face of the American surveillance state after the Snowden leaks. Instead of only being periodically hassled by a couple of Intelligence Committee members (mainly Ron Wyden), Clapper was called to account for the NSA’s apparent surveillance sins. And he handled it badly.
After plenty of evasive discussion, Clapper finally said, “Oh, you mean those phone records,” and ushered in a new era of slightly less bulk metadata collection. But he still made the most of his speaking opportunities to pin the woes of the terrorized world on Snowden, claiming his leaks “sped up encryption adoption by seven years.” It was an oddly precise estimate, especially given the contradictory evidence showing terrorists hadn’t really changed their communication methods in response to the Snowden leaks.
Clapper is no longer the Intelligence Director, but he’s still beating the encryption drum during interviews. And it appears he’s aligned himself with another former government employee, James Comey. Speaking to the National Press Club in Australia, Clapper called for both harder nerding and tech companies being a (possibly compelled) source of light in the growing darkness.
As governments around the world face the ongoing threat of extremism, former US Director of National Intelligence James Clapper says tech companies have a social “responsibility” to take better care of what appears on their platforms.
“I do think there is a role to play here in some screening and filtering of what appears in social media,” he said.
“In the same way that these companies very directly capitalise on the information that we make available to them and exploit it, it seems that that same ingenuity could be applied in a sensitive way to filtering out or at least identifying some of the more egregious material that appears on social media.”
How social media companies are supposed to auto-filter all terrorist content is, of course, left unexplained. When companies like Facebook can’t even filter human breasts without screwing it up, it’s a stretch to say the problem of terrorist content and communications is just a coding breakthrough away. Considering the vast amount of content posted every day on major networks, it’s not as simple as applying a bit more mental elbow grease. Much of this work is relegated to algorithms, simply because there’s not enough manpower in the world to handle the input of billions of social media users.
Clapper also called for tech companies to “work with” law enforcement to provide access to encrypted communications.
Clapper suggested that cooperation could mean “law enforcement particularly would be allowed access to encryption” if it could be done in a “safeguarded way.”
“I hear the argument about if you share once with one person and it’s forever compromised. I’m not sure I really buy into that,” he said.
It really doesn’t matter whether Clapper “buys” this or not. It’s a fact. And it’s a fact that’s been demonstrated in explicit detail by the leak of NSA software exploits. A hole is a hole — one that can be used for good or for evil. If the world’s top intelligence agency can’t even keep its exploits secure, how are we expected to believe law enforcement agencies will keep these backdoors from being discovered and exploited?
Clapper grooved on the Comey vibe during this talk, acting as though tech expertise is some sort of dark art used deliberately to stick it to The Man. Anyone who uses the phrase “miraculous technological things like iPhones” shouldn’t expect to have their assertions taken seriously. It suggests the person making them still has trouble distinguishing between innovation and magic. Consequently, it’s this sort of person who continually claims “safe” backdoors are possible, rather than recognizing them for the mythical deus ex machina they actually are.