Should Wikipedia Force All Users To Use HTTPS?
from the politics-of-encryption dept
It would be something of an understatement to say that encryption is a hot topic at the moment. But leaving aside deeper issues like the extent to which the Internet’s cryptographic systems are compromised, there is a more general question about whether Web sites should be pushing users to connect using HTTPS in the hope that this might improve their security. That might seem a no-brainer, but for the Wikimedia Foundation (WMF), the organization that runs Wikipedia and related projects, it’s a more complex issue.
The problem is that HTTPS access is disabled in some countries precisely to prevent users from being able to access sites securely. So when Wikimedia introduced its HTTPS-by-default policy on August 28, it made a couple of exceptions:
Some users live in areas where HTTPS is not an easy option, most times because of explicit blocking by a government. At the request of these communities, we have made an explicit exclusion for users from those affected countries. Simply put, users from China and Iran will not be required to use HTTPS for logging in, nor for viewing any Wikimedia project site.
An interesting post by Erik Möller, the WMF’s Deputy Director, raises the question of whether such exceptions help or hinder freedom in those countries:
In the long term, the Wikimedia movement is faced with a choice, which is inherently political: Should we indefinitely sustain security exceptions for regions that prevent the use of encryption, or should we shift to an alternative strategy?
Here are some of the considerations he mentions:
If we accommodate [China]’s or Iran’s censorship practices, we are complicit in their attempts to monitor and control their citizenry. If a privileged user’s credentials (e.g. Checkuser) are misused by the government through monitoring of unencrypted traffic, for example, this is an action that would not have been possible without our exemption. This could potentially expose even users not in the affected country to risks.
Möller goes on to suggest the following:
It could be argued that it’s time to draw a line in the sand — if you’re prohibiting the use of encryption, you’re effectively not part of the web. You’re subverting basic web technologies.
Drawing this hard line clearly has negative near term effects on the citizenry of affected countries. But the more the rest of the world comes together in saying “What you are doing is wrong. Stop it.” — the harder it will be for outlier countries to continue doing it.
That may have been a defensible position last week, when he wrote those words, but it certainly isn’t now. Snowden’s information about efforts by the NSA and GCHQ to undermine every form of online encryption shows how they are “subverting basic web technologies” in a profound way; it is therefore no longer possible for the West to wag a finger at countries like China that are doing the same. However, Möller also points out:
There _are_ effective tools that can be used to circumvent attempts to censor and control the Internet. Perhaps it is time for WMF to ally with the organizations that develop and promote such tools, rather than looking for ways to guarantee basic site operation in hostile environments even at the expense of user privacy.
This is surely a better strategy. It would allow Wikimedia users in countries where HTTPS is blocked to access Wikipedia and related projects in a discreet way. Such circumvention tools could also be useful for many other sites that face the same problem, and so would be a force for combating censorship and control in general. Finally, if the privacy situation in the West continues to deteriorate, the software might even come in handy there, too.
Follow me @glynmoody on Twitter or identi.ca, and on Google+
Filed Under: challenges, filtering, https, security, wikipedia
Companies: wikimedia foundation, wikipedia
Comments on “Should Wikipedia Force All Users To Use HTTPS?”
Finally, if the privacy situation in the West continues to deteriorate, the software might even come in handy there, too.
This is quite true. And incredibly depressing.
Indeed, but the depressing fact is the west is acting more like a dictatorship each year, except having elections.
Though even there, there are conspiracy theories with some basis in reality that some elections are fraudulent too (see the Diebold voting-machine conspiracies that were especially common in 2004; the company doesn’t help itself by threatening to sue anyone who wants to examine its code to verify that its voting machines are 100% accurate, citing trade secrets).
Re: Re: Re:
4- or 5-yearly pacification pageants? That’s a difference?
I may be cynical, but https isn’t secure anymore.
Google may have confidence that a string of PlayStation 3 consoles won’t break https, but there’s nothing saying a floor full of IBM supercomputers can’t.
Encrypting everything, even where the cipher is breakable, causes a problem for governments. They probably cannot afford the computers and power needed to break everything, and so have to target what they try to break. Keeping encrypted data for later decryption is a losing game, as the backlog just keeps growing. Faster computers and networks just mean more encrypted data.
While this does not prevent targeted attacks against people, or attempts to decrypt stored data once someone becomes a target, it does prevent the general spying on everybody. Therefore the more data is encrypted, and the stronger the encryption, the more spy agencies are forced to focus their efforts on identified suspects. Even breakable encryption, if widely used, restores a measure of privacy to most communications, except where the spy agencies can get a decrypted feed from the service providers.
Re: Re: Re:
This right here. We have a chance to effectively put them back to the time when collecting this information at this scale was impossible. Suddenly collecting it all won’t do them any good, as they can’t read it all.
Plus, once we get in the habit of using encryption for everything, switching out current encryption for stronger types should be easier, because it’s no longer an afterthought only some people worry about.
OpenSSL and IPsec are both broken. Neither standard has been audited, from what I understand.
Using HTTPS in that context gives no security.
Have they thought about GnuTLS with TLS 1.2 / DTLS 1.2?
GnuTLS is an LGPLv3+ open-source project, heavily audited and very portable.
We have an interesting path dependence issue here.
Suppose for a moment Wikipedia goes all the way, and forces HTTPS for everyone, even anonymous readers, like github does.
Suppose then that, after a few years, a third country (not China, Iran, or the USA) decides to also block HTTPS. There would be an incredible backlash, since Wikipedia is very popular. They would not be able to convince Wikipedia to disable HTTPS for them (the only country which might be able to do that would be the USA, since it is where the Wikimedia servers are).
That is, once encryption is the default, it is sticky.
The current exception with China and Iran is the same, in the opposite direction – once encryption is banned, it is hard to enable it by default, since either Wikipedia convinces China and Iran to unblock it, or it cannot be accessed from there.
Which is why forcing encryption by default now, except for these two countries, is a great strategy – once it is done, it is hard for a third country to decide to ban it. And it goes beyond that: once the government of one of these two countries decides to allow encryption again, and Wikipedia disables the exception for that country, the country cannot go back.
This applies not only to Wikipedia, but to the web as a whole: the more sites force encryption (permanent redirect from http: to https: plus strict transport security), the harder it is to ban encryption.
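The redirect-plus-HSTS pattern described above can be sketched as a minimal handler. This is a hypothetical illustration of the mechanism, not Wikimedia’s actual configuration:

```python
# Sketch of the "force encryption" pattern: a permanent redirect from
# http: to https:, plus a Strict-Transport-Security header telling the
# browser to refuse plain HTTP for the next year. Hypothetical handler,
# not Wikimedia's real setup.

HSTS = ("Strict-Transport-Security", "max-age=31536000; includeSubDomains")

def handle(scheme, host, path):
    """Return (status, headers) for an incoming request."""
    if scheme == "http":
        # 301 is a permanent redirect; browsers cache it.
        return 301, [("Location", f"https://{host}{path}")]
    # Already on HTTPS: pin it with HSTS.
    return 200, [HSTS]

status, headers = handle("http", "en.wikipedia.org", "/wiki/HTTPS")
# redirects to https://en.wikipedia.org/wiki/HTTPS
```

Once a browser has seen the HSTS header, it rewrites future http: requests to https: itself, which is what makes the ban-resistant "stickiness" possible.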
The current NSA revelations are not an issue for this. TLS has what is called cryptographic agility: the algorithms used are negotiated between the endpoints, and can be swapped later for more secure ones. Even if every algorithm used by TLS has been broken, new ones can be introduced, and will be used if both endpoints allow them.
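The negotiation behind cryptographic agility can be shown with a toy model. The suite names below are real TLS identifiers, but the logic is a deliberate simplification of a TLS handshake:

```python
# Toy model of TLS cipher-suite negotiation: the server picks its most
# preferred suite that the client also offers. Swapping in stronger
# suites later only means updating these lists -- the "cryptographic
# agility" described above. Simplified illustration, not a real TLS
# implementation.

def negotiate(server_prefs, client_offers):
    """Return the first server-preferred suite the client supports."""
    for suite in server_prefs:
        if suite in client_offers:
            return suite
    return None  # no common suite: the handshake fails

server = ["ECDHE-RSA-AES128-GCM-SHA256", "AES128-SHA", "RC4-SHA"]
client = ["AES128-SHA", "RC4-SHA"]
print(negotiate(server, client))  # AES128-SHA

# If RC4 is later considered broken, the client simply stops offering it
# and adds a stronger suite; nothing else about the protocol changes:
client_updated = ["ECDHE-RSA-AES128-GCM-SHA256", "AES128-SHA"]
print(negotiate(server, client_updated))  # ECDHE-RSA-AES128-GCM-SHA256
```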
Tor users are currently banned from editing most Wikimedia projects (and have been for the last 5 years or so), even if logged in over HTTPS. So the first step in promoting circumvention tools would be to remove this ban.
Re: Tor ban
No. Tor is (badly) broken, as recent research papers and informal reports have shown. Enabling Tor for Wikimedia edits will have two effects:
1. It will reveal to third parties who’s editing Wikimedia pages.
2. It will enable copious abuse of Wikimedia.
There is, unfortunately, a persistent mythos that Tor is magic and that it works perfectly and that it’s opaque and so on: nothing could be further from the truth. The best outcome would be to take some “lessons learned” from it, throw it away, and start over.
Re: Re: Tor ban
What third parties? They’d have to break SSL (or Tor); maybe the NSA could do it, but it’s harder than just watching non-Tor SSL traffic. Without Tor even ISPs can tell who’s editing.
Even if logins are required?
If Wikimedia wants to promote circumvention tools, which tools should they promote? If such tools are just banned it’s not a very strong form of promotion.
Re: Re: Re: Tor ban
What third parties?
I suggest doing a web search for “tor nsa” and reading what you find.
Then consider that the NSA is hardly the only intelligence organization on this planet with an interest in doing so.
Consider further that 60% of Tor development is paid for by the USG, which has now been definitively shown to be deliberately introducing weaknesses/backdoors into code. Do you REALLY think that the USG would be giving money to Tor if they couldn’t break it?
Of course the problem here is that once entity A can crack Tor, entities B through Z will be aware that it’s a solvable problem and may even pick up some hints on how to replicate the work.
Even if logins are required?
Yes. Nothing stops an abuser from creating a large number of logins (either manually or via automation) — any measure that would attempt to do so would also block the legitimate creation of a single login by a non-abuser.
[…] which tools should they promote?
Usenet. With encryption, of course. It’s incredibly hard to block WHEN DEPLOYED PROPERLY and because it’s been around for a very long time, rather sophisticated anti-abuse mechanisms have been developed for it. Yes, it’s slower: that’s a feature, not a bug. Yes, it takes some clue: that’s also a feature, not a bug. But given that it can be propagated over network links, over dialup, via USB stick, via CD/DVD, via tape, via just about anything…it’s enormously resistant to disruption.
Re: Re: Re:2 Tor ban
Are you serious?
I am puzzled: how would one edit Wikimedia using Usenet?
Why would anybody want to use Usenet, which is old, weak and totally open to abuse and spying?
Re: Re: Tor ban
If Tor is badly broken, then why did the FBI have to use malware on Tor users in order to get their IP addresses? And note that the vulnerability used by the malware lay in the browser, not Tor itself.
Re: Tor ban
Anonymous proxies have been blocked on Wikipedia for a long time due to a history of abuse.
However, there is a way to bypass the block on anonymous proxies and tor exit nodes: the ipblock-exempt permission. There seem to also exist special “closed proxies” which are not blocked and which can be used to edit Wikipedia. See https://en.wikipedia.org/wiki/Wikipedia:IP_block_exemption and https://en.wikipedia.org/wiki/Wikipedia:Advice_to_users_using_Tor for more information.
Re: Re: Tor ban
Only “An editor who has genuine and exceptional need” is supposed to use that; at best, this indicates Wikimedia is (barely) tolerating circumvention tools, not promoting them. Users requesting an unblock need to provide an email address, which they may be unwilling to do if they actually think a government is monitoring them.
Will Wikipedia also force users to use TLS 1.2, since older HTTPS encryption protocols have known vulnerabilities? Or are they only trying to pretend to make their site more secure?
Note that the current stable release of Firefox does not support TLS 1.2.
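Requiring TLS 1.2 as a floor is a one-line server-side policy today. As an illustration only, here is how a modern Python `ssl` module (3.7+) expresses it; the 2013-era stack discussed above predates this API:

```python
# Illustration of enforcing a TLS 1.2 minimum with Python's ssl module
# (requires Python 3.7+ for the TLSVersion API). Older protocol
# versions (SSLv3, TLS 1.0/1.1) are then refused during the handshake.
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older
```

The catch the comment raises is real: a minimum-version policy only works once mainstream browsers actually speak TLS 1.2, which at the time of writing the stable Firefox did not.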
HTTPS Everywhere solves these issues: if HTTPS is available it is used; otherwise it falls back to HTTP. It’s free.
Search addon: HTTPS Everywhere
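The fallback behavior the comment describes can be sketched as follows. Note that the real HTTPS Everywhere extension uses curated per-site rewrite rulesets rather than blind fallback; this is a simplified toy, and `https_reachable` is a stand-in for an actual connection attempt:

```python
# Simplified sketch of "prefer HTTPS, fall back to HTTP". The real
# HTTPS Everywhere extension uses curated per-site rulesets; this toy
# version just tries the https:// form of the URL first.
# `https_reachable` is a hypothetical callback standing in for a real
# connection attempt.

def prefer_https(url, https_reachable):
    """Rewrite an http:// URL to https:// when the secure form works."""
    if not url.startswith("http://"):
        return url  # already https (or another scheme): leave it alone
    secure = "https://" + url[len("http://"):]
    return secure if https_reachable(secure) else url

# Example: pretend only wikipedia.org offers HTTPS.
reachable = lambda u: "wikipedia.org" in u
print(prefer_https("http://en.wikipedia.org/wiki/TLS", reachable))
# https://en.wikipedia.org/wiki/TLS
print(prefer_https("http://example.com/page", reachable))
# http://example.com/page
```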