Chrome Security Team Considers Marking All HTTP Pages As 'Non-Secure'

from the moving-towards-encryption dept

Back in August, we noted that Google had started adjusting its search algorithm to give a slight boost to sites that are encrypted. That is, all else equal, sites that use HTTPS will get a slight ranking boost. The company made it clear that the weight of this signal will increase over time, and this is a way of encouraging more websites to go to HTTPS by default (something that we’ve done, but very few other sites have done).

Now it appears that the Chrome Security Team is taking things even further: suggesting that all HTTP sites be marked as non-secure:

We, the Chrome Security Team, propose that user agents (UAs) gradually change their UX to display non-secure origins as affirmatively non-secure. We intend to devise and begin deploying a transition plan for Chrome in 2015.

The goal of this proposal is to more clearly display to users that HTTP provides no data security.

More specifically:

UA vendors who agree with this proposal should decide how best to phase in the UX changes given the needs of their users and their product design constraints. Generally, we suggest a phased approach to marking non-secure origins as non-secure. For example, a UA vendor might decide that in the medium term, they will represent non-secure origins in the same way that they represent Dubious origins. Then, in the long term, the vendor might decide to represent non-secure origins in the same way that they represent Bad origins.

This seems like it could have quite an impact in driving more sites to finally realize that they should start going to HTTPS by default. There’s really no excuse not to do so these days, and it’s good to see the Chrome Security Team make this push. The more encrypted traffic there is, the better.

Companies: google


Comments on “Chrome Security Team Considers Marking All HTTP Pages As 'Non-Secure'”

62 Comments
alanbleiweiss (profile) says:

I hate social engineering. If a site is informational only, there’s absolutely no reason to force it into an HTTPS status. To eventually make it APPEAR that an HTTP site is somehow BAD, or DANGEROUS is a horrific notion.

There are so many perfectly legitimate, valid, helpful and worthy sites that won’t go to HTTPS, for a plethora of reasons, that this effort will only hurt more than it helps.

Anonymous Coward says:

Re: Re:

It’s all about the metadata.

Even on informational sites, there’s a lot that can be learned by which pages you access on a specific domain. This data is transmitted in plaintext over the Internet for HTTP, but is encrypted for HTTPS.

I think Google’s on the right track here – the less data (and metadata) being sent via plaintext over the Internet, the better.
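To make that metadata point concrete, here is a minimal Python sketch (the hostname and path are made up) of what an on-path observer can read from a plain-HTTP request. Over HTTPS the request line is encrypted, and the observer typically sees only the hostname, via DNS or the TLS SNI field, not which page was fetched:

```python
# Sketch: the cleartext bytes an on-path observer sees for a plain-HTTP
# request. The exact page being read travels in the clear; over HTTPS
# only the hostname is typically visible (via DNS/SNI), not the path.

def http_request_bytes(host: str, path: str) -> bytes:
    """Build the cleartext request line an eavesdropper can read."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"\r\n"
    ).encode("ascii")

wire = http_request_bytes("example.com", "/sensitive-topic/article-42")
# The specific article being read is right there on the wire:
assert b"/sensitive-topic/article-42" in wire
```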

alanbleiweiss (profile) says:

Re: Re: Re:

From the original post:

Then, in the long term, the vendor might decide to represent non-secure origins in the same way that they represent Bad origins.

Most consumers could well make assumptions (correctly or incorrectly) from how it’s presented. “WARNING – THIS SITE IS UNSECURE” implies danger.

It wouldn’t be unrealistic to assume that many consumers interpret that to be “so dangerous that I should avoid it, without thinking through what this really means in this situation”.

As a User Experience professional, I have seen all too often that a majority of end users have a very difficult time applying critical thought to their online decision making process.

So the question here, then: does the good outweigh the bad? I say too many site owners who are incapable of adapting will suffer.

John Fenderson (profile) says:

Re: Re:

“To eventually make it APPEAR that an HTTP site is somehow BAD, or DANGEROUS is a horrific notion.”

Not all HTTP sites are bad, of course, but all are dangerous to some degree in that they make it a lot easier for the data to be sniffed. This is important even for sites that are informational-only because of the metadata issue (I don’t want attackers to know what information I’ve looked at).

You are correct, though, in that for some sites the amount of exposure is relatively small — but it’s exposure nonetheless.

In the end, I don’t see this as social engineering at all. This is simply a browser correctly informing the users of the security status of the site, and an HTTP site is correctly characterized as not secure. Whether or not that is of any importance to the user still remains for the user to determine.

Gracey (profile) says:

Re: I hate social engineering.

I tend to agree. My photography site is not https, and really has no reason to be. I don’t use advertising, I don’t allow direct emails, and only have image galleries for view. People decide whether to call me for an estimate or not.

I see no reason to use https (and no reason not to, except being stubborn enough to not want to fart around with my site, yet again, just to suit Google), and I see no reason to consider it “bad” or “dangerous”.

What a pain in the ass.

mattarse (profile) says:

Re: Re: Re: I hate social engineering.

Though I see your point – that’s an issue between the end user and their ISP, not me and my hosting provider.

I see it as a hassle, but see the good of it. Plus the cost is not that much, and will become cheaper over time.

I also see the point of why should my personal site visited by few people need it. I’m mainly worried about how this affects the content I link to (primarily photos on flickr), but that’s an issue I’ll have to investigate.

Or not, my site isn’t there for traffic anyway. I mainly worry that if I don’t, and google puts a warning on the site, then I’m going to get emails from my mother asking why my site isn’t safe.

jackn says:

Re: Re: Re:2 I hate social engineering.

Though I see your point – that’s an issue between the end user and their ISP, not me and my hosting provider.

Sorry, that’s wrong. You don’t see the point of encryption between your site and the end user.

And the fact that you mention the ISP for your user shows you don’t even understand the risks that encryption is attempting to minimize.

Chronno S. Trigger (profile) says:

Re: Re: Re:

The size of the site does matter. Ever use NoScript? You’ll learn really quickly that larger sites access a lot of domains. Techdirt alone accesses 18 domains, all of which have to be secured. This is why people were complaining about old, embedded videos and why they won’t embed new videos unless the video’s domain is HTTPS.

Anonymous Coward says:

Re: Re: Re: Site size

With regard to the main article, I would hope they also consider changing the script rules so that Javascript can only be run when it is served over HTTPS. That alone would push many of the more important sites to convert, since so many webmasters are hopelessly addicted to Javascript.

The size of the site does matter. Ever use noscript? You’ll learn really quickly that larger sites access a lot of domains. Techdirt alone accesses 18 domains.

Yes, and RequestPolicy at maximum strictness too. From that, I can say that the Techdirt page for this article accesses: ii.techdirt.com, w.soundcloud.com, sb.scorecardresearch.com, secure.quantserve.com (looks like a tracking pixel), and http://www.bizographics.com. None of those domains are required for correct functioning. I whitelisted none of them, and the only negative impact is I do not get the standard Techdirt CSS.

Whether the size of the site matters depends a lot on how much thought was put into the site at setup time. If the site uses absolute URLs everywhere, it can require a large number of substitutions to convert. If site pages are stored in a bad backing store, it can be difficult to apply a global substitution. Both of these are fundamentally a case of bad decisions coming back to haunt the site operator.
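As a rough illustration of the substitution work involved (the domain and markup below are hypothetical), converting hard-coded absolute http:// URLs to protocol-relative links, so the same markup works over either scheme, might look like:

```python
import re

# Sketch: rewrite hard-coded src="http://..." / href="http://..."
# attributes to protocol-relative "//..." form. A real conversion would
# also need to handle URLs stored in a database, inline CSS, etc.
ABSOLUTE_HTTP = re.compile(r'(src|href)="http://', re.IGNORECASE)

def make_protocol_relative(html: str) -> str:
    """Turn absolute http:// links in src/href attributes into scheme-relative ones."""
    return ABSOLUTE_HTTP.sub(r'\1="//', html)

page = '<img src="http://cdn.example.com/logo.png">'
assert make_protocol_relative(page) == '<img src="//cdn.example.com/logo.png">'
```

Note this is exactly the kind of global substitution that gets painful when pages live in a poorly chosen backing store, which is the point being made above.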

Anonymous Coward says:

Re: Re: Re:2 Site size

With regard to the main article, I would hope they also consider changing the script rules so that Javascript can only be run when it is served over HTTPS.

One of the described security states on the linked page is “Dubious (valid HTTPS but with mixed passive resources, valid HTTPS with minor TLS errors)”. I assume “passive” rules out Javascript, so any non-HTTPS Javascript would be treated as insecure. It’s not clear what exactly that means: maybe it will still run but with some restrictions; maybe there will be an option to disable scripts in this case.

BeckaS (user link) says:

Re: Re: Re:

Size does matter when you do a lot of manual linking internally. Sure my host could implement it for me but I’d still have to find and fix all those links.

I could have handled SSL from the start (I have with my latest site) but my main site is updated two or three times a week and has been for five years without fail. That’s a lot of links.

TKnarr (profile) says:

Re: Re: Re:5 Re:

That’s going to be a problem for all external content regardless, as content moves or disappears as people maintain/update their own sites. If you care about keeping links up-to-date, you’ll catch this in your regular checks for broken links and it’s actually a lot easier to fix than most (you just need to update the protocol, rather than having to puzzle out the new path to the content or confirm that it’s no longer there).

Anonymous Coward says:

Re: Re: Re:5 Re:

Just scan the database, look at the content for http://, and replace or delete. You could convert to relative URLs, as mentioned. You can use PHP or something to automate the whole thing. But you will probably have to do research to figure out what to replace or delete.
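A minimal sketch of that scan-and-replace idea, with sqlite3 standing in for whatever backing store the blog actually uses (WordPress runs on MySQL, and serialized PHP fields need more care than a blind replace, so treat this as illustrative only):

```python
import sqlite3

# Stand-in database with one post containing an absolute http:// link.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, content TEXT)")
con.execute("INSERT INTO posts (content) VALUES (?)",
            ('<a href="http://example.com/page">link</a>',))

# Only rewrite links to domains you control; a blanket http:// -> https://
# replace can break external links whose hosts don't serve HTTPS.
con.execute("""
    UPDATE posts
    SET content = REPLACE(content, 'http://example.com', 'https://example.com')
""")

(content,) = con.execute("SELECT content FROM posts").fetchone()
assert content == '<a href="https://example.com/page">link</a>'
```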

I ran some self-hosted WordPress before, but not again. I just converted mine to Blogger; it’s easier than hosting it yourself.

John Fenderson (profile) says:

Re: Re:

Nobody said it was easy.

I don’t want to increase your level of anxiety, but if you don’t have enough technical expertise to even understand instructions on how to secure a site, then you might want to rethink whether or not you’re the one who should be running the site. Even ignoring the HTTPS issue, there are many others that can be easily overlooked or misunderstood.

Alternatively, a legitimate decision would be to simply decide not to go HTTPS at all.

alanbleiweiss (profile) says:

Re: Re: Re:

I audit 60 to 80 sites a year, from the smallest mom-and-pop to enterprise global sites with hundreds of millions of pages.

The vast majority of sites are owned and maintained by small business owners who don’t know what they’re doing, and barely can afford to implement fundamental necessities, yet they do so nonetheless, and that allows them to participate in the digital community.

As much as I personally would prefer that every site be set up, maintained and upgraded by qualified professionals, it’s not realistic in the current environment.

Worse still, I’ve seen more than a few developers screw up some of what I consider the most fundamental technical changes necessary for a site to function.

Anonymous Coward says:

Re: Re: Re: Re:

The vast majority of sites are owned and maintained by small business owners who don’t know what they’re doing, and barely can afford to implement fundamental necessities, yet they do so nonetheless, and that allows them to participate in the digital community.

The vast majority of them use some sort of hosting provider. Few will use a VPS, let alone run their own server hardware. There are two types of hosting providers: those that make SSL fairly easy to set up (if not effectively automatic), and those that will go out of business as Chrome and other browser makers start ratcheting up the pressure. Those businesses that chose a poor hosting provider will need to consider moving, no different than if they chose a poor landlord for their office space.

John Fenderson (profile) says:

Re: Re: Re: Re:

Yes, I understand your pain. However, there is a middle ground here. You might spend a bit of time looking for a better hosting provider — there are many who will take care of most of the security for you and provide you with easy-to-use tools and instructions for handling the parts they don’t.

Such providers are a bit more expensive, but not prohibitively so.

BeckaS (user link) says:

Re: Re: Re:2 Re:

headtilt

I don’t think any webhost would correct all my links for me. My host would fix up SSL for me, I’m sure, but I would still have to manually edit thousands of pages.

I suppose I could do a redirect but I got the impression from the tutorials I read that that isn’t good enough and all the links must be changed to secure a site.

John Fenderson (profile) says:

Re: Re: Re:3 Re:

“I don’t think any webhost would correct all my links for me”

No, they wouldn’t. I don’t know how you have your site encoded and stored but it should be possible to do a global search and replace to fix up all your links. Hopefully, you don’t have all those links stored in a database — then you can just do it as a multifile text replacement operation. If you are using a database, it’s a bit more complex.
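For the non-database case, the multifile text replacement might be sketched like this (the paths and domain are illustrative; back up the site before running anything like this for real):

```python
from pathlib import Path
import tempfile

def rewrite_tree(root: Path, old: str, new: str) -> int:
    """Rewrite `old` to `new` in every .html file under root; return files changed."""
    changed = 0
    for page in root.rglob("*.html"):
        text = page.read_text(encoding="utf-8")
        if old in text:
            page.write_text(text.replace(old, new), encoding="utf-8")
            changed += 1
    return changed

# Demo against a throwaway directory standing in for the document root.
with tempfile.TemporaryDirectory() as tmp:
    site = Path(tmp)
    (site / "index.html").write_text('<a href="http://example.com/a">a</a>')
    n = rewrite_tree(site, "http://example.com", "https://example.com")
    assert n == 1
    assert "https://example.com/a" in (site / "index.html").read_text()
```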

Using a redirect is, in my opinion, an acceptable option if your goal is a more secure site, but it doesn’t help you if your goal is to get the “right” status icon in Chrome.

Andrew Norton (profile) says:

Re: Re: Re:

I don’t think you understand some of the issues.

I’m in the same boat. I understand the theory behind SSL just fine. I understand tech pretty damned well (I’m the lead researcher for TorrentFreak for a start, my main degree is in Robotics, and I even interned at ARM). But when it comes to trying to move my own site (and the Copyright trolls wiki) to HTTPS, I too gave up after a few hours of reading.

Don’t confuse inability on the ‘how’, with ignorance of the ‘what’ or ‘why’. I know what it does, and why it does it, the problem is implementing it.

That’s not a fault of ours. It’s a fault of you, the ‘expert’. So little thought is given to the ‘how’ that it becomes somewhat arcane knowledge. However, “you” (figuratively, not personally) didn’t ‘make’ SSL with “us” in mind. It was created with the thought of only certain kinds of people will need it, and they’ll have an expert handy to do it. So nothing was done on the process itself, because the feeling was that everyone that needed to do it, could.

It’s the “I’m alright, Jack” philosophy again.

Doesn’t matter how good something is, or how easy you find it to use now: if it’s not easy (or at least not awkward) to use, then it won’t be used. Look at Linux, for example. A good, powerful, stable OS, certainly. But home/personal use didn’t start to increase until the likes of Lindows/Linspire, and then Ubuntu (with a sprinkling of help from Knoppix and other live-CD distros letting people try before they commit), started to address Linux’s usability problem.

Other hosting-related things have taken the hint. Part of what makes WordPress so popular is its “5 minute installer” (which can be made even easier with the likes of Softaculous; it took me 2 minutes to do two WordPress installs and a MediaWiki with that).

SSL is still incredibly user-unfriendly to implement.

Anonymous Coward says:

crying wolf?

I’m not sure how I feel about this, because I can see it going badly in two different ways. Either it scares uninformed people away from sites that can’t or won’t change (for whatever reason) and that they really don’t need to worry about, or, upon seeing that a large number of sites they visit are flagged as “not secure”, those same uninformed people will just start ignoring the warnings when it really matters.

I know, I know. We shouldn’t have to limit ourselves because of “stupid people” (TM), but the vast majority of Internet users are uninformed, and crying wolf could potentially make things worse.

Anonymous Coward says:

no excuse?

“There’s really no excuse not to do so these days”

Actually, there are plenty of excuses… some of them are even good ones.

I hate statements like that – they’re offhand remarks that seem to belittle people who don’t actually give a shit.

Such statements are sorta like saying: “there’s no excuse for not wearing a helmet, regardless of what you’re doing”.

I’m not sure I will ever provide SSL security for my blog that 3 people read – why should I bother?

Anonymous Coward says:

To me it seems Google hasn’t yet learned what Microsoft did with Vista. Microsoft put in so many warnings about “are you sure you want to save” that everyone got to ignoring the nag screens because it was a constant PNA.

Google’s start in this is liable to come out the same way. It will be far harder to tell the sites that serve up malware from those that are merely unencrypted. In the end, those who find it a pain to deal with all the nag screens may opt to click through as fast as they can, nulling any attempt to nudge more encryption into the net.

MrTroy (profile) says:

Re: Re:

To me it seems Google hasn’t yet learned what Microsoft did with Vista. Microsoft put in so many warnings about “are you sure you want to save” that everyone got to ignoring the nag screens because it was a constant PNA.

tl;dr: The Microsoft UAC debacle helped to improve computer security (slightly). If this lesson is appropriate, it suggests that website security will improve (eventually) following the proposed UX change.

Yes, people hated UAC. Yes, people often disabled it. But it’s also absolutely true that third-party Windows software is written much better today than it was before UAC was introduced. Nobody wants to install that piece of software that writes data into the program directory any more, because you have to deal with those stupid UAC warnings whenever you run it; go for the software that follows better principles instead.

Violated (profile) says:

Too soon.

Beyond this all being a bad idea, I can say that many sites DO want to move to SSL, but there are problems in doing so. Beyond technical implementation, the main problem is that SSL certificates require third-party validation. It may be nice to ensure a valid business, but this is no one-off charge when such services want to milk the punters for all they can with a high annual subscription fee.

This is why if SSL is to become the standard across a vast variety of different sites then we need FREE validation. That is exactly what the EFF plan to implement in about 6 months time when they automate the validation process.

So not much can happen until then. Sure, we need to lead sites into using SSL by default, but it is still wrong to claim unencrypted is somehow dangerous or harmful.

robobenklein (user link) says:

The impact on personal sites.

I know from experience that you can get a basic site online for free, assuming you have a computer and internet access. The only wall that now stands in that person’s way is the fact that their site will be negatively affected because of how expensive an SSL certificate can be. A domain costs $12 a year (except in big-name cases, or just get a free .tk domain), and somehow an SSL cert is going to cost the owner over 5x that amount. If I had to pay that for a cert right now, my site would be left in the dust. And saying that all HTTP sites are not secure is a big claim to make, as my site offers encryption in the form of JavaScript just as powerful as a normal cipher suite.

beltorak (profile) says:

I agree with the motion in part, but I think the conflation of “bad” with “insecure” is a step backwards. There should be 4 levels:

1) Insecure: exactly what it says on the tin. There are no security measures; e.g.: served over HTTP, or the certificate is invalid. A site should be presented this way if it uses a bogus or known broken “encryption” method.

2) Encrypted (and only encrypted): Separating these two alone would let small webmasters deploy self signed certificates for many things that don’t really matter. This is currently displayed as “bad”, but it really shouldn’t be in all cases. The fact that self-signed certs are presented to the user as an error is the biggest hindrance to HTTPS everywhere. And training users to ignore this error is training them to ignore MITM attempts.

3) Encrypted, Authenticated by Third Party: Site has a valid cert that is signed by one of the 200 CAs (or a chain of them). This is currently what many pressure sites to achieve. Throw in EV certs here too.

4) Gold (for lack of a better term): This cannot be automatically chosen by the vendor except in very specific circumstances. This is basically user-controlled cert pinning. The user has decided that they trust communication with this site if the communication is signed with this specific cert (or any cert signed with a specific CA). If the cert ever changes the warning should be gigantic.
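The user-controlled pinning in level 4 could be sketched as a fingerprint comparison (the DER bytes below are fake placeholders; a real client would take them from the TLS handshake, e.g. `ssl.SSLSocket.getpeercert(binary_form=True)`):

```python
import hashlib

# Sketch of user-controlled cert pinning: the browser remembers a
# SHA-256 fingerprint of the certificate the user chose to trust, and a
# mismatch on a later visit should trigger a gigantic warning.

def fingerprint(cert_der: bytes) -> str:
    return hashlib.sha256(cert_der).hexdigest()

# The user marks the site "gold", pinning the cert they saw.
pinned = fingerprint(b"original-cert-der-bytes")

def check_pin(served_der: bytes, pinned_fp: str) -> bool:
    """True if the served cert matches the pin; False means warn loudly."""
    return fingerprint(served_der) == pinned_fp

assert check_pin(b"original-cert-der-bytes", pinned)       # same cert: ok
assert not check_pin(b"mitm-substituted-cert", pinned)     # changed cert: warn
```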

An education campaign focusing on “the gold standard” would be necessary so that users know to mark their banking sites. And so that they understand that just because something is “encrypted” doesn’t make it secure; it would be easy to draw up illustrations of having a secret conversation with the bad guy. “Sure, your conversation is encrypted, but you could be having an encrypted conversation with your stalker. Encryption is not enough for security, you also need *trust*.”

We absolutely must move away from the disaster that is the third-party CA “trust us, because someone paid someone who paid someone who paid us money” system if there is any hope for real security; so all of this must be controllable by the user. If a user wants to blacklist a specific encryption algorithm or whatever, then any site using that encryption should display as “insecure”. A user should be able to promote a site/cert from “encrypted only” to “gold”. And a user should be able to “untrust” any site/cert combo. Sure, some parts of this are already possible, but the UI is a fucking disaster. The UI *must* improve. Certs, PGP, and other PKI implementations already give us all the necessary tools for strong personal crypto, but the UIs around all of these tools are fucking disasters.

And one final thing; browsers should by default try HTTPS before HTTP.
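That HTTPS-first behavior might be sketched as a URL-ordering step (only the pure rewriting is shown; a real client would attempt each candidate in order and catch connection/TLS errors before falling back):

```python
from urllib.parse import urlsplit, urlunsplit

def candidate_urls(url: str) -> list[str]:
    """Return the fetch order: HTTPS variant first, original HTTP last."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        secure = urlunsplit(("https",) + tuple(parts)[1:])
        return [secure, url]
    return [url]

assert candidate_urls("http://example.com/page") == [
    "https://example.com/page",
    "http://example.com/page",
]
assert candidate_urls("https://example.com/") == ["https://example.com/"]
```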

Violated (profile) says:

A Dirty Society

Following my previous message, I do agree that much more needs to be done socially to promote SSL use, where maybe a name-and-shame system is not so bad.

Take one example: Amazon.com

Secure shopping sounds like a nice idea, right? Without hundreds of spy agencies and companies peering into your Amazon browsing and shopping.

Too bad any HTTPS connection to Amazon.com gets kicked back immediately to HTTP, when Amazon is indeed NSA-spy-on-all-your-shit super friendly.

We can look more into the problem here… http://ecx.images-amazon.com/images/I/51cOZo6A2bL.jpg

I can say that ecx.images-amazon.com is where they store all the product photos, and naturally an SSL-friendly site would have to keep such a location behind a valid certificate.

No such luck: change that HTTP link to HTTPS and we can see that Amazon has no certificate installed there; the link defaults to the SSL certificate of their cloud hosting provider, meaning a site-name mismatch and a huge connection warning.

I see this as disgusting, when it is hardly like Amazon lacks the funds to install the right certificates and provide their visitors with the full encrypted and safe shopping experience.

So naming and shaming such huge sites is not such a bad idea, along with a boycott should they fail to change.

There are of course many sites out there that provide the full SSL service, like Wiki does.

John Fenderson (profile) says:

Re: Re:

I honestly don’t see this as being against small self-hosters at all. While there are always some exceptions, most small self-hosters don’t have huge and complex websites, so converting them to use HTTPS isn’t very onerous at all. Self-hosters that have large, complex websites should also have the skills, or access to people with the skills, to convert the site to HTTPS. If they don’t, then they have larger issues than the HTTPS one.

Anonymous Coward says:

I for one like my lock in the address bar. I encrypt my wireless signal and I encrypt my data; why go online using an unencrypted connection? When I visit a web site I do notice when other IP addresses are also connecting. I use Startpage as my search and home page; they have been HTTPS-compliant for some time now, with no additional connections like I get with that duck search. The internet is getting to be quite the jungle. I use an IP blocker, which only goes so far due to VPNs and proxies, to name two; a tad over 3 billion IPs blocked at present. If you can’t make the effort to secure it, I won’t use JavaScript, or even cookies, when I visit. Here is a condom, son; if the need arises make sure you use it, and if it gets old, replace it.

joe_public says:

Google Acting Like Microsoft

I’ve always been a fan of Google, but this is the kind of thing Microsoft does, and part of why so many people hate Microsoft. You are forcing your will and making people eat the crap because you are so big no one can stop you. My website is big and challenging for my family’s small business to maintain. Downgrading our superior content below websites with crap content just because they are HTTPS is evil. You are generating hate for Google when you screw us over. PLEASE DON’T!
