Why https was a mistake, but an inevitable one

December 3, 2010

As I alluded to in a comment here, one reason that we can't expect people to understand that 'low quality encryption' is basically 'no encryption' is that browsers themselves don't understand it. In fact the issue shows why having a distinct and specially marked 'https' is ultimately a terrible mistake: it has trained programmers and users alike that https means security, when in fact it doesn't.

What would have been much better is if browsers had hidden which raw protocol was in use and instead shown an indicator of the degree of security involved. This would have had two beneficial effects. First, it would have made it much easier to deploy opportunistic encryption to foil passive eavesdroppers, since you would not be telling people that the use of SSL means they are secure (instead you would be hiding that SSL is in use at all). Second, it would have let browsers mark things as secure only when the SSL parameters and certificates in use were actually secure, not merely because SSL was being used.
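
As a concrete sketch of "indicate security, not protocol", here is a small classifier over a connection's negotiated TLS parameters, the kind of thing Python's ssl module reports via SSLSocket.cipher(). The thresholds and category labels are my own illustrative assumptions, not any browser's actual policy:

```python
# Hypothetical sketch: rate a connection by its negotiated parameters
# rather than by whether SSL/TLS is in use at all. Thresholds and
# labels are illustrative assumptions, not a real browser's policy.

def rate_connection(protocol, secret_bits):
    """Classify a connection as 'insecure', 'weak', or 'secure'.

    protocol and secret_bits correspond to what Python's ssl module
    reports via SSLSocket.cipher() -> (name, protocol, secret_bits).
    """
    if protocol in ("SSLv2", "SSLv3") or secret_bits < 112:
        return "insecure"  # export-grade or broken: treat as cleartext
    if secret_bits < 128:
        return "weak"      # encrypted, but not worth a 'secure' badge
    return "secure"

# A 40-bit export cipher gets no security indicator at all, even
# though SSL is nominally in use:
print(rate_connection("SSLv3", 40))     # insecure
print(rate_connection("TLSv1.2", 256))  # secure
```

Under this scheme an export-weakened connection and a plain-http connection look the same to the user, which is exactly the point of the entry.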

Okay, it would also have had a very important third effect: it would at least have tried to communicate to users the thing that actually matters, namely the security of their connection. Using SSL is merely a necessary prerequisite for actual security, so the current situation tells users only about an implementation detail.

(Thus it is not a surprise to me that current browsers tend to de-emphasize the whole 'you are using SSL' business. Sadly this is too late in practice; people's expectations are too solidly set by now.)

Transparently attempting SSL without slowing things down has a number of technical issues, but there is a deeper reason why this wasn't really possible back in the dawn of the web, when SSL was being introduced. The simple summary is cryptography export controls. Back then, US companies were not allowed to export products with actually secure cryptography, only ones with aggressively weak keys. In a theoretical world where browser vendors turn address bars yellow because of security instead of SSL, those weak keys present the vendors with a serious problem: how to show them to users.

The honest approach is to not mark these weak ciphers as secure in any version of the browser. This is unpopular with the purchasers of SSL certificates, who are after all paying money in order to have people's address bars turn yellow; it is especially unpopular with non-US server operators, whose servers supported only insecure key lengths and so would never get yellow address bars no matter what SSL certificates they bought.

The 'as secure as possible' approach is to mark these weak ciphers as secure only in the international version of the browser. Now you have a confusing user experience: the same website using the same SSL setup will be shown as insecure in one browser and secure in another. (This too is not so popular with non-US SSL server operators, who will never be shown as secure to US users.)

(Always marking these weak ciphers as secure even in domestic browsers makes a joke of your claim to be turning the address bar yellow only when the connection really is secure.)

Turning address bars yellow on SSL instead of security has the great advantage of avoiding all of this. You have a simple rule and your international and domestic browser versions behave the same on all SSL-supporting websites.

(PS: I'm sure that the technical issues alone were more than enough to sink the whole idea of transparent SSL back when it could have been introduced.)

Comments on this page:

From at 2010-12-03 13:58:32:

The semantics by which security levels are communicated to the end user are unimportant. Regardless of what you choose (threat color system/numeric/whatever), most users will not care. The levels are arbitrary to them. Either it's secure or it isn't.

What makes more sense is to ship browsers which enforce a secure set of ciphers (e.g. the HIGH suite as defined by OpenSSL) out of the box. Should the TLS handshake fail, the browser could then throw a pop-up indicating that a secure set of ciphers could not be negotiated and offering to retry with an insecure set if the user so chooses. It would be trivial to make this configurable.
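
In Python's ssl module, the commenter's "strong ciphers out of the box" idea looks roughly like this. The cipher-list string is OpenSSL syntax as named in the comment; exactly which suites it matches depends on the linked OpenSSL version:

```python
import ssl

# Sketch of a client that only negotiates strong ciphers by default.
# 'HIGH:!aNULL:!eNULL' is OpenSSL cipher-list syntax (the HIGH suite
# minus unauthenticated and null-encryption ciphers); its exact
# membership depends on the OpenSSL build in use.
ctx = ssl.create_default_context()
ctx.set_ciphers("HIGH:!aNULL:!eNULL")
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# If a server offers only weaker ciphers, the handshake raises
# ssl.SSLError; that failure is the point at which a browser could
# ask the user before retrying with a weaker configuration.
for c in ctx.get_ciphers():
    assert "NULL" not in c["name"]  # no null-encryption suites remain
```

Whether a browser should then offer the insecure retry at all is, of course, the question the next comment takes up.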

Regarding the URL, some browsers (Chrome) already hide the access mechanism when browsing over http. They could just as easily drop the https. I'm also a fan of Firefox's tendency to display elements of the x509 subject in the location bar. This is a huge step forward. TLS' ability to authenticate a service is just as important as its ability to provide encryption.

By cks at 2010-12-04 00:10:35:

Two comments: first, asking users questions doesn't work. Second, I strongly disagree with the view that SSL/TLS gives any sort of meaningful authentication in practice; see this entry or this one.

(The final nail in the coffin of SSL authentication in practice is the wide variety of 'trusted' certificate authorities that are anything but, and I am not even talking about shoddy practices.)

From at 2010-12-06 18:16:58:

Are you proposing every TLS secured service which does not provide strong ciphers shows up as if it were a cleartext service?

I do not see a problem with asking end-users questions if you provide them with enough data to help them make a well informed decision. If you are asking the same question on a routine basis (e.g. Microsoft Vista's whack-a-mole prompts), you have bigger problems. For what it's worth, I know for a fact that these pop-ups prompt customer service calls, which in turn prompt action by providers. How many calls do you think Google gets because a customer's ID button is blue instead of green? How many people hastily close their browser because the location bar is a weird color? While visual cues can be useful, they are only useful when users are properly educated, which makes them just as dangerous as pop-ups/questions.

Regarding authentication, this is not a failure of TLS but a weakness in PKI. As you know, if you cannot trust your CAs, you cannot trust any of the data they sign. The entire house of cards comes tumbling down. The CA business is not well regulated and could use an overhaul. The businesses and governments which rely upon them could easily get the ball rolling.

The trust system would benefit from multiple signatures (think PGP) or a community certificate rating system. Placing full trust in profit-driven CAs just seems like a bad idea.

By cks at 2010-12-07 01:09:01:

My reply got long enough that I made it an entry, HttpVsHttpsMistakeII.

Written on 03 December 2010.
