Wandering Thoughts archives

2018-07-24

I doubt Chrome's new 'not secure' warning about HTTP sites will change much (at least right away)

In theory today (July 24th) is the start of an HTTP apocalypse, because Google has launched Chrome 68 and Chrome 68 labels all HTTP sites as 'not secure'. More exactly, it adds a 'not secure' label to the URL bar (or omnibox, if you prefer that term). It's possible that Firefox will follow now that Chrome has led the way on this, and in any case Chrome apparently has about 60% of the browser market, so its decision here affects a lot of people. However, I don't think this is going to be as big a deal as you might expect (and as some people fear), for three interlinked reasons.

The first reason is the same fundamental issue as the one affecting EV certificates, which is that all this is doing (right now) is changing the URL bar a little bit. We have pretty good proof (from EV certificates among other things) that very few people pay much attention to the URL bar, and the 'not secure' label is even less prominent than EV certificates were (EV certificates at least used a different colour). It seems fairly unlikely that people will even notice the change, which is an obvious prerequisite for them caring.

The second reason is that people mostly don't care about this. When people go to a website, it's because they want to see the website, and they really don't care about anything that gets in the way (as we have seen in the past when browsers let people easily override TLS certificate warnings). There aren't likely to be very many people who will change their behavior because they're suddenly being warned (a very little bit) that their connection is 'not secure'. Without the users visibly caring, many sites will not have much extra motivation to change.

(They'll have some extra motivation; the 'not secure' is a nudge. But it's not a really strong nudge, at least not now.)

The third reason is that plenty of sites are going to remain HTTP (and thus 'not secure') for a great deal of time to come. For many people, this will make the 'not secure' label a routine thing that they see all the time, and routine things rapidly lose any power they might once have had. If even a tenth of your web browsing is 'not secure' and nothing particularly bad happens, you're likely to conclude that the 'not secure' warning is unimportant and something you can freely ignore. This feeds into the other two reasons; unimportant things get ignored, and if you are one site in a crowd of many, why go to much work to change (especially if no one seems to care)?

I understand why Google and other people are enthused about this and I think it's a positive step forward to an all-HTTPS world. But in my opinion the 'not secure' label is only the tip of the iceberg as far as its importance goes and we shouldn't expect that label to do much on its own. I suspect that the long run importance of this will be how it changes the attitudes of web developers and website operators, not any changes in user behavior.

(To put it one way, the 'not secure' label is the surface sign of an increasingly broad consensus view that HTTP needs to go away (for good reasons). That we have gotten far enough along in this view that the Chrome developers can make this change without facing a big backlash is the big thing, not the label itself.)

HTTPInsecureDoubts written at 23:10:40

2018-07-06

I'm seeing occasional mysterious POST requests without Content-Types

Sometimes I go out of my way to turn over rocks in the web server logs for Wandering Thoughts, but other times my log monitoring turns them over for me. The latter is how I know that Wandering Thoughts has been seeing periodic bursts of unusual POST requests that don't appear to have a Content-Type. I saw another such burst today, so I'm going to write this one up.

Today's burst is six requests from a single IP (86.139.145.21), POST'ing to a single entry between 12:55:12 and 12:56:08. In fact there were two bursts of three POSTs each, one burst at 12:55:12 and 12:55:13 and the second at 12:56:08. DWiki's logging says that all of them lacked a Content-Type, but it didn't record any other details. This specific IP address made no other requests today, or even in the past nine days. On July 2nd, it was nine POSTs to this entry from 59.46.77.82 in three bursts of three, at 21:36:20, 21:42:2[12], and 21:53:35. Both IPs used a very generic User-Agent that I believe is simply the current Chrome on Windows 10.

In all of the cases so far, the POSTs are made directly to the URL of a Wandering Thoughts entry, not to, say, the 'write a comment' page. This is noteworthy because I don't have any forms or other links that do POST submissions to entry URLs; all references to entry URLs are plain links and thus everyone should be using GET requests. Anything that's deciding to make these POST requests is making them up, either by mistake or through some maliciousness.

(In the past I've seen zero length POSTs with a valid HTML form content-type, which I believe were also for regular entry URLs although past me didn't write that explicitly in the entry.)

There's a part of me that wants to augment DWiki's logging to record, say, the claimed Content-Length for these POST requests so I can see if they claim to have content or if they're 0-length. Probably this is going further in turning over rocks than I want to, unless I'm going to go all the way to logging the actual POST body to try to see what these people are up to.
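If I were to add that logging, it might look something like the following WSGI-style middleware. This is a hypothetical sketch, not DWiki's actual code; the function name and the assumption that DWiki runs in a WSGI environment are mine.

```python
import sys

def log_post_details(app):
    """Hypothetical WSGI middleware that logs the claimed Content-Type
    and Content-Length of every POST request before passing it on.
    (A sketch; DWiki's real request handling may look quite different.)"""
    def wrapper(environ, start_response):
        if environ.get('REQUEST_METHOD') == 'POST':
            ctype = environ.get('CONTENT_TYPE', '<missing>')
            clen = environ.get('CONTENT_LENGTH', '<missing>')
            sys.stderr.write("POST %s content-type=%s content-length=%s\n"
                             % (environ.get('PATH_INFO', '?'), ctype, clen))
        return app(environ, start_response)
    return wrapper
```

In WSGI, a missing Content-Type header simply means there is no `CONTENT_TYPE` key in the environ dictionary, which is why the sketch uses `.get()` with a sentinel default.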

(Apparently POSTs without a Content-Type are technically legal and you're supposed to interpret the contents as the generic application/octet-stream (unless you want to attempt to guess by inspecting the data, which you don't). See e.g. here, pointing to the HTTP 1.1 specification. However, all of my POST forms properly specify the content-type the browser should use, so this shouldn't be happening even for proper POST requests to valid POST URLs.)
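That HTTP 1.1 default rule (stated in RFC 7231 section 3.1.1.5) is simple enough to express directly; a minimal sketch, assuming headers arrive as a dictionary:

```python
def effective_content_type(headers):
    """Return the Content-Type to treat a request body as, applying
    the HTTP 1.1 default of application/octet-stream when the header
    is absent (per RFC 7231, section 3.1.1.5)."""
    return headers.get('Content-Type') or 'application/octet-stream'
```

(The spec also permits a recipient to sniff the data instead of assuming the default, but as noted above, you generally don't want to.)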

PS: Apache probably accepts POSTs with no Content-Type to static, GET-only resources because Apache will accept pretty much anything you throw at it. DWiki is more cautious, although that's basically become a mistake.
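The cautious behavior for a GET-only resource is to reject other methods outright with a 405 and an Allow header. A hypothetical sketch of that (again, not DWiki's actual code):

```python
def handle_request(environ, start_response):
    """Hypothetical WSGI handler for a GET-only resource: reject any
    other method with 405 Method Not Allowed plus an Allow header,
    as HTTP 1.1 requires for 405 responses."""
    if environ.get('REQUEST_METHOD') not in ('GET', 'HEAD'):
        start_response('405 Method Not Allowed',
                       [('Allow', 'GET, HEAD'),
                        ('Content-Type', 'text/plain')])
        return [b'method not allowed\n']
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'hello\n']
```

Apache's permissive behavior amounts to skipping this check entirely and serving the resource as if the POST were a GET.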

POSTWithoutContentType written at 01:35:10

2018-07-01

Understanding the first imperative of a commercial Certificate Authority

A lot of things about how the CA business operates and what CAs do are puzzling from the outside, and may even lead people to wonder how exactly a CA could ever do some particular crazy thing. I've come to feel that we can understand a lot by understanding that the first imperative of a commercial CA is to sell TLS certificates, no matter what it requires.

(This is different from the CA's first job of having its root certificates included in all of the browsers, which these days absolutely must include iOS and Android.)

There are well-intentioned people at many commercial CAs who care about the overall security and health of the TLS ecosystem, and some of them hold some degree of power in their respective organizations. But they cannot change the overall nature of the beast that is a commercial CA, because being commercial means that they must make a profit somehow, and that means selling certificates (and in order to grow, they must sell more certificates or more expensive certificates or both).

One important consequence of this is that commercial CAs are highly motivated to push the edges of trust and security, especially today (given Let's Encrypt's increasing domination). Sure, their good employees have pushed back and will push back to the extent that they can, but that can only go so far. As we've seen over and over with email spam, sooner or later the people on the side of money win those arguments, and the only real limit is the increased willingness of browsers to kick CAs to the curb. So we shouldn't be at all surprised when CAs do bad stuff, especially now. One extremely cynical view of this dynamic is that commercial CAs don't really want to securely validate things; they want to find some excuse to take your money and give you some magic bits. If they can make that excuse be secure, that's great, but it's not the most important thing.

(Although I can't find the details now, I believe there was a CA that was accepting emailed 'scans' of 'official documents' from would-be customers as proof of control of domains. This seems obviously crazy from the outside.)

Another cynical way to look at the current situation is that a commercial CA's only remaining natural market is people who can't use Let's Encrypt certificates. Sometimes this will be people who can't deal with short duration certificates, but at least some of the time it's going to be people who can't pass LE's checks for some reason, probably a good reason. Commercial CAs are quite motivated to find some way to give them a certificate anyway.

(Commercial CAs also have a legacy market in people who either haven't heard of Let's Encrypt or don't understand it, but that market is going to shrink over time. We can probably expect commercial CAs to work hard with FUD to keep these people ignorant and in the fold.)

Next, no commercial CA is going to propose or support anything that cuts its own throat, no matter how good for security it would be. While there are some motives for supporting measures that wind up increasing your operational costs (if doing so somehow benefits you over your competition), there are limits (and CAs may be hitting them). Commercial CAs are also likely to try to persuade browsers to do things that help out EV certificates, and they're probably going to do a lot of that persuasion in public in order to apply greater pressure.

This shades into another obvious but sad consequence, which is that commercial CAs have a great motive for encouraging ignorance, superstition, and FUD, especially over things like EV certificates (see Troy Hunt tearing apart some recent CA marketing FUD, for example). If people with money don't understand that they can just get a DV TLS certificate from Let's Encrypt and it's just as good as an EV cert (see also), you have a chance to sell them your version of this commodity.

One conclusion I draw from this is that CAs are likely to refuse to drop the maximum certificate validity period down very low, because relatively long duration certificates are one area where they have something that Let's Encrypt doesn't.

(I've probably said some variant of this in past entries, but I haven't written it up as a full entry. For various reasons I feel like doing it today.)

Immediate post-publication update: See Digicert withdrawing from the CA Security Council and the HN comments on it, especially this discussion of the background of the CASC and so on.

CAFirstImperative written at 22:57:12
