Wandering Thoughts archives

2016-02-22

The university's coordination problem

In response to my entry on how we can't use Let's Encrypt in production, Jack Dozier left a comment asking if we'd looked into InCommon's Certificate Service. InCommon is basically a consortium of US educational institutions that have banded together to, among other things, create a flat-cost CA service; apparently, for $15k US or so a year, your university can get all the certificates you want (including for affiliated organizations). This sounds great, but at least here it exposes what I'm going to call the university coordination problem.

Put simply, suppose that the university spent $15k a year to get 'all you want' certificates; more specifically, suppose that the central IT services group did. Now, how does central IT get the news out to everyone here that free certificates are available through this program?

The University of Toronto is a big place, which means that there are a dizzying number of departments, research groups, professors, and various other people who could possibly be buying TLS certificates for something they're doing. Many of these people do not deal with IT issues like TLS certificates on an ongoing basis, so they're extremely unlikely to remember the existence of a service they might have gotten an email blast about half a year ago.

(And I guarantee that if you sent that email blast to professors, most of them deleted it unread.)

Nor is there a central place where money gets spent that you can set up as a chokepoint. I mean, yes, there is a complicated university-wide purchasing department, but no one sane is going to make people get pre-approval from purchasing for, say, twenty-dollar expenses. The entire university would grind to a halt if you tried that (followed immediately by a massive revolt by basically everyone). TLS certificates are well under the pre-approval cost threshold, so in practice people purchase most of them through university credit cards.

In theory CAs themselves might serve as a roadblock, by requiring approval from the owner of the overall university domain. In practice I believe that many CAs will issue TLS certificates if you can simply prove ownership of the subdomain you want the certificate for. CAs have an obvious motivation to do this if they can get away with it, since it means that more people are likely to buy certificates from them.

(In general, vendors of things are highly motivated to let little departments and groups buy things without the involvement of any central body, because involving central things in a big company invariably slows down and complicates the process. You really want some person in some group to just be able to put your product or service on their corporate credit card, at least initially.)

This is not an issue that's unique to TLS certificates. It's a general issue that applies to basically anything relatively inexpensive that the university might arrange some sort of a site license for. The real challenge is often not buying the site license, it's ensuring that it will get widely used, and the issue there is simply getting the news out and coordinating with all of the potential users. Some products are pervasive enough or expensive enough that people will naturally ask 'do we have some sort of central licensing for this', but a lot of them are not that way. And you can be surprised by even relatively expensive products.

(For that matter, I suspect that this issue comes up for things that are expensive but uncommon. For instance, we have a site license for a relatively expensive commercial anti-spam system, but I suspect that many people running mail systems here don't know about it, even if it would be useful to them.)

PS: This problem is probably not unique to universities but is shared at least in part by any sufficiently large organization. However, I do think that universities have some features that make it worse, like less central control over money.

UniversityCoordinationProblem written at 02:16:03

2016-02-10

The fundamental practical problem with the Certificate Authority model

Let's start with my tweet:

This is my sad face when people sing the praises of SSH certificates and a SSH CA as a replacement for personal SSH keypairs.

There is nothing specifically wrong with the OpenSSH CA model; it simply has the fundamental problem of the basic CA model.

The basic Certificate Authority model is straightforward: you have a CA, it signs things, and you accept that the CA's signature on those things is by itself an authorization. TLS is the most widely known protocol with CAs, but as we see here the CA model is used elsewhere as well. This is because it's an attractive model, since it means you can distribute a single trusted object instead of many of them (such as TLS certificates or SSH personal public keys).

The fundamental weakness of the CA model in practice is that keeping the basic CA model secure requires that you have perfect knowledge of all keys issued. This is provably false in the case of breaches; in the case of TLS CAs, we have repeatedly seen CAs that do not know all the certificates they mis-issued. Let me repeat that louder:

The fundamental security requirement of the basic CA model is false in practice.

In general, at the limits, you don't know all of the certificates that your CA system has signed nor do you know whether any unauthorized certificates exist. Any belief otherwise is merely mostly or usually true.

Making a secure system that uses the CA model means dealing with this. Since TLS is the best developed and most attacked CA-based protocol, it's no surprise that it has confronted this problem straight on in the form of OCSP. Simplified, OCSP creates an additional affirmative check that the CA actually knows about a particular certificate being used. You can argue about whether or not it's a good idea for the web and it does have some issues, but it undeniably deals with the fundamental problem; a certificate that's unknown to the CA can be made to fail.
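As a toy illustration of the difference (hypothetical certificate names, not a real TLS stack), the gap between trusting a CA signature alone and adding an OCSP-style affirmative check can be sketched as:

```python
# Toy model of the CA trust decision; certificate names are made up.
# CA_SIGNED is everything carrying a valid CA signature, including a
# mis-issued certificate that never made it into the CA's records.
CA_SIGNED = {"good-cert", "rogue-cert"}
CA_RECORDS = {"good-cert"}  # what the CA's own records say it issued

def trust_signature_only(cert):
    """Basic CA model: a valid signature is authorization by itself."""
    return cert in CA_SIGNED

def trust_with_ocsp_check(cert):
    """OCSP-style model: the CA must also affirmatively know the cert."""
    return cert in CA_SIGNED and cert in CA_RECORDS

# The mis-issued certificate passes the basic model but fails the
# affirmative check; a cert unknown to the CA can be made to fail.
assert trust_signature_only("rogue-cert")
assert not trust_with_ocsp_check("rogue-cert")
assert trust_with_ocsp_check("good-cert")
```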

Any serious CA based system needs to either deal with this fundamental practical problem or be able to explain why it is not a significant security exposure in the system's particular environment. Far too many of them ignore it instead and opt to just handwave the issue and assume that you have perfect knowledge of all of the certificates your CA system has signed.

(Some people say 'we will keep our CA safe'. No you won't. TLS CAs have at least ten times your budget for this and know that failure is an organization-ending risk, and they still fail.)

(I last wrote about this broad issue back in 2011, but I feel the need to bang the drum some more and spell things out more strongly this time around. And this time around SSL/TLS CAs actually have a relatively real fix in OCSP.)

Sidebar: Why after the fact revocation is no fix

One not uncommon answer is 'we'll capture the identifiers of all certificates that get used and when we detect a bad one, we'll revoke it'. The problem with this is that it is fundamentally reactive; by the time you see the identifier of a new bad certificate, the attacker has already been able to use it at least once. After all, until you see the certificate, identify it as bad, and revoke it, the system trusts it.
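A minimal sketch of this timeline (with a made-up certificate name) shows why the first use always gets through:

```python
# Toy model of revoke-on-detection; the certificate name is made up.
revoked = set()

def server_accepts(cert):
    """The system trusts any certificate it has not yet revoked."""
    return cert not in revoked

# The attacker's first use necessarily succeeds: there is nothing
# to react to until the bad certificate has been seen in use.
assert server_accepts("rogue-cert")

# Only after detection does revocation take effect...
revoked.add("rogue-cert")
assert not server_accepts("rogue-cert")  # ...for subsequent uses.
```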

CAFundamentalProblem written at 02:12:58

2016-02-07

Your SSH keys are a (potential) information leak

One of the things I've decided I want to do to improve my SSH security is to stop offering my keys to basically everything. Right now, I have a general keypair that I use on most machines; as a result of using it so generally, I have it set up as my default identity and I offer it to everything I connect to. There's no particular reason for this, it's just the most convenient way to configure OpenSSH.
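One way to change this with plain OpenSSH configuration is `IdentitiesOnly` plus per-host `IdentityFile` entries. A sketch (the hostname and key paths are made up, and the `/dev/null` catch-all is a common workaround rather than an official 'offer nothing' setting):

```
# ~/.ssh/config sketch: only offer keys to hosts explicitly
# configured for them.
Host work.example.com
    IdentityFile ~/.ssh/id_work
    IdentitiesOnly yes

# Catch-all: offer no usable identity by default, including
# keys loaded in the agent.
Host *
    IdentitiesOnly yes
    IdentityFile /dev/null
```

The point of `IdentitiesOnly yes` in the catch-all is that ssh then only offers the configured identity files, not everything the agent holds.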

Some people will ask what the harm is in offering my public key to everything; after all, it is a public key. Some services even publish the public key you've registered with them (Github is one example). You can certainly cite CVE-2016-0777 here, but there's a broader issue. Because of how the SSH protocol works, giving your SSH public key to someone is a potential information leak that they can use to conduct reconnaissance against your hosts.

As we've seen, when an SSH client connects to a server it sends the target username and then offers a series of public keys. If the current public key can be used to authenticate the username, the server will send back a challenge (to prove that you control the key); otherwise, it will send back a 'try the next one' message. So once you have some candidate usernames and some harvested public keys, you can probe other servers to see if the username and public key are valid. If they are valid, the server will send you a challenge (which you will have to fail, since you don't have the private key); if they are not, you will get a 'try the next one' message. When you get a challenge response from the server, you've learned both a valid username on the server and a potential key to target. In some situations, both of these are useful information.

(If the server rejects all your keys, it could be either that none of them are authorized keys for the account (at least from your IP) or that the username doesn't even exist.)
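The server's side of this exchange can be sketched as a toy model (hypothetical usernames and key fingerprints; this models only the decision, not a real SSH implementation):

```python
# Toy model of the SSH publickey authentication exchange. The
# server's two different replies are what leak information to a
# prober who holds harvested public keys but no private keys.
AUTHORIZED = {
    "alice": {"SHA256:aaaa"},   # hypothetical user -> key fingerprints
}

def server_reply(username, key_fingerprint):
    """Answer one 'can this key sign in as this user?' query."""
    if key_fingerprint in AUTHORIZED.get(username, set()):
        return "challenge"   # key is authorized: prove you hold it
    return "try-next"        # wrong key or no such user: same reply

# A prober learns which (username, key) pairs are valid without
# ever authenticating; it just can't answer the challenge.
assert server_reply("alice", "SHA256:aaaa") == "challenge"
assert server_reply("alice", "SHA256:bbbb") == "try-next"
assert server_reply("mallory", "SHA256:aaaa") == "try-next"
```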

How do people get your SSH public keys if you offer them widely? Well, by getting you to connect to a SSH server that has been altered to collect and log all of them. This server could be set up in the hopes that you'll accidentally connect to it through a name typo, or it could simply be set up to do something attractive ('ssh to my demo server to see ...') and then advertised.

(People have even set up demonstration servers specifically to show that keys leak. I believe this is usually done by looking up your Github username based on your public key.)

(Is this a big risk to me? No, not particularly. But I like to make little security improvements every so often, partly just to gain experience with them. And I won't deny that CVE-2016-0777 has made me jumpy about this area.)

SSHKeysAreInfoLeak written at 03:25:21

