The modern HTTPS world has no place for old web servers

May 13, 2020

When I ran into Firefox's interstitial warning for old TLS versions, it wasn't where I expected, and where it happened gave me some tangled feelings. I had expected to first run into this on some ancient appliance or IPMI web interface (both of which are famous for this sort of thing). Instead, it was on the website of an active person who had been mentioned in a recent comment here on Wandering Thoughts. On the one hand, this is a situation where they could have kept their web server up to date. On the other hand, this demonstrates (and brings home) that the modern HTTPS web actively requires you to keep your web server up to date in a way that the HTTP web didn't. In the era of HTTP, you could have set up a web server in 2000 and it could still be running today, working perfectly well (even if it didn't support the very latest shiny thing). This doesn't work for HTTPS, not today and not in the future.

In practice there are a lot of things that have to be maintained on a HTTPS server. First, you have to renew TLS certificates or automate their renewal (in practice you've probably had to change how you get TLS certificates several times). Even with automated renewals, Let's Encrypt has changed their protocol once already, deprecating old clients and thus old configurations, and will probably do that again someday. And now you have to keep reasonably up to date with web server software, TLS libraries, and TLS configurations on an ongoing basis, because I doubt that the deprecation of everything before TLS 1.2 will be the last such deprecation.
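One way to see this deprecation treadmill directly is to ask a server which TLS versions it will still negotiate. Here is a minimal sketch using Python's standard ssl module; the host name is a placeholder, and note that modern OpenSSL builds may refuse TLS 1.0/1.1 on the client side regardless of what the server supports, so a "rejected" result can come from either end.

```python
import socket
import ssl

def probe_tls_version(host, port, version):
    """Return True if the server will negotiate exactly this TLS version.

    A rough diagnostic sketch, not a production scanner.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # Pin both ends of the allowed range to a single protocol version.
    ctx.minimum_version = version
    ctx.maximum_version = version
    # Disable certificate checks: we only care about protocol
    # negotiation here, not whether the certificate is valid.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version() is not None
    except (ssl.SSLError, OSError):
        return False
```

For example, `probe_tls_version("example.com", 443, ssl.TLSVersion.TLSv1_2)` should succeed for an actively maintained site, while the same probe with `ssl.TLSVersion.TLSv1` increasingly fails, which is exactly the deprecation the interstitial warning is about.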

I can't help but feel that there is something lost with this. The HTTPS web probably won't be a place where you can preserve old web servers, for example, the way the HTTP web is. Today if you have operating hardware you could run a HTTP web server from an old SGI Irix workstation or even a DEC Ultrix machine, and every browser would probably be happy to speak HTTP 1.0 or the like to it, even though the server software probably hasn't been updated since the 1990s. That's not going to be possible on the HTTPS web, no matter how meticulously you maintain old environments.
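Part of why that old HTTP world could stand still is how simple the protocol is: an HTTP/1.0 exchange is a few lines of plain text over a socket, and a server from the 1990s and a client written today can still meet in the middle. A small sketch of that exchange (the host name in the usage example is a placeholder):

```python
import socket

def build_http10_request(host, path="/"):
    # HTTP/1.0 didn't require a Host header, but sending one keeps
    # name-based virtual hosts happy; the whole request is plain text.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"

def http10_get(host, port=80, path="/"):
    """Fetch a page the way a 1990s client would.

    In HTTP/1.0 the server simply closes the connection when it has
    sent the response, so we read until end of stream.
    """
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(build_http10_request(host, path).encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)
```

Something like `http10_get("example.com")` is the entire client side of the protocol; there is no handshake, no certificate chain, and nothing that expires. That is exactly the property HTTPS gives up.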

Another, more relevant side of this is that it's not going to be possible for people with web servers to just let them sit. The more the HTTPS world changes and requires you to change, the more your HTTPS web server requires ongoing work. If you ignore it and skip that work, what happens to your website is first the interstitial warning that I experienced, and eventually browsers will stop accepting it at all. I expect that this is going to drive more people into the arms of large operations (like Github Pages or Cloudflare) that will look after all of that for them, and a little bit more of the indie 'anyone can do this' spirit of the old web will fade away.

(At the same time this is necessary to keep HTTPS secure, and HTTPS itself is necessary for the usual reasons. But let's not pretend that nothing is being lost in this shift.)

Comments on this page:

By Blue_Monk at 2020-05-13 06:02:33:

How about using a pfSense firewall with HAProxy and certbot enabled? Not much to do after it's set up. Maybe update it every now and then, but that's it. You can serve old-school websites all day long and still comply with the new rules. I'm what you would call a noob, but it still takes me only 30 minutes to get it set up and ready to go for a basic one-domain setup (maybe an hour if I set up a few subdomains too).

From at 2020-05-13 06:43:22:

I wonder if some people will simply say "forget this", run plain-HTTP, and just deal with the 'this site is insecure' icon that most web browsers have nowadays.

By wvh at 2020-05-13 09:49:42:

Isn't that what happened to email a long time ago? Some big players, and a lot of work and uncertainty for small fish.

I remember setting all that stuff up in the nineties, and even though my responsibilities have moved nowadays for other reasons, I'm happy I don't have to babysit email servers in this day and age.

Yet, it would be nice to be able to keep the playing field open to all.

By David Waite at 2020-05-13 12:32:48:

If you set up a HTTP server in 2000 and then forgot about it, that server has likely been thoroughly defaced through known remotely exploitable compromises, and was likely taken offline by whatever hosting provider you used due to the volume of malicious traffic coming from it.

If the content and hosting of that website somehow survived, you might also by now have received over a decade of complaints from people not being able to see the Adobe Flash or Java applets you added to spruce up your content. You might also find that most modern browsers don't render the site correctly because you originally styled things based on Internet Explorer 4-5.

I don't believe it has ever been possible to stand still within a network and not accumulate layers of rust and dust.

By Andrew Reid at 2020-05-13 14:23:26:

Ran into this just today. In my case this was due to an ancient Dell PE2900 server and the associated web server for managing the PERC RAID card. It used a self-signed certificate and TLS 1.0; nobody wanted to talk to it.

I had to dig out an old laptop with an equally ancient browser so I could replace a disk. And this isn't even the DRAC interface, which uses a long-forgotten version of Java (!!!) that barely worked on the best of days ten years ago.

By Dan at 2020-05-13 18:06:19:

@ Well, that'll work for a few years, but my impression is that browsers don't intend "not secure" warnings for HTTP as the final step. Eventually there'll be an interstitial warning page with stronger language. At some point the "continue anyway" button on that page will go away, and you'll have to change a configuration setting. Likely in about:config or the like, they won't expose it in general preferences. At some further point they'll refuse to talk to any server that won't do TLS, no matter what.

Though given that extensive HTTPS adoption is one of the few bright spots in the security and privacy realm, I can't say I feel that bad about this. There's plenty of other "a valuable intangible quality has been lost" areas that I feel worse about. And also a lot of areas of software where there's far more churn and maintenance headaches than web-server configuration. (One of which may be what you put in the pages the web server is serving, if they're anything other than static files. But that's a separate issue.)

We should never abandon HTTP 1.0/1.1 in browsers. With HTTPS we can go "forward", but we can never go "back". When something goes wrong we need simpler tools like plain HTTP to fix or work around it. Things always go wrong, and you should not build yourself into a corner.

For an interesting (and socially important) side thought: the death of website owners. Simple websites, using things like static pages and simpler protocols like HTTP, are less of a maintenance burden. It might be possible to convince their hosts to keep a site going if it's a simple one (either pro bono or by throwing them a few dollars occasionally), but regardless, if you ever hope that someone (anyone) will keep an old site online, it's best to make their job as easy and simple as possible :)

Sadly there are some cases where the HTTP server itself has security or reliability flaws. Changing HTTP server can be anything from a nightmare to a no-op, depending on how the website is implemented. Hint to website devs: try to make your sites as HTTP server independent as possible, it helps everyone (including you) fix things down the track. Use common scripting interfaces, not ones where there is only a single implementation, and test across a few hosts. Some HTTP proxies might work-around some issues, but not always.

By mobiushorizons at 2020-05-13 19:08:30:

While I agree this is unfortunate, I think the public internet is too malicious a place to leave servers without updating (regardless of whether you have TLS support). Old hardware and non-standard operating systems will help somewhat to avoid the script kiddies and automated tools, but I think there is a real case to be made that it is irresponsible to leave servers unpatched on the internet, not just for your own sake but also for the sake of others on the internet. Your users could be getting malware if you are hacked, or your servers could be used in a DDoS attack (for instance).

Written on 13 May 2020.


Last modified: Wed May 13 00:25:58 2020