The modern web and (alleged) software stagnation in the past few decades

January 3, 2021

I was recently reading The Great Software Stagnation (via), which puts forward a simple thesis:

Software is eating the world. But progress in software technology itself largely stalled around 1996. [...]

I have a number of reactions to that, but one of them is that one specific and obvious area of software technology has progressed hugely in the last 24 years, or even the last ten, and that is the 'application' web (which these days is not just Javascript but also CSS and HTML features that allow interactivity, animation, and so on). What you can do with the web today is quietly astounding, not just by the web's past standards but by any standard.

Back in 1996, software technology might have allowed you to build a global, high-detail map as an application delivered on CD-ROM (not DVD, not in 1996). But you definitely wouldn't have been able to have that map on almost any device, and have it updated frequently, and have high-resolution satellite views of much of the West included (and probably not all of the map details, either). Nor would you likely have been able to include interactive, highly responsive route planning, including for bicycling.

(If you consider the backend systems for this web application as well, much of the software technology necessary to operate them likely postdates 1996 as well.)

Maps are everyone's go-to example of web application technology, but I have another one that is much closer to home for me. Here in 2021, I can easily deliver to my co-workers (in a very small organization) a whole set of custom monitoring dashboards with custom graphs, information tables, and other visualization displays that I can update frequently and that are available on basically any computer you care to name (this would be our Grafana dashboards). There's an entire ecology of software technologies that enables all of this, and almost none of them existed in 1996 in any meaningful form.
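(As one small, concrete illustration of that ecology: essentially everything about such a setup is scriptable over plain HTTP and JSON, themselves part of the post-1996 stack. The following is a minimal TypeScript sketch, using the built-in fetch of Node 18+, that lists dashboards through Grafana's /api/search HTTP API endpoint; the instance URL and the API token here are made-up placeholders, not anything from our real setup.)

    // Minimal sketch: list the dashboards a Grafana instance knows about.
    // GRAFANA_URL and the token are hypothetical placeholders.
    const GRAFANA_URL = "https://grafana.example.org";
    const TOKEN = process.env.GRAFANA_TOKEN ?? "";

    async function listDashboards(): Promise<void> {
      // Grafana's HTTP API exposes dashboard metadata via GET /api/search.
      const resp = await fetch(`${GRAFANA_URL}/api/search?type=dash-db`, {
        headers: { Authorization: `Bearer ${TOKEN}` },
      });
      if (!resp.ok) {
        throw new Error(`Grafana API request failed: ${resp.status}`);
      }
      const dashboards = (await resp.json()) as Array<{ uid: string; title: string; url: string }>;
      for (const d of dashboards) {
        console.log(`${d.uid}  ${d.title}  (${d.url})`);
      }
    }

    listDashboards().catch((err) => console.error(err));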

(I will argue that not even Javascript existed in 1996 in meaningful form; the Javascript of 1996 is significantly different from the Javascript of these past five years or so.)
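To make that gap concrete, here is a hedged sketch of entirely routine modern JavaScript (written as TypeScript, with a made-up metrics URL). Essentially every language feature in it postdates 1996: Promises and async/await, fetch, destructuring, arrow functions, and template literals.

    // Routine modern code; none of these language features existed in 1996.
    async function latestPeak(url: string): Promise<number> {
      const resp = await fetch(url);                                   // fetch: circa 2015
      const { values } = (await resp.json()) as { values: number[] };  // destructuring: ES2015
      return values.reduce((max, v) => Math.max(max, v), -Infinity);   // arrow functions: ES2015
    }

    // The URL is a hypothetical placeholder.
    latestPeak("https://metrics.example.org/cpu.json")
      .then((v) => console.log(`latest peak: ${v}`))                   // template literals: ES2015
      .catch((err) => console.error(err));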

Could you have theoretically done this in 1996? Yes. Could I have practically done this in 1996? No. The web's software technologies have made it possible to build these dashboards, and the sea change in the viability of the web itself has made it possible to deliver them (including ongoing updates to how the dashboards work, adding new dashboards, and so on).

(There were monitoring dashboards in 1996, and I know the university had some of them, watched by operators in our central machine room. But they were not delivered over the web, and I'm pretty certain they were very expensive enterprise software that was much more time-consuming (and expensive) to customize and operate than our setup.)

These are not the only web applications that more or less couldn't have existed in 1996 in any form. Even some ostensibly plain websites could not have existed with 1996 software technology even if you gave them 2020 hardware technology, because of their sheer scope. People have been talking to each other over the Internet for a long time (something I'm very familiar with), but Twitter's global scale and activity create a whole new set of problems that require post-1996 software technology to deal with, often in areas that are genuinely new.

(Much of this software technology is less obviously sexy than new languages with new models of programming. But it's also quite sophisticated and represents real progress in the state of the art in things like distributed consensus and large scale data stores.)

In looking at all of this, I'm strongly reminded of another article I read recently, Dan Luu's Against essential and accidental complexity. This basically takes the other side of the 'things have stalled' argument by walking through some drastic changes in programmer productivity over the past several decades. Dan Luu's starting point is roughly 1986 for reasons covered in the article, but many of the changes Luu points to are from after 1996.

PS: Another web-related area where software technology has made huge strides since 1996 is almost everything related to cryptography. My strong impression is that much of this progress has been driven by the existence of the HTTPS web, due to the web being where most cryptography is used (or more broadly TLS, whose development is driven by the web even though it's used beyond it).
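(As a small demonstration of TLS being used outside a browser, here is a minimal sketch in TypeScript using Node's built-in node:tls module; the hostname is just an example. The TLS 1.3 it will most likely negotiate dates from 2018, well after the SSLv3 of 1996.)

    // Minimal sketch: a TLS connection made outside any browser.
    import * as tls from "node:tls";

    const host = "example.org";
    const socket = tls.connect(443, host, { servername: host }, () => {
      const cert = socket.getPeerCertificate();
      // getProtocol() reports the negotiated version, e.g. "TLSv1.3".
      console.log(`${socket.getProtocol()} connection to ${host}`);
      console.log(`certificate CN: ${cert.subject.CN}, expires: ${cert.valid_to}`);
      socket.end();
    });
    socket.on("error", (err) => console.error(err));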


Comments on this page:

I agree with the sentiment of the essay, but its examples and arguments are all poor. Jon Blow made a much more thorough and convincing argument in Preventing the Collapse of Civilization. His counterargument about Google Maps would be that it's not software innovation but hardware innovation. You couldn't transport this software back to 1996 and hope to run it on hardware of the era, at least not without essentially re-engineering it from scratch. The current version already runs poorly, if at all, on computers from 2011. Software has gotten a free ride on hardware innovation (i.e. Moore's law), which is why software generally feels no faster today than it did in 1996 despite hardware being literally 10,000 times faster.

By frankg at 2021-01-03 23:40:33:

If computation is better because hardware is better (not disagreeing), and hardware is better because the laws of miniaturization have dictated it will be (looking at you, Moore), have we had any real innovation at all? Sure, the web and everything is better, faster, stronger, but that's expected. It feels like a pause, or just catching up with capability.

By cks at 2021-01-04 00:02:08:

I'm pretty certain that if you gave 1996 the client and server hardware that's used to access and run modern sites, both large ones and smaller ones like our Grafana dashboards, 1996 would not be able to build equivalent services with anything short of major multi-year efforts. Some of this would be incremental software improvements that you might argue were presaged by technology that existed by 1996 (even if it wasn't necessarily widespread), but I'm pretty sure some significant software technologies would have to be invented from scratch.

(For mapping and so on, I'll assume that 1996 gets the data along with the hardware.)

I agreed with the original article. Seeing cretins boast about how C compiler error messages and other garbage have improved since then only reveals their ignorance.

> But you definitely wouldn't have been able to have that map on almost any device

That is, almost any device which runs one of two operating systems. It's much less impressive when stated this way. The world isn't the WWW, and its browsers are only growing worse.

> and have it updated frequently

Don't forget the surveillance opportunities this enables.

> and have high-resolution satellite views of much of the West included (and probably not all of the map details, either).

That's a hardware advancement.

Building WWW garbage which requires me to install a giant WWW browser, or perhaps the one other browser kept around to avoid some criticism of being a monopoly, is hardly anything good.

> Twitter's global scale and activity create a whole new set of problems that require post-1996 software technology to deal with, often in areas that are genuinely new.

Giant centralized systems eat resources like a fat child eats candy, sure. That doesn't make it impressive, and the world would be better without that horrible website anyway.

> My strong impression is that much of this progress has been driven by the existence of the HTTPS web, due to the web being where most cryptography is used

None of that cryptography is sufficient for hiding from a government or sufficiently-well connected business, but it's good enough to protect machines from purchasers and ensure advertisements aren't modified, certainly.

By Nieve at 2021-01-04 01:00:06:

@cks

I think the data giveaway is even bigger than it might look, since the infrastructure to capture and update the map data continuously didn't exist in any relevant form in 1996. Drivers' map books were still a thing, and you were lucky if a change from last year made it into this year's edition (if they were even yearly). Mapping software and sites with map data weren't much better. Catching a temporary road closure or reroute was very rare, and traffic data was mostly limited to electronic road signs & radio reports. This isn't just Google, or Bing, or OSM & Apple; it's also all those state and local agencies working together to get data distributed quickly in reasonably consistent formats. You're absolutely right, no amount of 2021 hardware is going to make that happen.
