Classical "Single user computers" were a flawed or at least limited idea

May 15, 2025

Every so often people yearn for a lost (1980s or so) era of 'single user computers', whether these are simple personal computers or high end things like Lisp machines and Smalltalk workstations. It's my view that the whole idea of a 1980s style "single user computer" is not what we actually want and has some significant flaws in practice.

The platonic image of a single user computer in this style was one where everything about the computer (or at least its software) was open to your inspection and modification, from the very lowest level of the 'operating system' (which was more of a runtime environment than an OS as such) to the highest things you interacted with (both Lisp machines and Smalltalk environments often touted this as a significant attraction, and it's often repeated in stories about them). In personal computers this was a simple machine that you had full control over from system boot onward.

The problem is that this unitary, open environment is (or was) complex and often lacked resilience. Famously, in the case of early personal computers, you could crash the entire system with programming mistakes, and if there's one thing people do all the time, it's make mistakes. Most personal computers mitigated this by only doing one thing at once, but even then it was unpleasant, and the Amiga would let you blow multiple processes up at once if you could fit them all into RAM. Even on better protected systems, like Lisp and Smalltalk, you still had the complexity and connectedness of a unitary environment.

One of the things that we've learned from computing over the past N decades is that separation, isolation, and abstraction are good ideas. People can only keep track of so many things in their heads at once, and modularity (in the broad sense) is one large way we keep things within that limit (or at least closer to it). Single user computers were quite personal but usually not very modular. There are reasons that people moved to computers with things like memory protection, multiple processes, and various sorts of privilege separation.
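As a concrete illustration of what that memory protection buys you, here is a minimal POSIX-only sketch in Python (my example, not from the post): a child process dereferences a wild pointer, the kind of mistake that could take down an entire 1980s machine, and the kernel kills only that one process.

```python
import ctypes
import os
import signal

# POSIX-only sketch: fork a child and have it dereference a null pointer.
# On an unprotected single-user machine this mistake could corrupt or hang
# the whole system; with an MMU and per-process address spaces, the kernel
# kills just the offending process and the parent carries on.
pid = os.fork()
if pid == 0:
    ctypes.string_at(0)   # read from address 0 -> SIGSEGV in the child
    os._exit(0)           # never reached
else:
    _, status = os.waitpid(pid, 0)
    killed_by = os.WTERMSIG(status) if os.WIFSIGNALED(status) else None
    print("parent survived; child killed by signal", killed_by)
```

The parent process observes the child's death as a signal status rather than experiencing the crash itself, which is exactly the kind of boundary early personal computers lacked.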

(Let us not forget the great power of just having things in separate objects, where you can move around or manipulate or revert just one object instead of 'your entire world'.)

I think that there is a role for computers that are unapologetically designed to be used by only a single person who is in full control of everything and able to change it if they want to. But I don't think any of the classical "single user computer" designs are how we want to realize a modern version of the idea.

(As a practical matter I think that a usable modern computer system has to be beyond the understanding of any single person. There is just too much complexity involved in anything except very restricted computing, even if you start from complete scratch. This implies that an 'understandable' system really needs strong boundaries between its modules so that you can focus on the bits that are of interest to you without having to learn lots of things about the rest of the system or risk changing things you don't intend to.)


Comments on this page:

I grew up in NL having access to an Atari ST, and the (German) Atari ST Profibuch which in my memory covered all there was to these machines and taught me a bit of German too. It was a great combination to learn about computers and worked very well because this book would apply in full to all computers of that series. These machines were simple enough that as a twelve year old, I was able to add a physical pause switch to the computer that would halt the CPU, so I could join dinner without losing progress in the game I was playing. This required soldering on the motherboard! This kind of easy hackery is no longer possible with modern computers, and there is so much complexity that it is easy to get lost in details and not see the whole picture or give up because of that complexity.

For enthusing / teaching a kid we now have things like Arduino and ESP, with the benefit that they are ridiculously cheap. I think these are the modern equivalent of single person computers, except they are not computers :)

Now to find the ESP8266 Profibuch equivalent!

Sure, but the accreted complexity of computing nowadays makes it hard for beginners. I learned on an Apple ][+, a very simple machine. Nowadays even a Raspberry Pi is a very complex machine with orders of magnitude more stuff to learn, including path-dependent design choices that are much less comprehensible to someone who wasn't there when they happened.

The last computer I could fully understand, from the processor up to the ROM and the operating system, was an Apple II running DOS 3.3. It's been downhill from there.

I can appreciate the issues with a Smalltalk (or Lisp) environment where everything is exposed to modification at all times. In Smalltalk you used to be able to assign true := false and lock up the image immediately (nothing would happen because the machine entered a philosophical crisis when it realized nothing was true).
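A rough modern sketch of the same "everything is one open namespace" hazard, using Python as a stand-in (my analogy, not part of the Smalltalk story): the keywords True/False are protected, but builtins remain a single writable namespace shared by every module, much like a live image.

```python
import builtins

# Rebinding a builtin affects every caller in the whole process, the way
# reassigning `true` poisoned an entire Smalltalk image. Python at least
# lets us put the original back afterwards.
original_len = builtins.len
builtins.len = lambda seq: 0        # now every call to len() is lied to
broken = len([1, 2, 3])             # -> 0, not 3
builtins.len = original_len         # restore the world
print(broken, len([1, 2, 3]))
```

The difference is one of degree: the damage here is process-wide rather than machine-wide, which is exactly the isolation boundary the post is arguing for.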

But what if the environment is inspectable, but not immediately changeable? Even though we don't fully understand the video compositing pipeline and how it uses the hundreds of GPU cores, or how the CPU addresses the memory those GPU cores think is the whole universe, we can still explore the parts we care about.

Anyway, not having a workable web browser and an SSH client on Smalltalk rules it out as my daily driver.

By Nobody in particular at 2025-05-16 14:01:21:

ISTM you're painting with too broad a brush. Early PC systems were too resource-constrained for abstraction/modularity/etc. to be practical, and so they were programmed in error-prone, low-level languages. Because the hardware also didn't do much by way of protection, isolation, or preemptive multitasking, many sorts of errors could have arbitrarily large impact.

A couple years ago I bought a Lisp Machine (out of intellectual curiosity). It's not at all accurate to say there's no separation, isolation, abstraction, or modularity in that system. For instance, programs don't draw on the screen by directly twiddling video hardware or write a file by directly poking a hard drive. Instead, programs call well-defined, high-level user interface or file system APIs, respectively. It's true that programs can interfere with one another or corrupt the system, but that doesn't tend to happen due to a coding accident or user error: no more so than a forkbomb does, I'd say. Unless you're deliberately trying to screw things up, when you make a mistake you will land in a debugger (where you might be able to fix things and resume), but other programs will keep running. The LispM is neither secure nor resilient in any modern sense, but it's astonishingly robust for a system that omitted conventional timesharing features.

(I don't think the old-school LispMs make sense as end-user systems in the modern world, though. I think one could argue they'd make sense as development environments for Unikernel-style VMs, which, modulo the "V", is basically what the LispM vendors hoped people would use them for.)

By Walex at 2025-05-17 09:21:36:

The arguments above seem to me to have two limitations:

  • The assumption seems to be that cellphones and tablets are not computers.
  • I suspect there is a confusion between "single environment" and "single user": the Lisp Machines were "single environment", while, conversely, Amiga/MacOS were "single user".

By mikeful at 2025-05-19 11:02:25:

People don't want single user or simple computers because they feel limited by isolation or other nice modern features. They want them because currently Windows, MacOS and some Linux distributions (looking at you, Ubuntu) try to make the computer their platform and not yours.
