What X11's TrueColor means (with some history)

November 11, 2017

If you've been around X11 long enough and peered under the hood a bit, you may have run across mentions of 'truecolor'. If you've read through the manual pages for window managers with a sufficiently long history, such as fvwm, you may also have run across mentions of 'colormaps'. Perhaps you're wondering what the background of these oddities is.

Today, pixels are represented with one byte (8 bits) for each RGB color component, and perhaps another byte for the transparency level ('alpha'), partly because that makes each pixel 32 bits (4 bytes) and computers like 32-bit things much better than they like 24-bit (3-byte) things. However, this takes up a certain amount of memory. For instance, a simple 1024 by 768 display with 24 bits per pixel takes up just over 2 megabytes of RAM. Today 2 MB of RAM is hardly worth thinking about, but in the late 1980s and early 1990s it was a different matter entirely. Back then an entire workstation might have only 16 MB of RAM, and that RAM wasn't cheap; adding another 2 MB for the framebuffer would drive the price up even more. At the same time, people wanted color displays instead of black and white ones and were certainly willing to pay a certain amount extra for Unix workstations that had them.
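
(To make the arithmetic concrete, here's a small illustrative C program of my own, not anything from X11, that works out the framebuffer memory at a few depths:)

    #include <stdio.h>

    /* Illustrative arithmetic: framebuffer memory for a 1024x768
       display at several bytes-per-pixel settings. */
    int main(void) {
        const long width = 1024, height = 768;
        const struct { const char *name; long bpp; } modes[] = {
            { "8-bit indexed (PseudoColor)", 1 },
            { "16-bit packed RGB",           2 },
            { "24-bit RGB (TrueColor)",      3 },
            { "32-bit RGBA",                 4 },
        };
        for (int i = 0; i < 4; i++) {
            long bytes = width * height * modes[i].bpp;
            printf("%-30s %8ld bytes (%.2f MB)\n",
                   modes[i].name, bytes, bytes / (1024.0 * 1024.0));
        }
        return 0;
    }

(At 3 bytes per pixel this prints 2359296 bytes, which is the 'just over 2 megabytes' above.)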

If three bytes per pixel is too much RAM, there are at least two straightforward options. The first is to shrink how many bits you give to each color component; instead of 8-bit colour, you might do 5-bit color, packing a pixel into two bytes. The problem is that the more memory you save, the fewer colors and especially shades of gray you have. At 5-bit colour you're down to 32 shades of gray and only 32,768 different possible colors, and you've only saved a third of your framebuffer memory. The second is to do the traditional computer science thing of adding a layer of indirection. Instead of each pixel directly specifying its colour, it specifies an index into a colormap, and the colormap entry holds the actual RGB color. The most common choice here is to use a byte for each pixel and thus to have a 256-entry colormap, with each entry being a full '24-bit' colour (ie, 8-bit RGB color components). The colormap itself requires less than a kilobyte of RAM, your 1024 by 768 screen only needs a more tolerable (and affordable) 768 KB of RAM, and you still have your choice out of 16 million colors; it's just that you can only have 256 different colors at once.
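
(Here's a rough sketch in C of both options; the function names and the exact bit layout are mine and purely illustrative, since real hardware varied in how it laid things out:)

    #include <stdint.h>

    /* Option 1: shrink the components. Pack 5-bit R, G, and B into a
       single 16-bit pixel; each 8-bit component keeps its top 5 bits.
       This uses 15 of the 16 bits; some hardware gave the spare bit
       to green instead ('RGB565'). */
    uint16_t pack_rgb555(uint8_t r, uint8_t g, uint8_t b) {
        return (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
    }

    /* Option 2: indirection. A pixel is a one-byte index into a
       256-entry colormap; only the colormap entries hold actual
       24-bit RGB colors. */
    struct rgb { uint8_t r, g, b; };
    static struct rgb colormap[256];

    struct rgb pixel_color(uint8_t pixel) {
        return colormap[pixel];  /* the pixel value itself is not a color */
    }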

(Well, sort of, but that's another entry.)

This 256-color indirect color mode is what was used for all affordable colour Unix workstations in the 1980s and most of the 1990s. In X11 terminology it's called a PseudoColor display, presumably because the pixel 'colour' values were not actually colors but instead were indexes into the colormap, which had to be maintained and managed separately. However, if you had a lot of money, you could buy a Unix workstation with a high(er) end graphics system that had the better type of color framebuffer, where every pixel directly specified its RGB color. In X11 terminology, this direct mapping from pixels to their colors is a TrueColor display (presumably because the pixel values are their true color).
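
(These categories live on as X11 'visual classes', and you can still ask an X server which ones it offers using the standard Xlib call XGetVisualInfo. A minimal sketch; compile with 'cc list-visuals.c -lX11':)

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    /* List every visual the X server offers, with its depth and
       visual class (PseudoColor, TrueColor, and so on). */
    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        XVisualInfo tmpl;
        int n;
        XVisualInfo *vis = XGetVisualInfo(dpy, VisualNoMask, &tmpl, &n);
        const char *classes[] = { "StaticGray", "GrayScale", "StaticColor",
                                  "PseudoColor", "TrueColor", "DirectColor" };
        for (int i = 0; i < n; i++)
            printf("visual 0x%lx: depth %d, class %s\n",
                   vis[i].visualid, vis[i].depth, classes[vis[i].class]);
        XFree(vis);
        XCloseDisplay(dpy);
        return 0;
    }

(On a modern X server you'll typically see only TrueColor, and perhaps DirectColor, visuals.)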

(My memory is that truecolor systems were often called 24-bit color and pseudocolor systems were called 8-bit color. Depending on your perspective this isn't technically correct, but in practice everyone reading descriptions of Unix workstations at the time understood what both meant.)

Directly mapped 'truecolor' graphics supplanted indirect pseudocolor graphics sometime in the late 1990s, with the growth of PCs (and the steady drop in RAM prices, which made two extra bytes per pixel increasingly affordable). It's probably been at least 15 years since you could find a pseudocolor graphics system on (then) decent current hardware; these days, 'truecolor' is basically the only colour model. Still, the terminology lingers on in X11, ultimately because X11 is at its heart a very old system and is still backward compatible to those days (at least in theory).

(I suspect that Wayland does away with all of the various options X11 has here and only supports the directly mapped truecolor model (probably with at least RGB and YUV). That would certainly be the sane approach.)

PS: It's true that in the late 1990s, you could still find Sun and perhaps SGI selling workstations with pseudocolor displays. This wasn't a good thing and contributed to the downfall of dedicated Unix workstations. At that point, decent PCs were definitely using truecolor 24-bit displays, which was part of what made PCs more attractive and most of the dedicated Unix workstations so embarrassing.

(Yes, I'm still grumpy at Sun about its pathetic 1999-era 'workstations'.)
