What X11's TrueColor means (with some history)

November 11, 2017

If you've been around X11 long enough and peered under the hood a bit, you may have run across mentions of 'truecolor'. If you've also read through the manual pages for window managers with a sufficiently long history, such as fvwm, you may also have run across mentions of 'colormaps'. Perhaps you're wondering what the background of these oddities is.

Today, pixels are represented with one byte (8 bits) for each RGB color component, and perhaps another byte for the transparency level ('alpha'), partly because that makes each pixel 32 bits (4 bytes) and computers like 32-bit things much better than they like 24-bit (3-byte) things. However, this takes up a certain amount of memory. For instance, a simple 1024 by 768 display with 24 bits per pixel takes up just over 2 megabytes of RAM (2.25 MB). Today 2 MB of RAM is hardly worth thinking about, but in the late 1980s and early 1990s it was a different matter entirely. Back then an entire workstation might have only 16 MB of RAM, and that RAM wasn't cheap; adding another 2 MB for the framebuffer would drive the price up even more. At the same time, people wanted color displays instead of black and white and were certainly willing to pay a certain amount extra for Unix workstations that had them.
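
To make the arithmetic concrete, here's a small illustrative C calculation of the framebuffer sizes involved (the numbers are nothing more than the obvious multiplication):

    #include <stdio.h>

    /* Illustration only: framebuffer memory for a 1024 by 768 display
       at three and four bytes per pixel. */
    int main(void)
    {
        const long pixels = 1024L * 768L;                     /* 786,432 pixels     */
        printf("3 bytes/pixel: %ld KB\n", pixels * 3 / 1024); /* 2304 KB, ~2.25 MB  */
        printf("4 bytes/pixel: %ld KB\n", pixels * 4 / 1024); /* 3072 KB, 3 MB      */
        return 0;
    }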

If three bytes per pixel is too much RAM, there are at least two straightforward options. The first is to shrink how many bits you give to each color component; instead of 8-bit colour, you might do 5-bit color, packing a pixel into two bytes. The problem is that the more memory you save, the fewer colors and especially shades of gray you have. At 5-bit colour you're down to 32 shades of gray and only 32,768 different possible colors, and you've only saved a third of your framebuffer memory. The second is to do the traditional computer science thing by adding a layer of indirection. Instead of each pixel directly specifying its colour, it specifies an index into a colormap, which maps to the actual RGB color. The most common choice here is to use a byte for each pixel and thus to have a 256-entry colormap, with '24 bit' colour (ie, 8-bit RGB color components). The colormap itself requires less than a kilobyte of RAM, your 1024 by 768 screen only needs a more tolerable (and affordable) 768 KB of RAM, and you can still have your choice out of 16 million colors; it's just that you can only have 256 different colors at once.

(Well, sort of, but that's another entry.)
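
If it helps to picture the indirection concretely, here's a rough illustrative sketch in C of how an 8-bit indexed framebuffer and its colormap relate (real hardware did this lookup for you on the way to the screen, of course):

    #include <stdint.h>

    /* Illustrative sketch of 8-bit indexed colour: each framebuffer byte is
       an index into a 256-entry colormap that holds the real RGB values. */
    #define WIDTH  1024
    #define HEIGHT 768

    struct rgb24 { uint8_t r, g, b; };

    static uint8_t      framebuffer[WIDTH * HEIGHT];  /* 768 KB: one byte per pixel */
    static struct rgb24 colormap[256];                /* under 1 KB of RGB entries  */

    /* The colour actually displayed for a pixel comes via the colormap. */
    static struct rgb24 displayed_colour(int x, int y)
    {
        return colormap[framebuffer[y * WIDTH + x]];
    }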

This 256-color indirect color mode is what was used for all affordable colour Unix workstations in the 1980s and most of the 1990s. In X11 terminology it's called a PseudoColor display, presumably because the pixel 'colour' values were not actually colors but instead were indexes into the colormap, which had to be maintained and managed separately. However, if you had a lot of money, you could buy a Unix workstation with a high(er) end graphics system that had the better type of color framebuffer, where every pixel directly specified its RGB color. In X11 terminology, this direct mapping from pixels to their colors is a TrueColor display (presumably because the pixel values are their true color).

(My memory is that truecolor systems were often called 24-bit color and pseudocolor systems were called 8-bit color. Depending on your perspective this isn't technically correct, but in practice everyone reading descriptions of Unix workstations at the time understood what both meant.)

Directly mapped 'truecolor' color graphics supplanted indirect pseudocolor graphics sometime in the late 1990s, with the growth of PCs (and the steady drop in RAM prices, which made two extra bytes per pixel increasingly affordable). It's probably been at least 15 years since you could find a pseudocolor graphics system on then-current decent hardware; these days, 'truecolor' is basically the only colour model. Still, the terminology lingers on in X11, ultimately because X11 is at its heart a very old system and is still backward compatible with those days (at least in theory).
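
You can still see the old terminology right there in the Xlib API. As a small sketch (illustrative only; compile with -lX11), a program can ask the server whether it offers a 24-bit TrueColor visual or an 8-bit PseudoColor one:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    /* Small sketch: ask the X server what sort of visual it can give us. */
    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        XVisualInfo vinfo;

        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        if (XMatchVisualInfo(dpy, DefaultScreen(dpy), 24, TrueColor, &vinfo))
            printf("24-bit TrueColor visual: id 0x%lx\n", vinfo.visualid);
        else if (XMatchVisualInfo(dpy, DefaultScreen(dpy), 8, PseudoColor, &vinfo))
            printf("8-bit PseudoColor visual: id 0x%lx\n", vinfo.visualid);
        else
            printf("no TrueColor or PseudoColor visual found\n");
        XCloseDisplay(dpy);
        return 0;
    }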

(I suspect that Wayland does away with all of the various options X11 has here and only supports the directly mapped truecolor model (probably with at least RGB and YUV). That would certainly be the sane approach.)

PS: It's true that in the late 1990s, you could still find Sun and perhaps SGI selling workstations with pseudocolor displays. This wasn't a good thing and contributed to the downfall of dedicated Unix workstations. At that point, decent PCs were definitely using truecolor 24-bit displays, which was part of what made PCs more attractive and most of the dedicated Unix workstations so embarrassing.

(Yes, I'm still grumpy at Sun about its pathetic 1999-era 'workstations'.)


Comments on this page:

This direct mapping from pixels to their colors is a TrueColor display (presumably because the pixel values are their true color).

No, when pixel values directly encode the color, this is simply called direct color (appropriately enough).

TrueColor was sort of a marketing term – akin to Retina now. The coinage referred to the fact that 24-bit direct color displays can display more than 10 million colors simultaneously (if you had that many pixels!), which was widely touted as the number of colors the human eye can distinguish.

In actual fact, that story is far more complicated, but the marketing was appropriate insofar as 16-bit direct color (called HighColor) only comes close to being enough for photography – at that color resolution, color banding is still a prominent problem. With 24-bit direct color, it largely disappears from view (and the necessity of dithering with it).
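
For illustration, a common 16-bit layout (RGB565) packs 5 bits of red, 6 of green and 5 of blue into each pixel, giving only 32 or 64 distinct levels per channel – which is exactly where the banding comes from. A quick sketch:

    #include <stdint.h>

    /* Illustrative 16-bit "HighColor" packing (the common RGB565 layout):
       5 bits of red, 6 of green, 5 of blue per pixel. */
    static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    static void unpack565(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
    {
        *r = (uint8_t)(((p >> 11) & 0x1f) << 3);  /* 5 bits: 32 levels of red   */
        *g = (uint8_t)(((p >>  5) & 0x3f) << 2);  /* 6 bits: 64 levels of green */
        *b = (uint8_t)(( p        & 0x1f) << 3);  /* 5 bits: 32 levels of blue  */
    }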

I have no idea why PseudoColor was called that. My uninformed guess would be that it’s because the colormap entries are still 24 bit wide – making it a kind of TrueColor-but-not-really display. (Because in the rest of the computing world, colormaps were often coarser. The 256 entries in the VGA palette were 18 bit wide, and the 16 entries in the EGA palette before it, a mere 6 bits.)

By cks at 2017-11-13 00:53:11:

The X11 TrueColor name may have come from the marketing term, but it may also have been adopted because X11 also has a DirectColor colour type. Surprisingly for its name, this colour type does not have pixels that directly specify colour values; instead it is another indirectly indexed type, but this time the colormaps are separate for each colour component of the pixel (so the R component of a pixel indexes an R colormap that's independent from the G and B colormaps). In the charming X way, TrueColor is officially described as:

  • TrueColor is treated the same way as DirectColor except that the colormap has predefined, read-only RGB values. These RGB values are server-dependent but provide linear or near-linear ramps in each primary.

(From Xlib Visual Types.)

Although it sounds peculiar to me, I assume that there was real hardware that had a DirectColor display (well, graphics hardware). Possibly this was back in the days before everything was 8-bit (both RGB pixels and, where applicable, colormap values).
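
To make the difference concrete, here's my conceptual sketch (not anything taken from X11 itself) of how a 24-bit DirectColor pixel would be resolved, with each colour subfield running through its own independent colormap:

    #include <stdint.h>

    /* Conceptual sketch only: how a 24-bit DirectColor pixel might resolve.
       Each subfield of the pixel indexes its own independent colormap,
       instead of the whole pixel indexing one shared colormap (PseudoColor)
       or being used as the colour directly (TrueColor). */
    static uint16_t red_cmap[256], green_cmap[256], blue_cmap[256];

    struct rgb { uint16_t r, g, b; };

    static struct rgb resolve_directcolor(uint32_t pixel)
    {
        struct rgb out;
        out.r = red_cmap[(pixel >> 16) & 0xff];   /* R subfield -> R colormap */
        out.g = green_cmap[(pixel >> 8) & 0xff];  /* G subfield -> G colormap */
        out.b = blue_cmap[pixel & 0xff];          /* B subfield -> B colormap */
        return out;
    }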

Wow, I do not recall ever seeing those other visual types in X11 at all. I must have scrolled past them at some point, but if so then they evidently didn’t register at all.

Probably because all of them seem bizarre; the only plausible motivation I can think of for designing DirectColor (and GrayScale) like this is as an over-generalised mechanism for giving software control of the gamma curve. But it’s hard to guess whether there was hardware whose designers came up with this idea, which the X11 people felt obliged to accommodate, or whether this was architecture astronautics by the X11 committee. At first blush it seems like the latter, because DirectColor appears to be a really expensive way of specifying a color (three-byte pixels and not just one but three lookups and composing the color?)… but maybe it can be implemented as three LUTs in front of a DAC’s inputs or something where the performance impact is minimal? (I’m not a hardware person.) So I am hard-pressed to guess. I wonder if any record around these design decisions survives.

The X11 TrueColor name may have come from the marketing term, but it may also have been adopted because X11 also has a DirectColor colour type.

Probably both? The other visual types described in a similar fashion to TrueColor are called StaticColor and StaticGray, so it seems like the natural name for this type would have been something containing “Static”, but since StaticColor as a derivative of PseudoColor is already taken, they needed some extra distinction anyway. It seems like the already-established marketing term would have been the obvious choice in such a situation. OTOH there’s the seeming complementary pair of PseudoColor/TrueColor, with several other things defined in terms of the former… making it hard to guess which terms were derived from which other ones. With this mix of names it’s even conceivable they started from TrueColor and proceeded from there. And again I wonder.
