Wandering Thoughts archives

2017-11-13

X11 PseudoColor displays could have multiple hardware colormaps

When I talked about PseudoColor displays and window managers, I described things assuming that there was only a single hardware colormap. However, if you read the X11 documentation you'll run across tantalizing things like:

Most workstations have only one hardware look-up table for colors, so only one application colormap can be installed at a given time.

(Emphasis mine.)

'Most' is not all, and indeed there were Unix workstations with PseudoColor displays that had multiple hardware colormaps. As it happens, I once used such a machine: my SGI R5K Indy. Since it was a sysadmin machine, we bought the version with SGI's entry-level 8-bit XL graphics, but even that was advanced enough to have multiple hardware colormaps instead of the single colormap I was used to from my earlier machines.

When I was using the Indy I didn't really notice the multiple hardware colormaps, which is not too surprising (people rapidly stop noticing things that don't happen, like your display flashing as colormaps have to be swapped around), but in retrospect I think they enabled some things that I didn't think twice about at the time. I believe my Indy was the first time I used pictures as desktop backgrounds, and looking at the 1996 desktop picture in the appendix of this entry, that picture is full colour and not too badly dithered.

(As it happens I still have the source image for this desktop background and it's a JPEG with a reasonably large color range. Some of the dithering is in the original, probably as an artifact of it being scanned from an artbook in the early 1990s.)

In general, I think that having multiple hardware colormaps basically worked the way you'd expect. Any one program (well, window) couldn't have lots of colors, so JPEG images and so on still had to be approximated, but having a bunch of programs on the screen at once was no problem (even with the window manager's colors thrown in). I used that Indy through the era when websites started getting excessively colourful, so its multiple hardware colormaps likely got a good workout from Netscape windows.

(In 1996, Mozilla was well in the future.)

At the time and for years afterward, I didn't really think about how this was implemented in the hardware. Today it makes me wonder, because X is normally what I'll call a software compositing display system, where the X server assembles all pixels from all windows into a single area of RAM and has the graphics hardware display that (instead of telling the graphics hardware to composite together multiple separate bits and pieces). This makes perfect sense for a PseudoColor display when there's only one hardware colormap, but when you have multiple hardware colormaps, how does the display hardware know which pixel is associated with which colormap? Perhaps there was a separate mapping buffer with two or three bits per pixel that specified the hardware colormap to use.

(Such a mapping buffer would be mostly static, as it only needs to change if a window with its own colormap is moved, added, or removed, and it wouldn't take up too much memory by 1996 standards.)
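
To put a rough number on that (this is my own back-of-the-envelope figure, assuming a 1024 by 768 screen and two bits per pixel, not anything from SGI documentation), such a colormap-select buffer would need:

    1024 x 768 pixels x 2 bits = 1,572,864 bits = 192 KB

which is small next to the 768 KB that the 8-bit framebuffer itself already takes.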

X11MultipleHWColormaps written at 00:20:28

2017-11-12

The fun of X11 PseudoColor displays and window managers

Yesterday, I described how X11's PseudoColor is an indirect colormap, where the 'colors' you assigned to pixels were actually indexes into a colormap that gave the real RGB colour values. In the common implementation (an 8-bit 'colour' index into a 24-bit colormap), you could choose your colours from a palette of 16 million, but you could only have 256 different ones in a colormap at once. This limitation creates an obvious question: on a Unix system with a bunch of different programs running, how do you decide which 256 different colours you get? What happens when two programs want different sets of them (perhaps you have two different image display programs trying to display two different images at the same time)?

Since X's nominal motto is 'mechanism, not policy', the X server and protocol do not have an answer for you. In fact they aggressively provide a non-answer, because the X protocol allows for every PseudoColor window to have its own colormap that the program behind the window populates with whatever colours it wants. Programs can inherit colormaps, including from the display (technically from the root window, but that's close enough because the root window is centrally managed), so you can build some outside mechanism where everyone shares and coordinates a single colormap, but programs are also free to go their own way.

(For example, I believe that X desktops like Motif/CDE had standard colormaps that all of their normal applications were expected to share.)
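
To make the per-window colormap mechanism concrete, here's a minimal Xlib sketch of my own (not from any particular program, with error handling mostly omitted): the program creates a fresh colormap, allocates a colour in it, and attaches the colormap to its window instead of inheriting the default one.

    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
        int scr = DefaultScreen(dpy);

        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         0, 0, 200, 200, 1,
                                         BlackPixel(dpy, scr), WhitePixel(dpy, scr));

        /* Create this window's own colormap rather than inheriting the
           default one, then populate it with the colours we actually want. */
        Colormap cmap = XCreateColormap(dpy, win, DefaultVisual(dpy, scr), AllocNone);
        XColor col;
        col.red = 0xffff; col.green = 0; col.blue = 0;   /* pure red */
        XAllocColor(dpy, cmap, &col);                    /* col.pixel is its index */

        XSetWindowColormap(dpy, win, cmap);              /* attach it to the window */
        XMapWindow(dpy, win);
        XFlush(dpy);
        /* ... an event loop would go here ... */
        XCloseDisplay(dpy);
        return 0;
    }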

Whenever you have a distributed problem in X that needs some sort of central coordination, the normal answer is 'the window manager handles it'. PseudoColor colormaps are no exception, and so there is an entire program-to-window-manager communication protocol about colormap handling as part of the ICCCM; the basic idea is that programs tell the window manager 'this window needs this colormap', and the window manager switches the X server to that particular colormap whenever it sees fit. Usually that is when the window is the active window, because the user normally wants the active window to be the one with correct colors.

(In X terminology, this is called 'installing' the colormap.)
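
Here's a hedged Xlib sketch of both sides of that handshake (my own illustration; the function names and the idea that the special colormap lives on a subwindow are made up for the example). The client half sets the ICCCM WM_COLORMAP_WINDOWS property with XSetWMColormapWindows(), and the window manager half looks up and installs the active window's colormap.

    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    /* Client side: tell the window manager which subwindow carries its own
       colormap by setting WM_COLORMAP_WINDOWS on the top-level window. */
    void announce_colormap_window(Display *dpy, Window toplevel, Window image_win)
    {
        Window cmap_windows[1] = { image_win };
        XSetWMColormapWindows(dpy, toplevel, cmap_windows, 1);
    }

    /* Window manager side: when a window becomes active, find its colormap
       and 'install' it into the hardware colormap. */
    void install_colormap_for(Display *dpy, Window active)
    {
        XWindowAttributes attrs;
        if (XGetWindowAttributes(dpy, active, &attrs))
            XInstallColormap(dpy, attrs.colormap);
    }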

The visual result of the window manager switching the colormap to one with completely different colors is that other windows go technicolour and get displayed with false and bizarre colors. The resulting flashing as you moved back and forth between programs, changed images in an image display program, or started and then quit colour-intensive programs was quite distinctive and memorable. There's nothing like it in a modern X environment, where things are far more visually stable.

The window manager generally had its own colormap (usually associated with the root window) because the window manager generally needed some colours for window borders and decorations, its menus, and so on. This colormap was basically guaranteed to always have black and white color values, so programs that only needed them could just inherit this colormap. In fact there was also a whole protocol for creating and managing standard (shared) colormaps, with a number of standard colormaps defined; you could use one of these standard colormaps if your program just needed some colors and wasn't picky about the exact shades. A minimal case of this was if your program only used black and white; as it happens, this describes many programs in a normal X system (especially in the days of PseudoColor displays), such as xterm, xclock, Emacs and other GUI text editors, and so on. All of these programs could use the normal default colormap, which was important to avoid colours changing all of the time as you switched windows.

(For much of X's life, monochrome X displays were still very much a thing, so programs tended to only use colour if they really needed to. Today color displays are pervasive so even programs that only really have a foreground and a background colour will let you set those to any colour you want, instead of locking you to black and white.)
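
As a sketch of the standard colormap case mentioned above (my own illustration; the shared_pixel() helper is hypothetical), a program that isn't picky about exact shades could look up the shared RGB_DEFAULT_MAP standard colormap on the root window and turn an RGB triple into a pixel value with the ICCCM's base-plus-multipliers formula, falling back to plain black if nobody has set one up.

    #include <X11/Xlib.h>
    #include <X11/Xutil.h>
    #include <X11/Xatom.h>

    unsigned long shared_pixel(Display *dpy, double r, double g, double b)
    {
        XStandardColormap *maps;
        int nmaps;

        if (!XGetRGBColormaps(dpy, DefaultRootWindow(dpy), &maps, &nmaps,
                              XA_RGB_DEFAULT_MAP) || nmaps < 1)
            return BlackPixel(dpy, DefaultScreen(dpy));   /* no standard map set up */

        /* Scale each 0.0-1.0 component to the map's range, then apply the
           ICCCM formula: base_pixel + component * multiplier for each channel. */
        XStandardColormap m = maps[0];
        unsigned long pixel = m.base_pixel
            + (unsigned long)(r * m.red_max   + 0.5) * m.red_mult
            + (unsigned long)(g * m.green_max + 0.5) * m.green_mult
            + (unsigned long)(b * m.blue_max  + 0.5) * m.blue_mult;
        XFree(maps);
        return pixel;
    }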

One of the consequences of PseudoColor displays for window managers was that (colour) gradients were generally considered a bad idea, because they could easily eat up a lot of colormap entries. Window managers in the PseudoColor era were biased towards simple and minimal colour schemes, ideally using and reusing only a handful of colours. When TrueColor displays became the dominant thing in X, there was an explosion of window managers using and switching over to colour gradients in things like window title bars and decorations; not necessarily because it made sense, but because they now could. I think that has fortunately now died down and people are back to simpler colour schemes.
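
To illustrate why gradients were so expensive, here's a small sketch of my own that allocates a 64-step blue gradient out of the default colormap; every distinct shade that succeeds pins another read-only cell in a colormap that only has 256 entries to begin with.

    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        Colormap cmap = DefaultColormap(dpy, DefaultScreen(dpy));

        int allocated = 0;
        for (int i = 0; i < 64; i++) {           /* a 64-step blue gradient */
            XColor c;
            c.red = 0; c.green = 0;
            c.blue = (unsigned short)(i * 65535 / 63);
            if (XAllocColor(dpy, cmap, &c))      /* may be rounded to a nearby cell */
                allocated++;
        }
        printf("gradient used up to %d colormap cells\n", allocated);
        XCloseDisplay(dpy);
        return 0;
    }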

X11PseudocolorAndWMs written at 02:21:13

2017-11-11

What X11's TrueColor means (with some history)

If you've been around X11 long enough and peered under the hood a bit, you may have run across mentions of 'truecolor'. If you've also read through the manual pages for window managers with a sufficiently long history, such as fvwm, you may have run across mentions of 'colormaps' as well. Perhaps you're wondering what the background of these oddities is.

Today, pixels are represented with one byte (8 bits) for each RGB color component, and perhaps another byte for the transparency level ('alpha'), partly because that makes each pixel 32 bits (4 bytes) and computers like 32-bit things much better than they like 24-bit (3-byte) things. However, this takes up a fair amount of memory. For instance, a simple 1024 by 768 display with 24 bits per pixel takes up just over 2 megabytes of RAM. Today 2 MB of RAM is hardly worth thinking about, but in the late 1980s and early 1990s it was a different matter entirely. Back then an entire workstation might have only 16 MB of RAM, and that RAM wasn't cheap; adding another 2 MB for the framebuffer would drive the price up even more. At the same time, people wanted color displays instead of black and white, and they were certainly willing to pay a certain amount extra for Unix workstations that had them.

If three bytes per pixel is too much RAM, there are at least two straightforward options. The first is to shrink how many bits you give to each color component; instead of 8-bit colour, you might do 5-bit color, packing a pixel into two bytes. The problem is that the more memory you save, the fewer colors and especially shades of gray you have. At 5-bit colour you're down to 32 shades of gray and only 32,768 different possible colors, and you've only saved a third of your framebuffer memory. The second is to do the traditional computer science thing by adding a layer of indirection. Instead of each pixel directly specifying its colour, it specifies an index into a colormap, which maps to the actual RGB color. The most common choice here is to use a byte for each pixel and thus to have a 256-entry colormap, with '24 bit' colour (ie, 8-bit RGB color components). The colormap itself requires less than a kilobyte of RAM, your 1024 by 768 screen only needs a more tolerable (and affordable) 768 KB of RAM, and you can still have your choice out of 16 million colors; it's just that you can only have 256 different colors at once.

(Well, sort of, but that's another entry.)
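
To make the arithmetic concrete (these are just the figures from the paragraph above, worked out):

    1024 x 768 x 3 bytes/pixel = 2,359,296 bytes  (about 2.25 MB, direct 24-bit colour)
    1024 x 768 x 1 byte/pixel  =   786,432 bytes  (768 KB, 8-bit indexed colour)
    256 entries x 3 bytes      =       768 bytes  (the colormap itself)

The indirect scheme cuts the framebuffer to a third of its 24-bit size, at the cost of a colormap that's essentially free.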

This 256-color indirect color mode is what was used for all affordable colour Unix workstations in the 1980s and most of the 1990s. In X11 terminology it's called a PseudoColor display, presumably because the pixel 'colour' values were not actually colors but instead were indexes into the colormap, which had to be maintained and managed separately. However, if you had a lot of money, you could buy a Unix workstation with a high(er) end graphics system that had the better type of color framebuffer, where every pixel directly specified its RGB color. In X11 terminology, this direct mapping from pixels to their colors is a TrueColor display (presumably because the pixel values are their true color).

(My memory is that truecolor systems were often called 24-bit color and pseudocolor systems were called 8-bit color. Depending on your perspective this isn't technically correct, but in practice everyone reading descriptions of Unix workstations at the time understood what both meant.)
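
If you're curious which of these visual classes your own X server offers, a short sketch along these lines (mine, not from the entry) will report it via XMatchVisualInfo(); on period hardware you'd typically get the 8-bit PseudoColor answer, while anything modern reports 24-bit TrueColor.

    #include <X11/Xlib.h>
    #include <X11/Xutil.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        int scr = DefaultScreen(dpy);
        XVisualInfo vinfo;

        if (XMatchVisualInfo(dpy, scr, 8, PseudoColor, &vinfo))
            printf("8-bit PseudoColor visual (colormap size %d)\n",
                   vinfo.colormap_size);
        if (XMatchVisualInfo(dpy, scr, 24, TrueColor, &vinfo))
            printf("24-bit TrueColor visual\n");

        XCloseDisplay(dpy);
        return 0;
    }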

Directly mapped 'truecolor' color graphics supplanted indirect pseudocolor graphics sometime in the late 1990s, with the growth of PCs (and the steady drop in RAM prices, which made two extra bytes per pixel increasingly affordable). It's probably been at least 15 years since you could find a pseudocolor graphics system on (then) decent current hardware; these days, 'truecolor' is basically the only colour model. Still, the terminology lingers on in X11, ultimately because X11 is at its heart a very old system and is still backward compatible to those days (at least in theory).

(I suspect that Wayland does away with all of the various options X11 has here and only supports the directly mapped truecolor model (probably with at least RGB and YUV). That would certainly be the sane approach.)

PS: It's true that in the late 1990s, you could still find Sun and perhaps SGI selling workstations with pseudocolor displays. This wasn't a good thing and contributed to the downfall of dedicated Unix workstations. At that point, decent PCs were definitely using truecolor 24-bit displays, which was part of what made PCs more attractive and most of the dedicated Unix workstations so embarrassing.

(Yes, I'm still grumpy at Sun about its pathetic 1999-era 'workstations'.)

X11TruecolorHistory written at 01:05:45

