
2010-01-27

AT&T's mad unbundling and the damage it did to Unix

Once upon a time, AT&T had Unix, and Unix was big (by the standards of the time); it contained all sorts of things that made it useful. AT&T decided that it wanted to make as much money as possible from Unix, so it had a brilliant idea: it would sell a bunch of pieces of 'Unix' separately, as additional-cost extras. After all, Unix systems were primarily used to run canned vendor-developed software (or so AT&T no doubt maintained), so a lot of pieces weren't strictly necessary and thus could be split off, and you could even tell customers that they were saving money by only buying what they needed.

This created AT&T's mad unbundling of Unix components, where they separated out previously all-included things like the C compiler, troff and associated commands (as the 'Documenter's Workbench'), and so on. Pretty soon the baseline version of (AT&T) Unix was significantly stripped down compared to its predecessors.

(AT&T also repeated this almost every time that they made substantial improvements to commands, so you had 'new awk', improved versions of ksh, and so on, all of which cost more money.)

This had both immediate and long-term effects. The immediate effect was to create annoyingly crippled versions of Unix, ones where you could not do things like compile your own programs or even read third-party manpages, and to divert a great deal of people's effort into getting around these limitations. Naturally, people used to BSD Unix (which included C compilers, troff, and so on) hated these environments.

(The height of such workarounds is Henry Spencer's awf, a nroff clone written in old-style awk.)

The long-term effect was to make Unix, all Unixes, less attractive and less useful at a crucial time for Unix in general. In trying to maximize its theoretical income, AT&T determinedly minimized (AT&T) Unix's appeal. The C compiler situation was an especially clever own goal, and it was saved from being a complete disaster only by the existence of the FSF's GNU C Compiler and the accompanying great effort to bootstrap GCC on as many Unixes as possible and make precompiled binaries available. Without GCC, I really think that Unix would have done much worse in many places, and important Unix augmentations like Perl would certainly have spread significantly more slowly.

(Unix vendors could have crippled the GCC effort by not shipping the header files for the standard library; fortunately, they by and large did not. Note that this was mostly in the pre-Internet era, when many people could not just download precompiled binaries for Perl and so on from some Internet site, even assuming that they would or could have installed lots of precompiled binaries from random third parties.)

Similar effects happened with the improved commands that AT&T developed; because they cost extra, they spread through the Unix world only very slowly (if at all), and they might almost as well not have been written at all. Certainly their improvements did nothing to make Unix more attractive, because for most people they weren't part of their Unix, just as troff wasn't.

unix/ATTUnixUnbundlingDamage written at 01:56:14

