2006-06-22
An extreme example of C preprocessor (ab)use
A while back, thedailywtf.com ran this entry on C preprocessor abuse. Unfortunately (and as some of the comments there note), it is nothing compared to what a true devotee of turning C into a different language can achieve.
This preprocessor trail was blazed early on, in the V7 Bourne shell, written by Steve Bourne. The story goes that Bourne didn't really like C all that much and instead much preferred Algol. Armed with the new C preprocessor, he saw an opportunity to fix this problem and created what later generations would come to call 'Bournegol'.
For a long time this could only be properly appreciated by people with a V7 source license. These days, tuhs.org has changed all that, so I can show you this 100% genuine typical excerpt (taken from cmd.c):
LOCAL VOID prsym(sym)
{
IF sym&SYMFLG
THEN REG SYSPTR sp=reserved;
WHILE sp->sysval
ANDF sp->sysval!=sym
DO sp++ OD
prs(sp->sysnam);
ELIF sym==EOFSYM
THEN prs(endoffile);
ELSE IF sym&SYMREP THEN prc(sym) FI
IF sym==NL
THEN prs("newline");
ELSE prc(sym);
FI
FI
}
(I have changed the original tab-based indentation to four spaces, so that this fits better on your screen.)
The Bourne shell was also infamous for its clever approach to memory allocation, where it didn't so much allocate memory as use it and wait for the SIGSEGVs to come rolling in as a sign that it should expand the process's memory with setbrk(). One result of this was that sh served as an excellent stress test for a new Unix port's signal handling; another was that if you hit a real SIGSEGV due to a Bourne shell bug (and there were a few), you (eventually) got an error message about being out of memory instead of anything useful like a core dump.
2006-06-20
How to improve programming productivity
Improving programming productivity is a big business, producing a never-ending parade of methodologies and processes and research and whatnots. But over the past 40-odd years I think we've learned that there are only four real ways to do it:
- reduce connections
- write less code
- meetings kill kittens
- get better people
(to put them all in punchy short forms)
None of these should be new to people, since Fred Brooks covered them all 30 years ago in The Mythical Man-Month (with much better writing, and using different labels).
It's interesting to look at popular methodologies and think about how they try to attack each of these. (These days they mostly work on #2 and #3; #4 is largely the domain of management advice books.)
For example, I see a good part of Extreme Programming as a clever attack on #3, since test driven development (and pair programming, and no code ownership) mean it's easier to change things without talking to other people. (And at the same time it needs a base level of #1 and #2, which make it feasible to modify code that you're not intimately familiar with.)
The transition from assembler to high level languages was an attack on both #2 and #1 (since assembler is what you could call a highly connected language). Structured programming, modularity, and OO continued the attack on #1, although many of them did nothing much for #2. High level languages don't just let you write less code; they also attack #1 by sidestepping whole classes of connections (pointer manipulation, manual memory management).
And on my favorite soapbox, I note that parallel programming is currently mired deep in the swamps of #1; the simple threaded shared memory programming model is just loaded with connections. This implies that we're only going to get serious productivity improvements when we move to models with fewer connections.