2005-07-15
First Irritations with Fedora Core 4
I can't call this a review or even first impressions of Fedora Core 4, because I haven't used it enough yet (and may not for a while). So this is my first irritations with Fedora Core 4, gained from banging my head against it repeatedly in the process of preparing our new OS load for a planned late-August upgrade/reinstall of a bunch of workstations.
- The Anaconda installer is buggy.
(Still with no sign of an update.)
- Once X starts, it kills the normal text consoles. Depending on your
machine you get either blank consoles or scrambled consoles. The root
cause is the decision to use GCC 4 in FC4, which either miscompiles or
exposes as incorrect some code in X's libvgahw.a (which one it is
depends on who you ask). Bugzilla #161242.
- Matrox cards don't work very well (if at all). This is probably
related to the libvgahw.a bug, but it's not clear; it's not fixed by
some of the things that fix the former bug. Bugzilla #163331, to the
extent that there is a single organized bug for this. (We like Matrox
cards here. Whoops.)
- The default desktop background is about 75% black on a decent sized
display (1280x1024). I don't know about you, but a black background
makes me nervous that something is wrong. Past Red Hat releases used a
much nicer background.
The following are not entirely Red Hat's fault, since I believe they come from Gnome. But still:
- The core menus take a long time to appear the first time you click
on them, presumably as the gnome-panel code runs all over the system
doing XML magic. Dear Gnome: please build those menus in the
background when you start, because otherwise it annoys everyone sooner
or later.
- If you unwarily try to use your Gnome configuration from Fedora Core
2, the result looks like overgrown ass; you pretty much have to delete
it and let the system give you the defaults. (I hope none of our few
thousand students had any customizations they really cared about,
because they're about to lose them.)
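For anyone wanting to do this reset by hand, a minimal sketch looks something like the following. The directory names are the usual GNOME 2-era per-user dotfiles (.gconf and friends), not anything FC4-specific, so treat the list as an assumption to adjust; this version moves things into a dated backup instead of deleting them outright.

```shell
#!/bin/sh
# Hypothetical sketch: set the old per-user GNOME configuration aside
# so the next login regenerates clean defaults. Non-destructive: the
# directories are moved into a dated backup, not deleted.

reset_gnome_config() {
    backup="$HOME/gnome-config-backup.$(date +%Y%m%d)"
    mkdir -p "$backup"
    # Usual GNOME 2-era config directories; adjust as needed.
    for d in .gconf .gconfd .gnome .gnome2 .metacity .nautilus; do
        if [ -e "$HOME/$d" ]; then
            mv "$HOME/$d" "$backup/"
            echo "moved $d"
        fi
    done
}

# Demonstrate against a scratch home directory rather than the real one.
HOME=$(mktemp -d)
mkdir "$HOME/.gconf" "$HOME/.gnome2"
reset_gnome_config
```

Run it while logged out of GNOME (or at least with the session quiescent), since a running gconfd will happily rewrite some of this behind your back.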
- The Gnome default layout now uses two stripes of the screen, one at
the top and one at the bottom. Dear Gnome: my screen space is a limited
and therefore precious resource. Please stop stealing bits of it.
- Whose bright idea is it to make the terminal window's cursor blink? Dear Gnome people: humans are reflexively attracted to blinking things, because it's a form of apparent motion and change. However, the cursor is simply not that important; making it blink is like having a four-year-old jumping up and down going 'I'm here! I'm here! I'm here!'.
Overall, I am sufficiently unhappy with the X bugs that I am not currently planning on upgrading my own machines. Although now that I write it up, this list of irritations is smaller and less impressive than I thought when I was banging my head against them.
(Note that I don't consider Fedora Core 4 shipping without support for MP3s, or without Flash and Sun's Java and Adobe Acrobat and a pile of other commercial software, to be an 'irritation' as such. Fedora Core can't ship with those; see the 'commercial software' bit. (Yes, even MP3 decoders; MP3 decoding is patented in all places that actually allow software patents.))
How AMD killed the Itanium
I've been telling people versions of this story for a while, so I figure I might as well write it down for posterity (or at least entertainment).
When Intel started Itanium development work in the mid 1990s, it had a straightforward story: the x86 architecture was about to run into a performance cap, because of all the ugly warts it had inherited from its predecessors (a very limited supply of registers, a highly irregular instruction format, etc). To get the future performance that customers wanted, a modern warts-free RISC-oid architecture was needed: the IA-64.
This was no different from the stories all the other CPU vendors were telling at the same time. Unlike those CPU vendors, Intel realized something important: very few people buy new CPUs to run only new software. Even in the mid 1990s, most people were using Intel x86 CPUs to run their programs, so that was where the big CPU dollars were.
So Intel promised that there would be a magic x86 part glued on the side of the first few generations of Itaniums that would run all of your existing important programs. Because very few people are ever interested in upgrading to a computer that runs their existing programs slower, Intel needed this magic x86 part to run at least as fast as their real x86 chips.
Intel could get away with this for two reasons. First, x86 chips were relatively simple compared to the design that Intel was planning, so it should be easy to glue the core of one on the side of the new CPU. Second, Intel could make the x86 performance story come true simply by moving most of their CPU design manpower (and money) to the IA-64.
Then AMD showed up to ruin Intel's party by competing directly with them. It didn't matter that AMD didn't have faster CPUs to start with; AMD's existence meant that if Intel left them alone, AMD would surpass Intel and kill Intel's main revenue source. Intel had to crank up x86 performance to make sure that didn't happen. This probably had three effects:
- people got diverted from IA-64 CPU design back to x86 CPU design;
- because x86 got faster, Itanium had to get faster too;
- the only way to make x86 faster was to make it more complicated, which made it harder to integrate a current-generation x86 into Itanium.
Naturally, the schedule for delivering a faster, more complicated Itanium slipped. Which kept making the problem worse, especially when making x86 chips go really fast started to require serious amounts of design talent. Instead of designing one high-performance CPU and doing a small amount of work on another CPU architecture, Intel was trapped in an increasingly vicious race to design two vastly different high-performance CPUs at the same time, and one of them had to be backwards compatible with the other.
It's no wonder the Itanium shipped years late, with disappointing performance and very disappointing x86 compatibility performance. (And heat issues, which didn't help at all.)
With AMD's recent x86-64 64-bit extension of the x86 architecture, Intel couldn't even claim that Itanium was your only choice if you needed 64-bit memory space and suchlike. Intel's capitulation to making its own almost 100% compatible x86 64-bit extension was inevitable, but probably the final stake in Itanium's heart. (And likely a very bitter pill for Intel to swallow.)
And that's how AMD killed the Itanium.