2007-02-24
The problem with browser minimum font size settings
Long ago, when I griped about Slashdot's redesign, Oscar del Rio left a comment suggesting that I use Firefox's minimum font size setting to cut off that sort of small text. This isn't an approach that I like, for a relatively simple reason: I'm willing to have some text set small, just not the main text.
I want websites to be able to set less important things in small font sizes, but I don't want them shrinking down the text I'm actually here to read. If I set Firefox's minimum font size up large enough that the main text is always readable, I completely take out the small font sizes on those less important things and they wind up too big.
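(For reference, this is the 'Minimum font size' option in Firefox's font preferences; under the hood it is a per-language-group preference. A minimal user.js sketch, assuming your pages fall under the Western language group, looks something like:

// Sketch only: never render text smaller than 14 pixels for the
// Western language group (other language groups have their own pref).
user_pref("font.minimum-size.x-western", 14);

Note that it is a single global floor; there is no way to say 'apply this only to the main text'.)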
(Yes, this is a hideously belated followup. Part of the fun of writing it was trying to figure out if I'd already written an entry talking about this; I ultimately resorted to trawling the archives with grep. Sometimes I am not the most organized blog-writing person in the world.)
Most world-editable wikis are doomed
The Linux iSCSI project keeps its documentation in a world-editable wiki. I should really say kept, because it's hard to find much usable documentation in the wiki at the moment; most of the pages are overgrown with wiki spam. Some pages have had a thousand edits in two days, all of them spam. All of this makes the project's wiki an unfortunately excellent illustration of why most open wikis are doomed.
The problem is that there are just more spammers out there automating their attacks than most wikis have people to fix the damage. Wikipedia survives because it has a critical mass of people who look after it, but it's an exception; very few wikis attract that many people. With a critical mass, you can block spammers and fix spam damage fast enough to discourage spammers and keep your wiki attractive; without it, you drown under a slowly rising tide of spam (and there is some evidence that existing spam attracts more spammers).
(It's not enough to have some dedicated people; you need to have enough that none of them have to spend too much time tending the wiki. Cleaning out spammers is drudge-work, and too much drudge-work burns people out.)
It's possible that the iSCSI wiki was hit so hard because it doesn't use rel="nofollow" on external links. On the other hand, there's a fair amount of evidence that spammers just don't care about that and will hit anything within reach. And open-edit wiki pages are eminently within reach.
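(For illustration, a nofollow'ed external link is just an ordinary link with one extra attribute, something like

<a href="http://example.com/" rel="nofollow">some external site</a>

which tells search engines not to give the link target any ranking credit, and so in theory removes the payoff from spamming the page.)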
I don't have any answers for how a new wiki is supposed to survive long enough to (potentially) get a critical mass of users, although I wish I did. I just know that I can't imagine running an open-edit wiki myself if I had any choice in the matter, and I continue to be glad I didn't try to build one.
2007-02-20
Getting around LiveJournal's new minimum page width
LiveJournal recently started forcing at least some pages, such as the default style for individual entries, to have a minimum width. Worse, the forcing is done in such a way that text is wrapped at that minimum width even when your browser window is narrower, which makes the pages pretty unreadable. I noticed this change right away, because they picked a minimum page width that was somewhat larger than the width my browser is basically fixed at.
Today I got irritated enough to do something about it. First I had to find out what CSS setting on what element was doing this. Some digging with Firebug led me to the navigation toolbar's CSS, where I found a 'min-width: 760px;'; pulling down the entire CSS file and doing a brute force search found another one on the <body> element.
(I'd say that fame and fortune awaits the person who writes an extension that explains why Firefox laid out a particular element the way it did, but I'm not sure it's possible.)
Once I knew the CSS rules doing the damage I used Stylish to override them, with the rule:
@namespace url(http://www.w3.org/1999/xhtml);
@-moz-document domain("livejournal.com") {
  #Navigation { min-width: 100px !important; }
  body { min-width: 100px !important; }
}
(The 100px is arbitrary, and I don't know if the !important bits are necessary; I tend to spray them on all of my user CSS. Also, this has finally made Stylish one of my essential extensions, at least for my main browser environment.)
One of the interesting things about this whole episode was that I hadn't noticed how much this one small change was pissing me off until I read LiveJournal with it fixed and realized I wasn't gritting my teeth any more. I've known for a while that I'm very picky about small bits of interfaces, but it's one thing to know it intellectually and another thing to have it shoved in my face.
(Being quite picky about small things has its downsides; for example, I am probably soon going to have to give up the font that I've been browsing the web with for the past ten years or so. I am very attached to it, so this is probably going to be traumatic.)
2007-02-08
The danger of validating your XHTML
The danger of validating XHTML is that the validation is almost certainly not doing what you believe it's doing.
The problem is that all the common online validators ignore the HTTP Content-Type that your web server returns for your page and validate based only on the DOCTYPE. This is completely wrong in the case of XHTML, because browsers only treat pages as XHTML if they are served as application/xhtml+xml. All the DOCTYPE does is let the browser decide what sort of XHTML it has, since there are now several flavours.
(No less an authority than the W3C says that browsers should behave this way; see this mailing list message, or this Safari blog entry, or even the W3C's XHTML media types note.)
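(To make the difference concrete: the two cases are distinguished purely by the HTTP response header, not by anything in the document itself. Roughly:

Content-Type: text/html
Content-Type: application/xhtml+xml

The first gets your page run through the ordinary HTML parser, error recovery and all; the second gets it run through a strict XML parser.)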
So if you serve your beautifully validated XHTML as text/html, browsers will actually see it as good old HTML tag soup. And validators won't tell you this; they will happily tell you that your text/html page is valid XHTML, when an honest answer is that it is invalid HTML.
(In some future world it may be valid HTML5.)
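(If you want to check what your pages are actually being served as, something like this will tell you, assuming you have curl handy; substitute your own URL:

curl -sI http://www.example.com/page.html | grep -i '^content-type:'

If the answer is text/html, browsers are treating the page as HTML no matter what the DOCTYPE claims.)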
If you are merely using XHTML validation as a good housekeeping seal of approval, you should write to HTML 4.01 Strict instead; it is just as strong, and browsers will actually interpret your pages the way you expect, saving you various headaches.
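(Concretely, that means starting your pages with the HTML 4.01 Strict DOCTYPE,

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">

and validating them as HTML.)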
If you are using XHTML validation to prepare for serving your pages as application/xhtml+xml in the future, you are probably fooling yourself, because your pages may or may not actually work as real XHTML (see some of the links here for a full explanation).
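(For what it's worth, the server side of serving real XHTML is the easy bit; a minimal Apache sketch is something like

AddType application/xhtml+xml .xhtml

in your configuration or a .htaccess file, assuming you give such pages a .xhtml extension. The hard part is everything else: your pages have to be genuinely well-formed XML, and browsers that don't accept application/xhtml+xml at all, like Internet Explorer, need some other answer.)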
If you are trying to serve real XHTML to selected browsers while serving the same document as HTML to Internet Explorer, and you are not using either MathML or SVG, you are a masochist.