Wandering Thoughts archives

2009-04-24

The difference between Web 1.0 and Web 2.0

Here's one view of the difference, put in a punchy short form:

  • web 1.0: all of this user attention is going to be worth money!
  • web 2.0: all of this user attention is going to be worth money, and this time around we're going to make sure that the users stick around for it.

In classic 'Web 1.0' operations (with the apex of this being the various portals), it turned out to be very hard to monetize all of the user attention that you were getting. Pretty much everything you could do to get money from your users made the experience worse for them, and so they left for somewhere else.

Classic 'Web 2.0' operations like Twitter are once again getting huge valuations based on the view that all of this user attention has to be worth money. But this time around people have learned from the previous failures, so they are making sure that the users will be stuck to their websites as tightly as possible so that they can't leave when the monetization starts and the experience goes downhill.

There are lots of crude ways to do this, but the best one (and one that is everywhere in Web 2.0) is social stickiness. Once you have a community using a service like Twitter, its members are all hooked; a large part of what they're getting out of the service is their connections, and if someone moves to a different service they lose all of that.

Web1vsWeb2 written at 01:15:06

2009-04-12

A hairshirt too far: on always avoiding CSS

Recently I wound up reading the Webless initiative (via), which advocates a very strict CSS-less sort of web design. To coin a phrase, I think that this is a hairshirt too far; not only does it make it harder to create good web pages, but it can damage the web viewing experience for your visitors.

First, I think that it's clear that 'graphics', broadly interpreted, are superior to plain monospaced text even for text content, as they allow you to do important but subtle things that increase readability and usability. This starts with proportionally spaced fonts, but good graphic design goes further than that; for example, I hope that everyone agrees that a table in a graphical web browser, with borders and so on, is more readable than a table rendered in ASCII.

Once we accept graphics at all on the web, the remaining question is about the best way to implement them. While I am no fan of CSS in general, my own experience is that CSS can be the best way to implement various useful graphical effects, and sometimes it is the only way to achieve certain things; there are worthwhile things that you can do in CSS that have no equivalent in even presentational HTML.

(The best of these are subtle effects. Good design is often invisible.)
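
As an illustration, here is a sketch of the kind of thing I mean; these rules are made-up examples rather than this site's actual CSS, but none of them can be expressed in even presentational HTML:

    /* hypothetical rules for illustration, not WanderingThoughts' own CSS */
    p       { max-width: 40em; }             /* cap line length for readability */
    a:hover { text-decoration: underline; }  /* underline links only on hover */
    h1      { font-variant: small-caps; letter-spacing: 0.05em; }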

And yes, my ox is a little gored here because WanderingThoughts does use CSS for some things. For some of them it was the easiest or best way to achieve certain results; for others (eg), it was the only way to get what I wanted.

HairshirtTooFar written at 00:39:59

2009-04-11

The advantage of having an (XML) sitemap

I've had a sitemap for a fairly long time (long enough that the format changed out from under me a few times). In theory I created it for Google, as a way of steering them around WanderingThoughts (and CSpace as a whole), but I've never been sure if Google was really getting anything out of it. As it turns out, that doesn't really matter, because having a sitemap has turned out to be really useful for me.
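
(For anyone who hasn't seen one, a sitemap in the standard sitemaps.org XML format is simply a list of <url> entries, like this; the URL here is a placeholder, not one of mine:)

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://example.com/blog/SomeEntry</loc>
        <lastmod>2009-04-11</lastmod>
      </url>
      <!-- ... one <url> per important page ... -->
    </urlset>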

Why it's so useful is neatly encapsulated in what it is; at least for me, a sitemap is an automatically generated, easily parsed list of all of the important URLs in my dynamically generated website. This is a great thing to have to feed to various sorts of testing systems; for example, if I change the code that converts wikitext into HTML, I test it by rendering all of the URLs in my sitemap with both the old and the new code and looking for differences.
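
A minimal sketch of that testing, in Python 3; the two base URLs are hypothetical stand-ins for wherever the old and new versions of the code happen to be serving the site:

    # Fetch every URL listed in the sitemap from both versions of the
    # site and report any page that renders differently.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP = "http://example.com/sitemap.xml"
    OLD = "http://example.com"          # site running the old code
    NEW = "http://new.example.com"      # same site running the new code
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    for loc in ET.fromstring(fetch(SITEMAP)).findall(".//sm:loc", NS):
        path = loc.text[len(OLD):]      # assumes every <loc> starts with OLD
        if fetch(OLD + path) != fetch(NEW + path):
            print("renders differently:", path)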

(Using auto-generated sitemaps for testing is a great incentive to make sure that they include all of your important pages. For example, I had to do some tweaks to my initial sitemap generator to make sure that it included URLs that would show all comments. This is good for me and, if Google is paying attention, is good for Google too.)

The one thing I wish for with sitemaps is an autodiscovery protocol that did not involve robots.txt and instead was more like syndication feed autodiscovery (which uses magic things in the <head> section of ordinary HTML pages). Editing a single global file like robots.txt is simply not scalable if you have lots of sub-sites, each of which will generate its own sitemap, and people can create such sub-sites on their own.

(Translation: I do not want to be editing our robots.txt each time a user adds a sitemap to their home page or to some sub-area of their home page, or removes such a sub-area that they decided they didn't want any more, or changes software in a way that changes the sitemap URL, or etc etc etc.)
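
Today the global mechanism is a 'Sitemap: <url>' line in robots.txt. What I am wishing for would look something like the following in a page's <head>; this <link> relation is hypothetical, since nothing like it has actually been standardized:

    <head>
      <!-- hypothetical autodiscovery, by analogy with feed autodiscovery -->
      <link rel="sitemap" type="application/xml"
            href="/~user/sitemap.xml">
    </head>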

SitemapUsage written at 01:39:46

