Web analytics versus
GET parameter security
I have recently run into an interesting collision between typical web analytics practices (specifically, applying them to arbitrary URLs) and good security and robustness. The straightforward manifestation is that links to WanderingThoughts entries from the Planet Sysadmin Twitter feed don't work; trying to follow one gets you a remarkably terse error message from DWiki (the software behind WanderingThoughts).
DWiki is very cautious. One of the ways that this manifests is that it doesn't accept random query parameters on requests; it knows what query parameters each URL accepts, and anything else is an error. I maintain that this is both secure and robust; certainly my logs have a constant parade of attempts to exploit the willingness of bad PHP applications to accept additional random (and, as it turns out, dangerous) query parameters. The abrupt error messages are happening because of extra query parameters.
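The core idea is a whitelist: each URL (or view) declares exactly which query parameters it accepts, and anything outside that set is rejected outright. DWiki's actual implementation isn't shown here, but a minimal sketch of the approach, with a hypothetical `ALLOWED_PARAMS` table and `check_query` helper, might look like this:

```python
from urllib.parse import parse_qs

# Hypothetical whitelist of which query parameters each view accepts.
# (These names are illustrative, not DWiki's real tables.)
ALLOWED_PARAMS = {
    "blog-view": {"page"},
    "atom-feed": set(),       # feeds take no query parameters at all
}

def check_query(view, query_string):
    """Return an error message if the query string contains any
    parameter the view does not accept, or None if it's clean."""
    params = parse_qs(query_string, keep_blank_values=True)
    extra = set(params) - ALLOWED_PARAMS.get(view, set())
    if extra:
        return "unknown query parameter(s): %s" % ", ".join(sorted(extra))
    return None
```

Under this scheme a tracking parameter like `utm_source=twitterfeed` tacked onto an entry URL fails the check and the request gets an error, which is exactly the collision described here.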
The extra query parameters aren't directly visible in the URLs in the Twitter feed, which uses bit.ly to shorten the URLs, and they aren't in the original form of the entries on Planet Sysadmin. Instead the shortening process is adding them on.
(I suspect that the direct culprit is twitterfeed's analytics features, which I further suspect are enabled by default.)
PS: I've let the Planet Sysadmin people know about this, so it'll presumably get fixed at some point. Assuming, of course, that twitterfeed and all of the other moving parts involved allow you to turn this off.