A case for breaking the web server ownership guidelines
I have to admit that there actually is a reason to break the good practice for web server file ownership: the 404 handler trick, a lightweight way of caching semi-dynamic websites.
The 404 handler trick comes from three observations:
- most dynamic websites are in practice not very dynamic; the actual pages served change rarely.
- Apache (and other web servers) are many times faster at serving static content than dynamic content.
- Apache (and others) let you take arbitrary actions on 'page missing' errors.
So the 404 handler trick is to make your dynamic website run as Apache's page missing handler. Once you've checked that the URL is a real page and generated the page's content, you write it into the static document area as well as returning it to the user; the next time around Apache will just serve it directly. And if something changes that means you need to regenerate the page, you just delete the page from the static document area and the next time it's accessed it'll get rebuilt automatically.
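A minimal sketch of such a handler in Python may make this concrete. The names here (`handle_404`, `generate_page` functions, the `pages` table) are all made up for illustration; a real deployment would package this as the CGI script or WSGI app that Apache invokes on 404s:

```python
import os
import tempfile

def handle_404(url_path, docroot, pages):
    """Hypothetical 404 handler: rebuild a known page and cache it.

    `pages` maps URL paths to generator functions; any path not in
    it is a genuine 404.
    """
    generate = pages.get(url_path)
    if generate is None:
        return None  # a real 404; let Apache report it

    content = generate()

    # Write the page into the static document area so that Apache
    # serves it directly next time.  Write-then-rename keeps a
    # concurrent request from ever seeing a half-written file.
    target = os.path.join(docroot, url_path.lstrip("/"))
    os.makedirs(os.path.dirname(target), exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(target))
    with os.fdopen(fd, "w") as f:
        f.write(content)
    os.rename(tmp, target)

    return content  # also hand the content back for this request
```

Invalidation is then just deleting `target`; the next request for that URL misses the static file, falls through to the handler, and rebuilds the page.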
(Disclaimer: I didn't invent this. I believe I got it from Sam Ruby, but the practice is widespread.)
However, this requires that your 404 handler be running as a user that can write into some part of the document area, so that it can actually write out those static files for Apache to serve. You can make this less dangerous by restricting Apache to only serve static files (no PHP and so on) from that area, although this may be tricky.
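One way to wire this up and do that restriction in Apache might look like the following sketch (the handler path and `/srv/www/cache` directory are made-up examples, and the exact directives you need depend on which modules, such as mod_php, you have loaded):

```apache
# Run the dynamic site as the handler for missing pages:
ErrorDocument 404 /cgi-bin/build-page

# Serve only plain static files out of the writable cache area:
<Directory /srv/www/cache>
    # no CGI, no server-side includes, no directory indexes
    Options None
    # drop any handlers (e.g. PHP) that might interpret files here
    SetHandler None
    RemoveHandler .php .cgi .pl
    AllowOverride None
</Directory>
```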
(Note: less dangerous, not safe. There are some dangers even with just static files, but a discussion of the issue doesn't fit into this margin.)