Chris's Wiki :: blog/web/ConditionalGETAndCaching Commentshttps://utcc.utoronto.ca/~cks/space/blog/web/ConditionalGETAndCaching?atomcommentsDWiki2014-03-11T20:28:17ZRecent comments in Chris's Wiki :: blog/web/ConditionalGETAndCaching.By Chris Siebenmann on /blog/web/ConditionalGETAndCachingtag:CSpace:blog/web/ConditionalGETAndCaching:0589788a16786f7551f39718052fa02d9e428175Chris Siebenmann<div class="wikitext"><p>All of <a href="https://utcc.utoronto.ca/~cks/space/dwiki/DWiki">DWiki</a>'s caching, including the full page cache, sits 'behind'
it. It's actually relatively hard to do otherwise for full page caches on
real dynamic websites because you need a way of forcing the webserver and
your application to agree on the <code>ETag</code> for something that is dynamically
generated and then cached.</p>
<p>(Synchronization of timestamps is usually relatively easy, but not so
much for <code>ETag</code>s. Web servers usually like <code>ETag</code> schemes for static
files that don't involve actually reading them, and often don't document
exactly what the scheme is so that you can reproduce it. And the scheme
invariably depends on what exact web server you're running under, which
has various drawbacks.)</p>
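One way to get the application and any full page cache to agree is to derive the `ETag` from the rendered content itself, so the same body always yields the same validator no matter which layer serves it. A minimal sketch (the function names and the tuple-returning handler shape are hypothetical, not DWiki's actual code):

```python
import hashlib

def content_etag(body):
    # Hash the rendered page body; identical content always produces
    # an identical ETag, whether it comes fresh from the application
    # or out of a full page cache.
    return '"%s"' % hashlib.sha1(body).hexdigest()

def handle_conditional_get(body, if_none_match=None):
    # Hypothetical handler returning (status, headers, payload).
    # A matching If-None-Match gets a body-less 304 Not Modified.
    etag = content_etag(body)
    if if_none_match == etag:
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag}, body
```

The cost is that you must render (or fetch from cache) the body before you can answer the conditional GET, which is exactly the synchronization problem static-file `ETag` schemes avoid.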
</div>2014-03-11T20:28:17ZBy Ewen McNeill on /blog/web/ConditionalGETAndCachingtag:CSpace:blog/web/ConditionalGETAndCaching:67da3ed4014bc096417c5c478db1b2457f07bc70Ewen McNeill<div class="wikitext"><p>Your cache files are (I assume) static files, served in a (semi-)static-file way. Just saying. (Serving cache files until manually invalidated is a moderately well-known front-end caching strategy that makes good sense in some situations -- especially if targeted at certain files/file types, and in some cases with automated regeneration via a suitable GET.)</p>
<p>FWIW, the kludge that Wordpress seems to use for comments count is to include an IMG reference which fetches an image with the comment count (rendered in text) in it. I'm not sure if that's overall a load/bandwidth win (over updating the syndication feed), but it does at least offer another tradeoff to play with.</p>
<p>I just wanted to offer you the perspective on other approaches given that a couple of recent posts made it sound like your syndication feed generation tradeoffs weren't an ideal match to current requests/criteria. ("ideal match" is hard to achieve :-) )</p>
<p>Ewen</p>
</div>2014-03-11T19:22:21ZBy Chris Siebenmann on /blog/web/ConditionalGETAndCachingtag:CSpace:blog/web/ConditionalGETAndCaching:18c4ccb6f03b5ca2efb2b4584583973ea588c0aaChris Siebenmann<div class="wikitext"><p>There are three problems with your idea for here:</p>
<ul><li>there are too many syndication feeds for me to want to generate them
by hand or by make/etc (and in fact many of the possible syndication
feeds in <a href="https://utcc.utoronto.ca/~cks/space/">the overall wiki-thing here</a> are never requested).</li>
<li>even if I did, I'd need a new static file generator for them and a
bunch of infrastructure around it and around serving them.</li>
<li>because the comment count for an entry is included in its syndication
feed entry, syndication feed entries can change at random times.</li>
</ul>
<p>(Cached syndication feeds have a predictable name pattern under the
cache area, so they are flushed with 'find ... | xargs rm'. A similar
trick can be done with any cache where you can inventory the current
keys.)</p>
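Under an assumed cache layout where cached feeds share a predictable suffix (the directory layout and the `*.atom` pattern here are made up for illustration), the 'find ... | xargs rm' flush is roughly:

```python
import pathlib

def flush_feed_cache(cache_root):
    # Walk the cache area and remove every cached syndication feed.
    # The predictable naming pattern is what makes this blanket
    # flush possible without knowing the individual cache keys.
    removed = 0
    for path in pathlib.Path(cache_root).rglob("*.atom"):
        path.unlink()
        removed += 1
    return removed
```

The same approach works for any cache where you can enumerate the current keys; caches keyed by opaque hashes can't be selectively flushed this way.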
<p>If syndication feeds for <a href="https://utcc.utoronto.ca/~cks/space/blog/">Wandering Thoughts</a> were a major
resource consumer even after all of <a href="https://utcc.utoronto.ca/~cks/space/dwiki/DWiki">DWiki</a>'s optimizations it would
be worth such major surgery to deal with them. But they would
probably have to be requested several orders of magnitude more
frequently than they are now for that to be necessary, and it
really would be a hack.</p>
<p>(In general, serving static files well requires a web site with a
URL layout that is designed for this. Essentially you want something
that is the reverse of <a href="https://utcc.utoronto.ca/~cks/space/blog/web/ADynamicSitePeril">ADynamicSitePeril</a>.)</p>
</div>2014-03-11T04:54:41ZBy Ewen McNeill on /blog/web/ConditionalGETAndCachingtag:CSpace:blog/web/ConditionalGETAndCaching:e410d3ecf0fef38eeceffc860bda65cd2a691c4aEwen McNeill<div class="wikitext"><p>If you're already flushing the cache of syndication feeds by hand each time you post, and are willing to continue to do so, then you could just pre-generate them to a file and let the conditional GET hit that file instead. (This is basically how Ikiwiki works for feeds -- they get rendered out to files too when the wiki/blog is regenerated.)</p>
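Letting conditional GETs hit a pre-generated file is cheap because timestamp validators come for free from the filesystem. A minimal illustration of the idea (not how Ikiwiki or any particular server actually implements it; the tuple-returning shape is an assumption):

```python
import os
from email.utils import formatdate, parsedate_to_datetime

def serve_feed(path, if_modified_since=None):
    # Compare the pre-generated file's mtime against the client's
    # If-Modified-Since header; an unchanged file gets a body-less 304.
    mtime = int(os.stat(path).st_mtime)
    last_modified = formatdate(mtime, usegmt=True)
    if if_modified_since is not None:
        try:
            client_ts = parsedate_to_datetime(if_modified_since).timestamp()
        except (TypeError, ValueError):
            client_ts = None
        if client_ts is not None and mtime <= int(client_ts):
            return 304, {"Last-Modified": last_modified}, b""
    with open(path, "rb") as f:
        return 200, {"Last-Modified": last_modified}, f.read()
```

Regenerating the file on each post (as the git post-hook described below does) resets the mtime, so clients naturally get a full response exactly when the feed has changed.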
<p>In the case of posting to my Ikiwiki-based blog, committing to the git tree that holds the blog causes a git post-hook to run that regenerates everything that needs regenerating (including syndication feed files) and makes the updated files public. Which means I get "behaves like a static file" syndication feeds for no extra effort.</p>
<p>Ewen</p>
</div>2014-03-11T04:19:53Z