September 5, 2007 9:06 AM
As I mentioned previously, I recently switched a good portion of Tapirtype Blog—everything other than the front page index and the entries themselves—to dynamic publishing. I’ve been collecting my impressions, although I’m generally waiting for the upcoming move to Solaris at TextDrive, my hosting provider, before making any final judgments about performance.
I like the idea of it, but I’m still not sure about all the details of the execution. This is partly because Movable Type uses Smarty for its template system, so in order to understand what’s going on, I have to understand both what Movable Type is doing and what Smarty is doing. Note that this isn’t a criticism: I think Smarty is perfect for the job, and I’m glad they didn’t try to reinvent the wheel by writing their own PHP template compiler. The biggest concern when switching to dynamic publishing is that the efficiency equation shifts. With static publishing, it makes sense to make trade-offs that leave build time slightly longer for the sake of convenience, simplicity, or feature-fullness. With dynamic publishing, however, those same trade-offs could lead to intolerable page load times, so caching becomes very important in order to make only the first load any slower.
Smarty has two levels of caching. First, it compiles the templates into PHP files on the first request. In theory this should eliminate any disadvantage of a template-based system as compared to WordPress-style PHP “templates”. Second, as pages are requested, the output is stored in a cache folder, so the second request for a given page should be able to simply echo that cache file with no computation necessary. Actually, there’s a third level of caching, though I’m not sure whether it is a Smarty or a Movable Type feature: the blog will tell the web browser that the page has not changed since the last request, allowing the browser to fetch it from its own cache.
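To make the second and third levels concrete, here is a minimal sketch of an output cache combined with conditional-GET headers (`Last-Modified` / `If-Modified-Since`). This is illustrative Python, not Smarty or Movable Type code; the dictionary cache and function names are my own stand-ins for Smarty’s cache folder and the request environment.

```python
import email.utils
import time

# Stand-in for Smarty's cache folder: path -> (rendered html, timestamp).
page_cache = {}

def render_page(path):
    # Stands in for the expensive template compile + render step.
    return f"<html><body>Content for {path}</body></html>"

def handle_request(path, if_modified_since=None):
    """Serve a page from the output cache, honoring conditional GET."""
    if path not in page_cache:
        # First request: render and store, like Smarty writing a cache file.
        page_cache[path] = (render_page(path), time.time())
    html, mtime = page_cache[path]
    last_modified = email.utils.formatdate(mtime, usegmt=True)
    if if_modified_since == last_modified:
        # The browser's copy is current: answer 304 with no body,
        # letting it serve the page from its own cache.
        return (304, last_modified, "")
    return (200, last_modified, html)
```

After the first request returns 200 with a `Last-Modified` header, a repeat request echoing that value back as `If-Modified-Since` gets a bodiless 304, so neither the server nor the network does any real work.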
But the difficulty with caching is that any caching system is only as good as its ability to tell when it needs to recompute the content. Preferably, the system would be smart enough to understand which pieces of the content have been updated and which are still the same, but at the bluntest it must flush the entire cache every time any change is made. And here’s where things have been falling down for me. In a given session, I can sit at the blog, load a bunch of pages, and see that after the first load, the loads get much quicker. Inevitably, however, the next day, with no changes made to the blog, I’d find the caches recomputing. With no changes published to the blog, I can’t see why this would be the case. The only thing I could think of was that several trackback requests had come in and were marked as junk, resulting in no changes to the published blog. It would be quite silly if that were causing the cache to reset, but just to test, I’ve disabled trackbacks for now. As much as I like the idea of trackbacks, they’ve never really worked out anyway, and since I added a captcha to anonymous comments, they are my only remaining source of spam (although none ever gets published, thanks to the insistence that the trackback request come from the same IP address as the page to which it points).
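The two invalidation strategies above can be sketched side by side. Again, this is an illustrative Python sketch of the general idea, not Movable Type’s actual logic; the paths and function names are hypothetical. A junked trackback, which changes nothing that was published, should ideally trigger neither function.

```python
# Stand-in for the output cache: path -> cached html.
page_cache = {}

def flush_all():
    # Bluntest strategy: any change at all wipes every cached page,
    # so every page pays the first-load cost again.
    page_cache.clear()

def invalidate_entry(entry_path, index_paths=("/", "/archives/")):
    # Smarter strategy: drop only the changed entry and the index
    # pages that include it, leaving unrelated pages cached.
    for path in (entry_path, *index_paths):
        page_cache.pop(path, None)
```

With the selective version, editing one entry leaves the rest of the archive warm in the cache, which is exactly the behavior I was hoping to see across sessions.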