birchy - 9:49 pm on Jul 11, 2013 (gmt 0)
Did a little more searching and I guess the answer for me is "no". I'll probably give caching another try.
My problem is that I have 1.5 million versions of a page that used to take 9 secs to build and serve, but I've added almost twice the data, and with Bing and Googlebot crawling all those pages night and day, the serving is getting bogged down and the serve time has doubled. During lulls in the crawling, the longer pages load as fast as the shorter pages did.
I really don't want the crawlers to slow down (it was already taking Gbot about 3 months to cycle through the site before Bing joined the party last week), and I can't afford to add server capacity. The pages don't change very often: each page may get viewed once a week and have its content changed by users once a month.
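For anyone wondering what "another try" at caching would look like, here's a minimal sketch of the idea (Python just for illustration; CACHE_DIR, CACHE_TTL, and the build_page callback are all made-up names, not anything from my actual setup): serve a stored copy of the built page while it's fresh, rebuild only on a cache miss, and delete the cached file whenever a user edits that page's content.

import os
import time
import hashlib

CACHE_DIR = "/var/cache/pages"   # hypothetical cache location
CACHE_TTL = 30 * 24 * 3600       # ~a month, since content changes about that often

def cache_path(page_key: str) -> str:
    # Hash the key so 1.5M entries map to safe, fixed-length filenames,
    # sharded into subdirectories to keep any one directory manageable.
    digest = hashlib.sha1(page_key.encode()).hexdigest()
    return os.path.join(CACHE_DIR, digest[:2], digest + ".html")

def get_page(page_key: str, build_page) -> str:
    """Serve from cache if fresh; otherwise rebuild and store."""
    path = cache_path(page_key)
    try:
        if time.time() - os.path.getmtime(path) < CACHE_TTL:
            with open(path, encoding="utf-8") as f:
                return f.read()  # cache hit: skip the 9-second build
    except OSError:
        pass                     # missing or unreadable: fall through and rebuild

    html = build_page(page_key)  # the expensive build step
    os.makedirs(os.path.dirname(path), exist_ok=True)
    tmp = path + ".tmp"
    with open(tmp, "w", encoding="utf-8") as f:
        f.write(html)
    os.replace(tmp, path)        # atomic swap so a crawler never sees a half-written file
    return html

def invalidate(page_key: str) -> None:
    """Call this whenever a user edits the page's content."""
    try:
        os.remove(cache_path(page_key))
    except FileNotFoundError:
        pass

With pages changing roughly monthly but the bots hitting everything night and day, nearly all crawler requests would land on the cache instead of triggering a full rebuild, which is the whole point of trying caching again.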