birchy - 1:49 am on Jul 12, 2013 (gmt 0)
I've already split the 120 million records into 239 tables to get them down to roughly 500k records each. Each page is a subset of the data from one of those tables, averaging 80 records (anywhere from 0 to 300). I guess caching is my best bet. The big problem is getting the initial cache built while still serving both bots and humans; at the current rate it would probably take 3 months to reach 80% cached. Maybe I'll revert to the old version and run caching scripts at night.
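For what it's worth, a nightly warm-up pass along those lines could look something like the sketch below. It's a minimal Python example, assuming shard tables named data_000 through data_238, a page_key column that groups rows into page-sized subsets, and a simple file-based cache; all of those names, plus the use of sqlite3 as a stand-in driver, are placeholders for whatever the real schema and database are.

import sqlite3          # stand-in for the real DB driver (e.g. MySQLdb/PyMySQL)
import json
import os

CACHE_DIR = "cache"     # assumed file-based page cache
NUM_SHARDS = 239        # 120M records / ~500k per table

def warm_shard(shard_id, conn):
    """Precompute and cache every page served from one shard table."""
    table = "data_%03d" % shard_id                     # assumed table naming scheme
    cur = conn.execute("SELECT DISTINCT page_key FROM %s" % table)
    for (page_key,) in cur:
        cache_file = os.path.join(CACHE_DIR, "%s_%s.json" % (shard_id, page_key))
        if os.path.exists(cache_file):                 # skip pages already cached
            continue
        rows = conn.execute(
            "SELECT * FROM %s WHERE page_key = ?" % table, (page_key,)
        ).fetchall()                                   # avg ~80 rows, 0-300 range
        with open(cache_file, "w") as f:
            json.dump(rows, f)

if __name__ == "__main__":
    os.makedirs(CACHE_DIR, exist_ok=True)
    conn = sqlite3.connect("records.db")               # placeholder connection
    for shard_id in range(NUM_SHARDS):
        warm_shard(shard_id, conn)
    conn.close()

Kicked off from cron during off-peak hours, the existence check lets it resume after interruptions and skip whatever pages bot and human traffic have already populated during the day.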