Hi Everyone, I have a few websites that rely very heavily on MySQL queries to build and display site content. I'm currently re-working the queries so that they're more optimized (limiting queries per page, as well as storing pre-calculated values in the DB rather than calculating after retrieving the data).
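To illustrate the "store pre-calculated values" idea: a classic case is keeping a counter column that's updated at write time instead of running an aggregate query on every page view. Here's a rough sketch using Python's built-in sqlite3 module so it runs anywhere (the sites in question are PHP/MySQL, but the pattern is identical; the table and column names are made up for the example):

```python
import sqlite3

# Instead of running COUNT(*) against the comments table on every page
# view, keep a comment_count column on the posts table and bump it on
# each write. (SQLite here just for a self-contained demo.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT,
                        comment_count INTEGER DEFAULT 0);
    CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER, body TEXT);
""")
conn.execute("INSERT INTO posts (id, title) VALUES (1, 'Hello')")

def add_comment(post_id, body):
    # One extra UPDATE at write time saves an aggregate query at read time.
    conn.execute("INSERT INTO comments (post_id, body) VALUES (?, ?)",
                 (post_id, body))
    conn.execute("UPDATE posts SET comment_count = comment_count + 1 "
                 "WHERE id = ?", (post_id,))

add_comment(1, "First!")
add_comment(1, "Nice post.")

# The page now reads a single cheap column instead of a COUNT(*):
count = conn.execute(
    "SELECT comment_count FROM posts WHERE id = 1").fetchone()[0]
print(count)  # → 2
```

The trade-off is that writes get slightly more expensive and the counter can drift if a write path forgets the UPDATE, so it suits read-heavy pages like the ones described above.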
What I'm doing will help, I'm sure. But I would really like to take it a step further and see what else I can do to speed up page loads, both to make the sites faster for end users/spiders and to reduce the toll on my server. I've currently got a few sites containing anywhere from a few hundred thousand pages to just over 2 million pages - almost all dynamically built, obviously.
Starting to look into caching and would like to hear what ideas more experienced folks have on this, as well as any other methods for speeding up sites that are this heavily database-driven.
Any links, direction, and/or guidance are greatly appreciated!
@Dinkar - Thanks for the link! Read it and took notes on the parts where I can probably improve the most. I'm also going to read through the links in the article as well.
@Hoople - That is a good idea, thanks for bringing that up :)
As far as caching is concerned, do any of you have recommendations on which way to go? For example, a third-party system like Memcached, or rolling my own?
Kinda new topic to me, so I'm not sure exactly what the positives/negatives are for each, or what type of caching to really set up. From what I've read so far you can cache your PHP output, DB results, and also on the client side - any pointers would be great! Reading up more on the topic now as well, Thanks!