Forum Moderators: coopster
I run a site with a good deal of dynamically generated content. Each time a page is hit, PHP connects to a MySQL database and runs 40-50 queries (the data is relational) before returning the page.
The thing is, the data isn't really dynamic: it's more or less a dictionary of terms that only gets updated every so often.
I was thinking of switching to an XML/XSLT system, in which the XML is static (generated from the database and cached), but the pages are generated dynamically with XSLT, depending on what the user wants to see.
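To make the idea concrete, here's a rough sketch of the caching half, assuming PHP 4-era mysql_* functions. The file path, table/column names, and rebuild interval are all made up for illustration:

```php
<?php
// Sketch: regenerate the cached XML from MySQL only when it has gone stale.
// Hypothetical path, credentials, and schema -- adjust to your setup.
$cacheFile = '/var/cache/site/terms.xml';
$maxAge    = 3600; // rebuild at most once an hour

if (!file_exists($cacheFile) || (time() - filemtime($cacheFile)) > $maxAge) {
    $db = mysql_connect('localhost', 'user', 'pass');
    mysql_select_db('dictionary', $db);
    $result = mysql_query('SELECT term, definition FROM terms', $db);

    $xml = "<?xml version=\"1.0\"?>\n<terms>\n";
    while ($row = mysql_fetch_assoc($result)) {
        // Escape so stray < or & in definitions can't break the XML
        $xml .= '<term name="' . htmlspecialchars($row['term']) . '">'
              . htmlspecialchars($row['definition']) . "</term>\n";
    }
    $xml .= "</terms>\n";

    // Write the cache file in one go
    $fp = fopen($cacheFile, 'w');
    fwrite($fp, $xml);
    fclose($fp);
}
?>
```

Every request after the first within the hour skips the database entirely and just transforms the cached file.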
My question is: given the relatively large number of queries per page, would a static-XML system using Sablotron under PHP be faster than the MySQL + PHP setup? In other words, which carries more overhead: multiple redundant queries to the database, or XSLT parsing?
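For reference, the transform side with Sablotron (PHP 4's xslt extension) would look something like this; the XML and stylesheet paths are assumptions:

```php
<?php
// Sketch: transform the cached XML with Sablotron (PHP 4 xslt extension).
// File names are hypothetical.
$xh = xslt_create();

// With no result container given, xslt_process() returns the output as a string
$html = xslt_process($xh, '/var/cache/site/terms.xml', 'views/dictionary.xsl');

if ($html === false) {
    die('XSLT error: ' . xslt_error($xh));
}

xslt_free($xh);
echo $html;
?>
```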
(XSLT parsing on the client is not an option.)
Any help would be appreciated.
I currently work with XSL/XSLT/XSL-FO; we use Castor as our parser.
We use this to create PDF documents on the fly, and some can be quite large, but the time it takes for them to be displayed on the client side has never struck us as significant.
I think XML/XSL is the way forward ;)
HTH,
-gs
Your setup would be similar, only you'd be caching the info as XML instead of going to the database every time, so it should be quicker.