
Relatively static data ...txt file or mysql

         

mgm_03

12:47 am on Dec 14, 2004 (gmt 0)

10+ Year Member



I'm making a small CMS for someone, and there's a links page with 5 categories. All the data is stored in the database. Over time they will add more links (up to approximately 150 or so), so the content is not static, but it doesn't necessarily change day to day either.

My question is: rather than query the database to display the page each time, would it be more efficient to read the links data from an included text file? I don't know whether that would be faster or slower. I could pull the data from MySQL to regenerate the file whenever it needs to be updated.

Elsewhere on the site, MySQL queries will be used a great deal. In a best-case scenario, the site will receive 5,000 hits a month (I doubt the links page will see much usage). I'm sure MySQL can handle that load, but I'd like to practice "best known methods".

Any opinions are appreciated.

timster

3:34 pm on Dec 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Using both MySQL and text files will make things more complicated when you (or worse, your newly hired worker) try to find the data later.

I'd suggest making the SQL database the one definitive source for your website data.

If you want to keep load on your SQL server down for semi-static data, you can write a little script that creates static web pages out of your MySQL data, and run it as needed (maybe scheduling it to run periodically).

mgm_03

3:59 pm on Dec 14, 2004 (gmt 0)

10+ Year Member



...you can write a little script that creates static web pages out of your MySQL data...

Thanks for the suggestion. I haven't generated static pages from a database before. I'm guessing it goes something like...

open file (links_page.html)
write header stuff
pull data from mysql, format data into HTML links, write data
write footer stuff
close file

or perhaps,

Just create a pre-formatted HTML page and insert [link_info] [/link_info] markers that enclose the links, then re-parse the file whenever the link info changes.
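For the record, I'm imagining the first approach going something like this in PHP (the function name, column names, and file name below are just placeholders; $links would really come from the MySQL query):

```php
<?php
// Build the whole static page as one string, then write it out.
// build_links_page, the 'url'/'title' keys, and links_page.html are
// hypothetical names -- adjust to the real schema.
function build_links_page(array $links)
{
    $html = "<html><head><title>Links</title></head><body>\n"; // header stuff
    foreach ($links as $link) {
        // htmlspecialchars() escapes &, <, >, and " so the data
        // can't break the markup.
        $html .= '<a href="' . htmlspecialchars($link['url']) . '">'
               . htmlspecialchars($link['title']) . "</a><br>\n";
    }
    $html .= "</body></html>\n";                               // footer stuff
    return $html;
}

// Re-run this whenever a link is added or changed.
$links = array(array('url' => 'http://example.com', 'title' => 'Example'));
file_put_contents('links_page.html', build_links_page($links));
```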

bfillmer

5:13 pm on Dec 14, 2004 (gmt 0)

10+ Year Member



The latter would probably be easiest. The PHP page that inserts the data into the HTML could use a simple regular expression to find whatever unique identifiers you choose to mark where the data gets inserted.

Then either run the PHP page automatically on a cron job, or run it whenever the admin adds information to the database.
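Something like this, using the [link_info] markers from the previous post (render_links and fill_template are just placeholder names):

```php
<?php
// Generate the links markup from the (hypothetical) query rows.
function render_links(array $links)
{
    $out = '';
    foreach ($links as $link) {
        $out .= '<a href="' . htmlspecialchars($link['url']) . '">'
              . htmlspecialchars($link['title']) . "</a><br>\n";
    }
    return $out;
}

// Swap out everything between the markers for fresh links.
function fill_template($template, array $links)
{
    // The /s modifier lets . match newlines, so the old block can
    // span multiple lines; .*? keeps the match non-greedy.
    return preg_replace(
        '/\[link_info\].*?\[\/link_info\]/s',
        '[link_info]' . "\n" . render_links($links) . '[/link_info]',
        $template
    );
}
```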

timster

9:38 pm on Dec 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I think either way would do it.

If you can control how the link info gets changed in the database, you might even be able to set up a little admin page where you change the data through a PHP form, and it automatically writes out the new web pages. If that's worth the time, that is.

mgm_03

3:01 pm on Dec 22, 2004 (gmt 0)

10+ Year Member



I wanted to close this thread by mentioning that, through my investigation, I learned that the Smarty templating engine is a good solution for this need.

In addition to the other benefits of a templating engine, Smarty caches output (and the caching can be controlled). So if I have a list of links in the database that don't change very often, the output of the query can be cached. And when a change is made, say a link is added, the script can regenerate the cache.
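For anyone who finds this later, the caching setup is only a few lines of configuration (the template name links.tpl and the query step are just examples; this is a sketch of Smarty's standard caching calls, not a complete script):

```php
<?php
// Sketch of Smarty output caching for a rarely-changing links page.
require 'Smarty.class.php';

$smarty = new Smarty;
$smarty->caching = 1;          // turn output caching on
$smarty->cache_lifetime = -1;  // never expire; we clear it explicitly

if (!$smarty->is_cached('links.tpl')) {
    // Cache miss: only now do we hit MySQL (query not shown here).
    $smarty->assign('links', $rows_from_mysql);
}
$smarty->display('links.tpl'); // serves the cached copy on a hit

// In the admin script, after a link is added or edited:
// $smarty->clear_cache('links.tpl');
```

With cache_lifetime set to -1 the page is never rebuilt on its own; clearing the cache from the admin script is what forces regeneration.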