Forum Moderators: coopster
I am applying a caching technique using Cache_Lite on the main pages of my site, which are database driven...
So I am worried about the number of files that can pile up in the cache folder...
Do you have any idea at what point it could start to get slow checking whether a cache file exists or has expired?
And how many files is too many for one directory on Linux?
I am thinking of removing all the cache files by cron every week or so, but I'm waiting to hear your opinions!
Thanks in advance
For those interested in using a PEAR package: I find the PEAR manual very
hermetic, almost reserved for an elite of alchemists!
No examples, no how-to.
But I found a great article on "Devshed" about Cache_Lite and how to use it.
It could be a good starting point for working with a PEAR package.
As far as how many pages you can store, that's up to your hard disk.
Using cache_lite to cache static pages (normal HTML) is not advised, because you introduce an extra operation (querying the cache) which increases your server load without providing any benefit ... the cached static page is exactly the same as the static page itself, so you might as well just let the page be served.
Server-side caching is good for dynamic data that doesn't need to be regenerated for each request. The benefit there is that you perform the resource-intensive query for the dynamic data one time and then cache it so the results are available through the cache without needing to execute another query. Caching in that case makes sense, because the data query consumes more resources than the cache query.
I suppose slow cache queries may be the result of a large number of objects in the cache, but if you manage the cache_lite contents using the "lifeTime" attribute, you can keep that number down to a manageable size without needing a cronjob or other mechanism.
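For example, a Cache_Lite instance can be configured along these lines (a sketch, assuming PEAR's Cache_Lite is installed; the cache directory path is a made-up placeholder):

```php
<?php
require_once 'Cache/Lite.php';

// Sketch of Cache_Lite options ('/tmp/cache/' is a made-up path).
$options = array(
    'cacheDir' => '/tmp/cache/',  // must exist and be writable
    'lifeTime' => 1800,           // entries expire after 30 minutes
    // On average, 1 save in 20 also sweeps expired files out of the
    // cache dir, which keeps the file count down without a cron.
    'automaticCleaningFactor' => 20,
    // Spread cache files over hashed subdirectories so no single
    // directory accumulates too many files.
    'hashedDirectoryLevel' => 1
);
$cache = new Cache_Lite($options);
```

The `automaticCleaningFactor` and `hashedDirectoryLevel` options address exactly the two worries in the original post: expired files lingering, and too many files in one Linux directory.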
I am using caching for highly dynamic pages!
I am setting the lifetime to 30 minutes for some pages that can change at any time, and to 1 week for others...
The thing is, when the lifetime ends the files stay in the folder; they are just marked expired. They get regenerated on new requests, but they are never deleted!
So wouldn't that need a cron?
Your script should check whether the requested data is already in the cache. If it isn't there (or has expired), deliver the data normally and then store it in the cache for the next request.
Cache_Lite is all about using PHP/PEAR constructs to store data for efficient delivery; under the hood it writes each cached item to its own file in the cache directory.
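That check-then-store flow is the standard Cache_Lite pattern; a minimal sketch (the cache id, directory, and `build_page_from_database()` are made-up placeholders):

```php
<?php
require_once 'Cache/Lite.php';

$cache = new Cache_Lite(array(
    'cacheDir' => '/tmp/cache/',  // made-up path; must be writable
    'lifeTime' => 1800            // 30 minutes
));

$id = 'front_page';  // made-up cache id for this page
if ($data = $cache->get($id)) {
    // A fresh copy exists in the cache: serve it directly,
    // skipping the expensive database work.
    echo $data;
} else {
    // Missing or expired: regenerate, then store for the next request.
    $data = build_page_from_database();  // placeholder for your query code
    $cache->save($data, $id);
    echo $data;
}
```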
You are exactly right, and a cronjob seems like a good idea, unless you want to build an "exec('rm -f cache/*')" (or something similar) into your script, triggered when a cached item exists but has expired.
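If you'd rather not shell out to rm, Cache_Lite itself has a clean() method that can target only expired entries; a weekly cron could run a short PHP CLI script like this (a sketch; the cache directory path is a made-up placeholder):

```php
<?php
require_once 'Cache/Lite.php';

// Point at the same cacheDir your site uses ('/tmp/cache/' is made up).
$cache = new Cache_Lite(array('cacheDir' => '/tmp/cache/'));

// 'old' mode deletes only the expired cache files, leaving
// still-valid entries untouched.
$cache->clean(false, 'old');
```

This is safer than rm -f, since it never throws away entries that are still within their lifetime.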
Lookups shouldn't get much slower either way, since Cache_Lite builds the cache file name directly from the id rather than searching the directory. But the expired files do pile up, so your hard disk may fill up ...
Yeesh. Time for a nap! Sorry, again.
[edited by: StupidScript at 10:03 pm (utc) on May 10, 2007]