Forum Moderators: coopster


php searching files

         

smagdy

10:28 am on May 10, 2007 (gmt 0)

10+ Year Member



Hello,

I am applying a caching technique using Cache_Lite to the main pages of my site, which are database driven...

So I am worried about the number of files that could accumulate in the cache folder...

Do you have any idea at what point checking whether a cache file exists or has expired could start to get slow?

And how many files would be too many for a single directory on Linux?

I am thinking of removing all cache files by cron every week or so, but I'm waiting to hear your opinions!

Thanks in advance

henry0

11:49 am on May 10, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do not have your answer, although this is an interesting question.

For those interested in using a PEAR package: I find the PEAR manual very
hermetic, almost reserved for an elite of alchemists!
No examples, no how-tos.
But I found a great article on Devshed about Cache_Lite and how to use it.
That could be a good start on how to use a PEAR package.

smagdy

12:26 pm on May 10, 2007 (gmt 0)

10+ Year Member



Thanks, but I read that article several times before using it, and there is no word about a maximum number of files, slow lookups, or anything like that :(

smagdy

4:54 pm on May 10, 2007 (gmt 0)

10+ Year Member



Please, if anybody knows anything about this subject, don't hesitate to share it...

Thanks

StupidScript

9:29 pm on May 10, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you are using cache_lite, you can specify how long the documents are cached when you create the cached object by using the "lifeTime" attribute.

As far as how many pages you can store, that's up to your hard disk.

Using cache_lite to cache static pages (normal HTML) is not advised, because you introduce an extra operation (querying the cache) which increases your server load without providing any benefit ... the cached static page is exactly the same as the static page itself, so you might as well just let the page be served.

Server-side caching is good for dynamic data that doesn't need to be regenerated for each request. The benefit there is that you perform the resource-intensive query for the dynamic data one time and then cache it so the results are available through the cache without needing to execute another query. Caching in that case makes sense, because the data query consumes more resources than the cache query.

I suppose slow cache queries may be the result of a large number of objects in the cache, but if you manage the cache_lite contents using the "lifeTime" attribute, you can keep that number down to a manageable size without needing a cronjob or other mechanism.
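The "lifeTime" usage described above can be sketched roughly as follows. This is a minimal, assumed example, not code from the thread: the cache directory, the cache ID, and the page-building stand-in are all made up.

```php
<?php
// Minimal Cache_Lite sketch (paths and IDs are made-up examples).
require_once 'Cache/Lite.php';

$options = array(
    'cacheDir' => '/tmp/cache/', // must already exist and be writable
    'lifeTime' => 1800           // seconds: 30 minutes, as in this thread
);
$cache = new Cache_Lite($options);

$id = 'homepage'; // hypothetical cache ID for this page

if (($data = $cache->get($id)) !== false) {
    // A fresh, unexpired copy exists: serve it without touching the database.
    echo $data;
} else {
    // Expired or missing: do the expensive work, then cache the result.
    $data = '...page built from database queries...'; // stand-in for real work
    $cache->save($data, $id);
    echo $data;
}
```

If memory serves, Cache_Lite also has an "automaticSerialization" option for caching arrays rather than strings, but for whole-page HTML the plain string form above is enough.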

smagdy

9:36 pm on May 10, 2007 (gmt 0)

10+ Year Member



Thanks for clearing things up!

I am using caching for highly dynamic pages!

I am setting the lifetime to 30 minutes for pages that can change at any time, and to 1 week for others...

But the thing is that when the lifetime ends, the files stay there in the folder; they are just expired! They get regenerated on new requests, but either way they are never deleted!

So wouldn't that need a cron?

StupidScript

9:45 pm on May 10, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The files should stay where they are ... their contents are being cached as PHP/PEAR objects. "lifeTime" removes the data from the object, it doesn't remove the files, themselves.

Your script should be checking to see if the requested data (file) is already stored in the cached object. If the file contents have not been stored as part of the cached object data, then a normal delivery of the data (file) occurs and your script should then store the contents in the cached object for the next request.

Cache_lite is all about using PHP/PEAR constructs to store data for efficient delivery. It doesn't create new files.

smagdy

9:56 pm on May 10, 2007 (gmt 0)

10+ Year Member



All of this is fine and works great, but I am worried about the growing number of cached files... will it get slow past a certain number of files? That was my original question!

Thanks for your effort!

StupidScript

10:01 pm on May 10, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm sorry, I wrote incorrect information. I got all caught up in talking about the process and <doh> did indeed forget about the cached files! (Ever have one of those days?) I should have looked again at the cache_lite info instead of relying on my BRAIN!

You are exactly right, and a cronjob seems like a good idea, unless you want to build an "exec('rm -f cache/*')" (or something similar) call into your script that runs whenever a cached item exists but has expired.

It shouldn't get any slower, as the cached objects aren't as numerous as the cached files, including the expired ones. But your hard disk may fill up ...

Yeesh. Time for a nap! Sorry, again.

[edited by: StupidScript at 10:03 pm (utc) on May 10, 2007]
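A cron-driven cleanup along the lines suggested above could look like this. The cache path, the 7-day threshold, and the `cache_` file prefix are assumptions to adapt to your own setup:

```shell
#!/bin/sh
# Weekly cleanup sketch: delete cache files untouched for more than 7 days.
# The directory and the cache_ filename prefix are assumptions; adjust them.
CACHE_DIR="/var/www/cache"
find "$CACHE_DIR" -maxdepth 1 -type f -name 'cache_*' -mtime +7 -delete
```

Run it from crontab (e.g. weekly) so expired files never pile up. As an alternative, if memory serves, Cache_Lite itself ships a `clean()` method whose 'old' mode removes stale entries, which could replace the shell approach entirely.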

smagdy

10:08 pm on May 10, 2007 (gmt 0)

10+ Year Member



Thanks...

I think I will let it grow and watch it closely, and if it starts getting slower or anything, I'll set a cron to delete the files once their number reaches a certain amount!

Good night!