Database hit or File Cache?

     

GrAfiXoNeR

6:44 pm on Sep 15, 2005 (gmt 0)

10+ Year Member



Sorry if this has been asked; I've read a lot about the benefits of both, but I'm still puzzled by my current dilemma.

Background:
I am starting an affiliate website for a niche market. I am using a lot of caching for zones, ads, filters and other things, which is working great. My program also parses each page and grabs the top keyword from it. Currently I'm caching that to a file like url_b54a7438c437732dc0698055b9326f11.cache (an MD5 of the URL), so the site doesn't hit the database and re-scrape the page except every few hours, since the scrape is processor intensive. Each file contains one keyword or a keyword group of 1-3 words.
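In case it matters, the caching logic is roughly along these lines (simplified sketch; scrape_top_keyword stands in for the processor-intensive parsing step and the 3-hour window is just what I'm using now):

<?php
// Simplified sketch of the current file cache (function names are illustrative).
function get_top_keyword($url) {
    $cache_file = 'cache/url_' . md5($url) . '.cache';
    $max_age = 3 * 3600; // only re-scrape every few hours

    // Serve from the cache file if it is still fresh
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
        return trim(file_get_contents($cache_file));
    }

    // Otherwise do the processor-intensive parse and refresh the cache file
    $keyword = scrape_top_keyword($url); // hypothetical scraping routine
    file_put_contents($cache_file, $keyword);
    return $keyword;
}
?>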

Environment:
Red Hat Enterprise Linux
Dual Athlon box
1 GB of RAM
2 x 80 GB hard drives
Using MySQL/PHP/JavaScript

Question/dilemma:
Now to my question. The site I'm beta testing with has a forum, so there are now over 30,000 cached files, most no bigger than 8-9 bytes and the largest 15 bytes. However, I'm worried that once I launch and have 50 or so sites, each with 30,000 cached files, the file system itself will slow down, defeating the whole purpose of the cache. So...

Would it be better to just set up a table with four fields (id, date, MD5(URL), keywords), or is the file caching better? I'm expecting the number of URL file/database entries to get into the millions very quickly.
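For what it's worth, the table version I have in mind would look roughly like this (untested sketch; the table name, column types and the 3-hour freshness window are guesses I'd tune later):

<?php
// Rough sketch of the table-based alternative (untested; names are illustrative).
//
// CREATE TABLE url_keywords (
//     id       INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
//     created  DATETIME NOT NULL,
//     url_md5  CHAR(32) NOT NULL,
//     keywords VARCHAR(100) NOT NULL,
//     UNIQUE KEY (url_md5)
// );

function get_top_keyword_db($url) {
    $hash = md5($url);

    // Look for a fresh entry keyed on the MD5 of the URL
    $result = mysql_query("SELECT keywords FROM url_keywords
                           WHERE url_md5 = '$hash'
                           AND created > DATE_SUB(NOW(), INTERVAL 3 HOUR)");
    if ($result && mysql_num_rows($result) > 0) {
        $row = mysql_fetch_assoc($result);
        return $row['keywords'];
    }

    // Miss or stale row: re-scrape, then REPLACE keys on the unique url_md5 index
    $keyword = scrape_top_keyword($url); // hypothetical scraping routine
    $safe = mysql_real_escape_string($keyword);
    mysql_query("REPLACE INTO url_keywords (created, url_md5, keywords)
                 VALUES (NOW(), '$hash', '$safe')");
    return $keyword;
}
?>

The unique key on url_md5 is what I'd be counting on to keep lookups fast once the table gets into the millions of rows.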

webprofessor

3:42 am on Sep 19, 2005 (gmt 0)

5+ Year Member



This isn't a direct answer, but I recently faced the same issue. I ended up writing a crawler to simulate traffic and recording the appropriate metrics. The testing showed me the answer when no one else could have. Perhaps you should do the same.
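Even something this crude will tell you where the time is going (rough sketch; assumes a urls.txt file with one test URL per line, pointed at your own box):

<?php
// Crude traffic-simulation sketch (assumes urls.txt lists the pages to hit).
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$total = 0.0;

foreach ($urls as $url) {
    $start = microtime(true);
    file_get_contents($url);              // one simulated visitor request
    $elapsed = microtime(true) - $start;
    $total += $elapsed;
    printf("%.4f sec  %s\n", $elapsed, $url);
}

printf("average: %.4f sec over %d requests\n", $total / count($urls), count($urls));
?>

Run it once against the file-cache version and once against a database version with the same URL list, then compare the numbers.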
 
