I am starting an affiliate website for a niche market. I am using lots of caching of zones, ads, filters and other things, which is working great. My program also parses a page and grabs a top keyword off it, then caches that to a file like url_b54a7438c437732dc0698055b9326f11.cache, where the file name is an MD5 of the URL. That way it doesn't hit the database and scrape the page except every few hours, which is processor intensive. Each file contains one keyword or a keyword group of 1-3 words.
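For illustration, here is a minimal sketch of that file-cache approach in Python (the original app is presumably not Python; the directory name, TTL, and `scrape` callback are hypothetical stand-ins):

```python
import hashlib
import os
import time

CACHE_DIR = "cache"       # hypothetical cache directory
CACHE_TTL = 3 * 60 * 60   # re-scrape every few hours, as described

def cache_path(url):
    # File name is the MD5 of the URL, e.g. url_b54a...f11.cache
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    return os.path.join(CACHE_DIR, "url_%s.cache" % digest)

def get_keyword(url, scrape):
    """Return the cached keyword for url, re-scraping only when stale."""
    path = cache_path(url)
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < CACHE_TTL:
        with open(path) as f:          # cache hit: no DB, no scrape
            return f.read()
    keyword = scrape(url)              # expensive parse, hence the cache
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "w") as f:
        f.write(keyword)
    return keyword
```

The second call within the TTL window reads the tiny cache file instead of invoking the scraper.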
Linux Red Hat Enterprise
Dual Athlon box
1 GB of RAM
2 x 80 GB HDs
Now to my question. The site I'm beta testing with has a forum, so there are already over 30,000 cached files, most no bigger than 8-9 bytes and the largest 15 bytes. I'm worried that once I launch and have 50 or so sites, each with 30,000 files cached, it is going to slow down the file system and defeat the whole purpose of the cache. So...
Would it be better to just set up a table with four fields (id, date, MD5(URL), keywords), or is the file caching better? I'm expecting the number of URL file/database entries to get into the millions very quickly.
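The database alternative being asked about could look roughly like this. This is a sketch only, using an in-memory SQLite database as a stand-in for whatever RDBMS the site actually runs; the table and column names are made up to match the four fields mentioned:

```python
import hashlib
import sqlite3

# In-memory stand-in; a production setup would point at the real database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE url_keywords (
        id       INTEGER PRIMARY KEY,
        date     INTEGER,          -- unix timestamp of last scrape
        url_md5  CHAR(32) UNIQUE,  -- look up by fixed-width hash, not raw URL
        keywords VARCHAR(64)       -- 1-3 word keyword group
    )
""")

def put(url, keywords, now):
    h = hashlib.md5(url.encode("utf-8")).hexdigest()
    conn.execute(
        "INSERT OR REPLACE INTO url_keywords (date, url_md5, keywords) "
        "VALUES (?, ?, ?)",
        (now, h, keywords),
    )

def get(url):
    h = hashlib.md5(url.encode("utf-8")).hexdigest()
    row = conn.execute(
        "SELECT keywords FROM url_keywords WHERE url_md5 = ?", (h,)
    ).fetchone()
    return row[0] if row else None
```

With the UNIQUE index on `url_md5`, each lookup is a single indexed query, which scales to millions of rows far more predictably than millions of tiny files in one directory.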