Forum Moderators: DixonJones
I'm tired of trying products that won't crunch my logs. I don't want to have to turn off images, throw away data over a month old, etc. I simply want an inexpensive solution to crunch my large logs. We generate over 1 billion hits a month, so needless to say we have big logs, since they include referrer data, arguments, etc.
We get about 1.2 billion hits a month. We're not super graphics-intensive, so there are a lot of HTML pages there that we need to know more about from a traffic standpoint. We also get many uniques. I've tried a bunch of the shareware-type log analyzers and they just can't crunch my logs. A perfect solution would be something that could crunch logs that are years old. For this I am certainly open to a MySQL backend. My server environment is all Linux, so I need software that crunches .gz logs on a Linux box using MySQL.
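For what it's worth, the basic pipeline being asked for here (stream the .gz logs line by line so nothing has to fit in RAM, aggregate, then push the aggregates into a SQL backend) can be sketched in a few lines of Python. This is only an illustration, not a product recommendation: the sample log lines and the `page_hits` table are made up, and SQLite stands in for the MySQL backend mentioned above because it needs no running server for a self-contained demo.

```python
import gzip, io, re, sqlite3
from collections import Counter

# Hypothetical sample in Apache combined log format; a real run would
# stream an actual access_log.gz from disk instead.
SAMPLE = b"""\
1.2.3.4 - - [10/Oct/2002:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326 "http://example.com/" "Mozilla/4.08"
5.6.7.8 - - [10/Oct/2002:13:55:38 -0700] "GET /about.html HTTP/1.0" 200 512 "-" "Mozilla/4.08"
1.2.3.4 - - [10/Oct/2002:13:55:40 -0700] "GET /index.html HTTP/1.0" 304 0 "-" "Mozilla/4.08"
"""

# Matches the leading fields of a combined-log line: host, timestamp,
# request line, status code. Referrer/agent are ignored here.
LOG_RE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def crunch(gz_bytes):
    """Stream a gzipped log and count hits per path, one line at a time."""
    hits = Counter()
    with gzip.open(io.BytesIO(gz_bytes), "rt") as fh:
        for line in fh:
            m = LOG_RE.match(line)
            if m:
                hits[m.group("path")] += 1
    return hits

hits = crunch(gzip.compress(SAMPLE))

# Load the aggregates into SQL so years of logs can be queried later.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE page_hits (path TEXT PRIMARY KEY, hits INTEGER)")
db.executemany("INSERT INTO page_hits VALUES (?, ?)", hits.items())
top = db.execute(
    "SELECT path, hits FROM page_hits ORDER BY hits DESC"
).fetchall()
print(top)
```

The point of the sketch is that per-line streaming keeps memory flat regardless of log size, and storing aggregates rather than raw hits is what makes a multi-year MySQL archive tractable at a billion-plus hits a month.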
My budget is limited. I really want to spend less than $1,000 on the software but will listen to alternatives because I need to solve this problem.
I'm really tired of trying all the software that can't handle my web log file sizes. It's extremely frustrating to wait hours or overnight to see a small subset, only to get output errors due to RAM problems, log software issues, etc.
I really need some help here. Can you guys make some suggestions?