Forum Moderators: DixonJones
I have been checking posts here, and site search returns some interesting threads, including this one in particular:
[webmasterworld.com...]
My issue, though, is that I am on a shared hosting server. In this case, can I run my own site-specific logrotate command? I did a "man logrotate" but it doesn't really help.
I want to have my own script read the log files and draw my own conclusions (which are very site-specific; I am not interested in the usual access-log stats). To do this, I run a cron job to glean these statistics every 2 hours. However, I want to be able to flush the access log after making these calculations, so that next time I run them, I only see new accesses.
Any suggestions would be deeply appreciated.
Thanks!
CM
You could just have the cron job empty the file. Or open it, write the data out to an archive, and then dump the existing one.
With PHP you can open the file with the "w+" flag and then close it, and that truncates the file to 0 length.
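The same read-then-truncate idea, sketched here in Python rather than PHP (the log path is a placeholder; note that any hits written between the read and the truncate would be lost, so this is a sketch, not a guarantee):

```python
def drain_log(path):
    """Read everything currently in the log, then truncate it in place.

    Truncating (rather than deleting) keeps the same inode, so a process
    like Apache that holds the file open can keep appending to it.
    """
    with open(path, "r+") as f:
        data = f.read()   # everything logged so far
        f.seek(0)
        f.truncate()      # empty the file without deleting it
    return data
```

You would call `drain_log("access_log")` from your cron script, process the returned text, and the file starts fresh for the next run.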
Have you considered keeping some sort of variable to remember the last entry in the log that you processed, and then skip to that entry in the log to run your next go?
Thanks, this is exactly the wall I ran into! I cannot delete the file, and it is kept open by Apache anyway.
So, if I were to read the file and then remember the last line -- could I do this based on, say, the timestamp? How do you suggest I designate the "last line"?
thanks for any ideas!
I would remember the line number of the last line your script read, and store it in a local file. This would be pretty dependable, as log files grow as they are written.
Then when you reopen the log a few hours later, you can loop until you've read that same line, and then continue with the processing.
If your logs rotate automatically on a daily/weekly basis, you could add logic to check whether the file has shrunk (say you store the file size in addition to the last line number read). If it has, you would know to start at the top of the log, since it is a new log file that you hadn't seen before.
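Putting the two ideas together -- remembering the last line number read, plus the file size as a rotation check -- a minimal sketch might look like this (the log and state file paths are placeholders, and re-reading the whole file each run is fine only for modestly sized logs):

```python
import json
import os

LOG_PATH = "access_log"       # hypothetical path to the Apache access log
STATE_PATH = "logscan.state"  # small local file remembering our position

def load_state():
    """Return where we left off last run, or a fresh start."""
    if os.path.exists(STATE_PATH):
        with open(STATE_PATH) as f:
            return json.load(f)
    return {"lines_read": 0, "size": 0}

def new_lines():
    """Return only the log lines added since the previous run."""
    state = load_state()
    size = os.path.getsize(LOG_PATH)
    # If the file shrank, it was rotated: start from the top again.
    start = 0 if size < state["size"] else state["lines_read"]
    with open(LOG_PATH) as f:
        lines = f.readlines()
    with open(STATE_PATH, "w") as f:
        json.dump({"lines_read": len(lines), "size": size}, f)
    return lines[start:]
```

Each cron run calls `new_lines()` and processes only what came back; nothing touches the log itself, so Apache is never disturbed.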