That's a large text file. Notepad will open it if your machine has enough memory, but it will take an awfully long time to get'er done. I managed to open a 150 MB logfile that way. If you're managing your own log files, I'd suggest rotating/saving them at around 50 MB. All my sites are set to rotate somewhere between once a week (smaller sites) and once a day (more active sites); none of mine are in the hourly-rotation category (NASA-sized traffic, etc.). All that so it doesn't take forever and a day to open a log file.
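On Linux the weekly/daily rotation above is usually handled by logrotate; a minimal sketch of a drop-in config (the path, retention count, and filename here are hypothetical, adjust for your setup):

```
# hypothetical /etc/logrotate.d/mysite
/var/www/mysite/logs/*.log {
    weekly        # or "daily" for busier sites
    rotate 8      # keep the last 8 rotated files
    compress      # gzip old logs to save space
    missingok     # don't error if a log is absent
    notifempty    # skip rotation when the log is empty
}
```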
Another option, which doesn't take that much time, is to dump the log into Access or MySQL and then navigate to the bottom of the list. Works a charm.
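The same idea works with any database engine; here's a sketch using the SQLite command-line shell as a stand-in for Access/MySQL (the file names and table name are made up for illustration):

```shell
# build a stand-in log, load it into a table, then jump straight to the newest entries
seq 1 5000 | sed 's/^/entry /' > big.log
sqlite3 logs.db "CREATE TABLE log(line TEXT);"
sqlite3 logs.db ".import big.log log"
sqlite3 logs.db "SELECT line FROM log ORDER BY rowid DESC LIMIT 3;"
```

Once it's in a table you can also filter (`WHERE line LIKE '%Fatal%'`) instead of scrolling.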
Linux (or Unix with the GNU utilities installed) has a "split" command that can put a specified number of bytes or lines (lines are what you want here) into each output file.
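A quick sketch of `split` in line mode (using a generated stand-in file rather than a real log):

```shell
seq 1 2500 > big.log            # stand-in for a real log file
split -l 1000 big.log chunk_    # 1000 lines per output file: chunk_aa, chunk_ab, chunk_ac
wc -l chunk_aa                  # each full chunk holds exactly 1000 lines
```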
You can prevent the problem from occurring in the first place by piping output through a utility that starts a new file each day (cronolog is one such tool). I can't remember the details, but there was an example on the lighttpd website of splitting access logs that way.
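If the big log already exists, you can also split it by day after the fact; a sketch with awk, assuming each line starts with an ISO-style date (the sample lines below are invented):

```shell
# stand-in log: one date field at the start of each line
printf '%s\n' '2008-05-12 error one' '2008-05-12 error two' '2008-05-13 error three' > error.log
# send each line to a per-day file named after its first field
awk '{ print > ("error-" $1 ".log") }' error.log
ls error-*.log
```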
thanks everyone! I tried a file-splitter shareware app, and it worked. I ended up with over 3,000 little numbered files with 1000 lines in each. Quite digestible :)
I may try UNIX "tail" or SQL next time. More significantly, I found the PHP bug that was producing the errors, so my error.log file isn't growing any more.
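For anyone else landing here, `tail` is the quickest way to peek at the end of a huge log without loading the whole thing (the generated file below is a stand-in for a real log):

```shell
seq 1 100000 > error.log        # stand-in for a big log
tail -n 20 error.log > last20.txt   # grab only the last 20 lines
head -n 1 last20.txt            # first of those last 20 lines
# tail -f error.log             # follow mode: watch new entries arrive live (Ctrl-C to stop)
```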
Incidentally, it was an undefined index warning: echoing $array['key'] where 'key' didn't exist in $array. The result is an empty string, which was what I intended, but it also wrote a warning to the log every time (an isset() check before echoing avoids that). Just one of those buggers: combined with PHP error logging, massive loads of traffic, and me not paying attention... lesson learned!