How to increase memory limits and run time for a Perl script
I've just installed WampDeveloper Pro to run a few specific maintenance scripts written in Perl. It's on my Windows 7 (x64) machine, but the scripts stop with no error.
This also happens if I try to run the scripts on my websites hosted on a shared server, but the scripts WILL run on my XP laptop using a different server package, so it's not the scripts.
I'm positive it's a resource issue of some kind (time, memory, or something), but I know almost nothing about Apache, CGI, and Perl.
Is there a simple way to increase the limits? I've searched for hours and tried a few things, but it's a little complex.
Are you using WampDeveloper v4.1, or the older v4.0?
Is this Perl script being run from the command line or via the web server?
If the latter, make sure your Perl script has the correct shebang line, which is the first line of the script that tells the web server where to find perl.exe.
For WampDeveloper it's:
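The exact path depends on where WampDeveloper put its Perl; the path below is an assumed example, so adjust it to match your install:

```
#!C:/WampDeveloper/Tools/Perl/bin/perl.exe
```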
There might be some switches at the end, such as "-X"; try it without those switches. (Execute perl.exe -h to see all the options.)
Make sure it's ending in .pl and it's in the Websites\domain.here\cgi-bin folder (not in Websites\domain.here\webroot\cgi-bin)
You can also enable the script log in the config file:
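Something along these lines in the Apache configuration (the log path is an example; put it wherever your setup keeps logs) turns on mod_cgi's script log, which captures CGI errors that never make it to the browser:

```
ScriptLog "logs/cgi_script.log"
ScriptLogLength 10385760
ScriptLogBuffer 1024
```

Restart Apache afterwards, and turn the log off again once you're done debugging, since it records all CGI output.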
Your script could require a Perl module or might be suppressing error messages.
Thanks, TowerofPower. It's 4.0x, and the script is run via the web server. The script runs, but on both the shared server and WampDeveloper it stops at the same spot each time; I should have mentioned that. I've been told that the reason it stops like that is some kind of limit on run time (or memory).
The script being executed is in the cgi-bin dir, and has a .cgi extension.
I could run it on my laptop under a different web server implementation (one that doesn't run on Win7), and it all works properly there, so I'm thinking there's something in the Apache/CGI configs that limits jobs. This is just a development server open only to me, so it's not like I have to worry about load.
Driving me nuts.
Is the script giving you good results until it stops?
If the script is stopping each time after exactly 5 minutes, then that could be the Apache Timeout directive of 300 seconds affecting it... though I would be surprised, since Timeout is a client response timeout, not a script runtime timeout.
mod_cgi has no timeout/resource directives itself. And there are no Apache limits set on memory consumption except for the default stacksize.
Try clearing the script's shebang line switches to see if that has any effect.
And you really should check the website's and Apache error.log files and turn scriptlog on and check that too.
Is the script something that can be run from the command line vs using Apache CGI and a browser?
If so, try running it direct from the command line and if it completes, you know something else interfered, and if it doesn't complete, you've quickly narrowed it down to your script and/or the server environment itself.
Thanks. There are no switches set on the shebang line, and the script works fine until it just stops. No errors in the Apache log, although I haven't figured out how to enable the CGI log yet. I can't run it from the command line.
I'm pretty sure it's timing out. The script is basically a URL checker: for each entry in the flat database table it uses, it has to check the header response codes.
So, I think the question is: can I increase the Apache Timeout directive?
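For reference, the kind of per-URL header check described above can be sketched in Perl with LWP (this is an illustrative sketch, not the actual script; the URLs and timeout value are made up):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Issue a HEAD request per URL and record the HTTP status code.
# A per-request timeout keeps one dead host from stalling the run.
my $ua = LWP::UserAgent->new( timeout => 10 );

my @urls = ( 'http://example.com/', 'http://example.org/missing' );

for my $url (@urls) {
    my $resp = $ua->head($url);
    printf "%s => %s\n", $url, $resp->code;
}
```

Each slow or unresponsive URL adds its full wait to the total runtime, which is exactly how a script like this can outlast Apache's Timeout.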
The Timeout is set here:
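In a stock Apache setup the directive lives in httpd.conf (the exact file WampDeveloper uses may differ); raising it from the default looks like:

```
# Default is 300 seconds; raise for long-running CGI jobs
Timeout 3000
```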
The other settings in this file are not relevant.
Restart Apache for changes to take effect.
Thanks, all. I followed Tower's advice and changed the Timeout to 3000, and it looks like it ran properly... well, I lost my internet connection during the run, so I'll have to redo it, but it seems to have completed.