perl script creates blank screen
jcmoon
10:02 pm on Oct 13, 2008 (gmt 0)

We have a Perl script that takes in a raw text file, applies a lot of tedious fixes, and exports a text file that we later load into our MySQL database. It runs from a web browser and generally works well, except ...

Sometimes, after it's taken a long time to do its processing and update the browser, it just gives up. The browser shows it's done, but the browser window is blank white, and hitting View Source shows nothing is there.

I'm used to getting some feedback from errors; a blank screen doesn't give me much to work with.

What can cause Perl to do this? What's the best way to debug from here? I've tried the server error log, and it hasn't been much help.

 

rocknbil
8:09 pm on Oct 14, 2008 (gmt 0)

SSH in to the command line of your server, if you can.

Type ps aux right after you begin the script. You should see it in the list of processes.

When the screen goes white in the browser, repeat ps aux. I will bet anything your script will be listed as "defunct", meaning the process died or timed out.
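
If it's hard to pick the script out of the process list, you can have it log its own PID when it starts; a minimal sketch (the message wording is illustrative, and STDERR normally lands in the server error log):

# log the process ID so you can spot this script in the ps aux output
warn "script started, PID $$, at " . scalar(localtime) . "\n";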

If you have access to this script, put some print commands in there at key points. Not too many; don't make it print every line of a large file.

At the top, if it's not already there:
print "Content-type: text/html\n\n";

then


....
my ($count, $total_lines) = (0, 0);
print "opening file<br>\n";
open(FILE, "<", "yourfile") or die "can't open yourfile: $!";
while (my $line = <FILE>) {
    # whatever you're doing
    $count++;
    if ($count >= 1000) {
        $total_lines += $count;
        print "Processed $total_lines lines<br>\n";
        $count = 0;
    }
}
$total_lines += $count;    # pick up the final partial batch
close(FILE);
print "file closed<br>\n";
.....

And so on. It will at least let you know where it's timing out. I have one of these on the table as well at the moment; when dealing with files this huge it's REALLY important to know what's going on at every point and to have a method of stopping the process if it's hogging resources or dying.
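
One common way to build in that kind of stop is an alarm() timeout around the heavy section; a sketch, assuming a five-minute limit is reasonable for your data:

eval {
    local $SIG{ALRM} = sub { die "timeout\n" };
    alarm(300);              # give up after 300 seconds
    # ... the heavy file processing goes here ...
    alarm(0);                # finished in time; cancel the alarm
};
if ($@ && $@ eq "timeout\n") {
    print "Processing timed out after 300 seconds<br>\n";
}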

[edited by: phranque at 8:47 pm (utc) on Oct. 14, 2008]
[edit reason] disabled smileys ;) [/edit]

phranque
11:26 pm on Oct 14, 2008 (gmt 0)

have you done a timing test to see how long it takes to lose the connection?
which browser are you using?
have you tried one of the failures on another browser?

jcmoon
2:32 pm on Oct 15, 2008 (gmt 0)

Generally I use Firefox, and haven't tried other browsers. That could be quite helpful. The tricky part is that it doesn't always hang, just sometimes.

I haven't done a timing test. Is there a simple way of doing this, or does it just involve hitting "reload" and watching a clock? I'll try the ps aux command soon; thanks for that tidbit.
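
For the timing test, you don't need anything fancier than stamping elapsed time at a few checkpoints; a sketch using the core Time::HiRes module (the checkpoint labels are made up):

use Time::HiRes qw(time);

my $start = time();
# ... first processing step ...
printf STDERR "after parsing: %.2f seconds\n", time() - $start;
# ... remaining steps ...
printf STDERR "finished: %.2f seconds\n", time() - $start;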

One thing I have tried: in the phpMyAdmin console, the query SHOW PROCESSLIST shows what queries are running. This gives me a sense of where the script is ... but only for scripts that run a lot of queries, which isn't the case when it's simply digesting a giant file and saving the results to another file.

The statement that prints the text/html Content-Type is actually toward the top; it's usually the starting point for my browser-based Perl scripts.

And I like the advice about putting in a few print commands to give feedback about how far it's gotten. That might be difficult, because my usual scripting strategy is to have the script do a bunch of work, creating HTML pieces in the process, and then at the very end arrange all the pieces and print them out at once. While this gives me the freedom to rearrange the end result at will (without trying to get all the print statements to fire at the right time), the drawback is obvious -- if the script hangs like this, I get nothing whatsoever.

This leads me to a different idea, though: for these print commands that give feedback about how far it's gotten, I could print them to a file instead of to the browser. That could work, and I wouldn't have to make changes to how normal output is printed.
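
That approach might look something like this; a sketch, assuming the script can write a debug.log in its own directory (the filename and messages are placeholders):

open(my $dbg, '>>', 'debug.log') or die "can't open debug.log: $!";
select((select($dbg), $| = 1)[0]);   # unbuffer the debug handle
print $dbg scalar(localtime), " - starting import\n";
# ... processing ...
print $dbg scalar(localtime), " - finished fixes, writing output\n";
close($dbg);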

Thanks for the advice; I'll give these a try.

rocknbil
9:55 pm on Oct 15, 2008 (gmt 0)

I usually do my print at the end, so throughout the script you can do

$debug .= "this ";
....
$debug .= " that ";
.....
$debug .= " da other thing ";

print "content-type: text/html\n\n";
print $normal_output;
print qq/<!-- $debug -->/;

Or put the debug variable in the last bit of $normal_output.

jcmoon
11:12 pm on Jan 6, 2009 (gmt 0)

Alright, so the issue has reared its ugly head once more. I took the above advice: one at a time, I added print statements at strategic points in the code, hit "Reload", and then downloaded the text file to see how far the script executed (complete with timestamps).

Here's the fun part -- the entire script executes! A dozen or so print statements -- the last one right before the final exit; statement -- all work and show up in the file.

I even took the HTML that should appear rendered in the browser, and printed it to the file, too ... and yet, the browser is blank, and when I hit "View Source", the source-code is blank.

So the whole script appears to run just fine (in 30 seconds, no less), but for some reason the output that is supposed to reach my browser doesn't make it.

Any thoughts?

Side note: a google search for perl blank browser leads to this thread as the 2nd result. Not sure if I should be proud, embarrassed, or afraid ...

For the record, the script does about a half-dozen different things; it's only one of these things which causes the behavior above, so it's not as though the script never works, period ...

Key_Master
11:40 pm on Jan 6, 2009 (gmt 0)

I suspect the browser session is timing out before Perl can release the STDOUT buffer to the browser. For more info, do a search for perl stdout buffer. To get around STDOUT buffering, add the line below, immediately above the Content-Type line in your script.

$|=1;
print "Content-Type: text/html\n\n";

jcmoon
12:20 am on Jan 7, 2009 (gmt 0)

Fascinating reading (found an article called "Suffering from Buffering?").

I note that almost all of our scripts contain these lines shortly before the Content-Type line:
BEGIN {
    $|=1;
    use CGI::Carp('fatalsToBrowser');
}

After reading that article, I have a better idea what these lines do. And ... I'm afraid I'm still stuck with the same issue (blank browser). I even tried putting $|=1; right before the Content-Type line anyway, but no dice.

I'm not certain if this site lets errors appear in the browser, so I tried redirecting STDERR to a file, but it comes up empty.

Thanks for the advice, though. I think I'm on the right path.

I know errors sometimes make it to the server's error log, but the fun part here is that it's inaccessible to me -- I can only see yesterday's log, and that won't get created until it's 3am in Tokyo, i.e. tomorrow mid-morning where I am.

Key_Master
12:43 am on Jan 7, 2009 (gmt 0)

No need to wait on those logs since you're using CGI::Carp already. You can send those error messages to your own private log file using the carpout() function.

[perldoc.perl.org...]
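
A sketch along those lines, adapted from the CGI::Carp documentation (the log path is a placeholder; it must be writable by the web server user):

BEGIN {
    use CGI::Carp qw(carpout);
    open(LOG, '>>', '/path/to/cgi-errors.log')
        or die "Unable to open cgi-errors.log: $!\n";
    carpout(\*LOG);
}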

krugs
1:58 am on Jan 7, 2009 (gmt 0)

Does a simple script that just prints something like "Hello World" and nothing else work?

If there is no output because of a buffering problem, then redirecting errors with CGI::Carp could hit the same problem: no output.

jcmoon
8:30 pm on Jan 7, 2009 (gmt 0)

I found a way on the host to see errors in real time; when the script executes this step, these errors appear:
[Wed Jan 07 09:19:16 2009] [warn] [client IP.AD.DR.ESS] Timeout waiting for output from CGI script /virtual/IP.AD.DR.ESS/cgi-bin/this-script.pl, referer: [domain.com...]
[Wed Jan 07 09:19:16 2009] [error] [client IP.AD.DR.ESS] (70007)The timeout specified has expired: ap_content_length_filter: apr_bucket_read() failed, referer: [domain.com...]

The site has about a dozen other scripts, all of which work just fine, and when this particular script executes other steps, it's also fine. It's only this particular step that doesn't.

The amazing part is that sometimes, other steps take 6 minutes ... yet this one, which stops after 30 seconds, doesn't send its output to the browser.

Note: this script also sends an email in this step, successfully (and now, prints to a text file), even while it's not sending its output to the browser.

krugs
1:33 am on Jan 8, 2009 (gmt 0)

Those appear to be Apache errors/warnings.

phranque
2:10 am on Jan 8, 2009 (gmt 0)

maybe you have a recursion problem.
have you looked at how many processes are created during that 30 seconds?
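
One rough way to check, assuming shell access while that step runs (the script name here is taken from the Apache log above):

# count how many instances of the script are running right now
my $ps = `ps aux`;
my $instances = () = $ps =~ /this-script\.pl/g;
print "instances running: $instances\n";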
