Forum Library, Charter, Moderators: phranque & physics

Webmaster General Forum

    
"the connection was reset" problem
as result dropped from google
FromBelgium
msg:4548394
7:09 pm on Feb 24, 2013 (gmt 0)

I get "the connection was reset" (Firefox) or "Internet Explorer cannot display the webpage" (IE) every time I open my website.

If I then refresh, it opens correctly. I have the same problem on different computers, with different browsers, and from different locations.

As a result, Google dropped my website from its index a few days ago.

What could be the problem?

 

phranque
msg:4548582
10:31 am on Feb 25, 2013 (gmt 0)

i would first try rebooting any modems/routers/etc.
if that doesn't fix it i would next look to ping or traceroute for clues.
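Acting on the traceroute suggestion, a quick way to read the result is to save the output and find the first hop that only returns timeouts. A minimal sketch; the trace below is fabricated sample data, and the layout assumed is common Linux `traceroute` output (hop number followed by three probe results, with `*` for a timed-out probe):

```shell
# Save a traceroute (here: a fabricated sample), then report the first
# hop whose three probes all timed out ("* * *").
cat > trace.txt <<'EOF'
 1  192.168.1.1  1.2 ms  1.1 ms  1.0 ms
 2  10.0.0.1  8.4 ms  8.1 ms  8.9 ms
 3  * * *
 4  * * *
EOF
# $1 is the hop number; $2..$4 are "*" only when every probe timed out.
awk '$2 == "*" && $3 == "*" && $4 == "*" { print "first timeout at hop " $1; exit }' trace.txt
# -> first timeout at hop 3
```

A timeout at an intermediate hop is not by itself proof of trouble (many routers simply don't answer probes); what matters is whether the final hop, the server itself, responds.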

lucy24
msg:4548593
11:43 am on Feb 25, 2013 (gmt 0)

Obvious first step: What if anything do your logs say? Look at both access logs and error logs; there's a good bit of overlap but each will say some things that the other doesn't.

Do you know any other sites that live on the same server? Are they behaving normally? Is it your own server or shared hosting?

I assume you haven't changed anything-- even something completely unrelated-- in the last few days, or you'd have said so.
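The log check suggested above can be sketched in one line: tally error-log entries by severity so the noisy categories stand out. The file name and the bracketed Apache error-log format are assumptions; the sample lines are modeled on the format in this thread:

```shell
# Create a small sample error log (format modeled on Apache's
# "[timestamp] [severity] [client IP] message" layout).
cat > error.log <<'EOF'
[Mon Feb 25 04:44:32 2013] [error] [client 84.199.79.50] File does not exist: /var/www/vhosts/example.com/httpdocs/page.php
[Mon Feb 25 17:48:27 2013] [warn] [client 62.146.124.12] Timeout waiting for output from CGI script
EOF
# Split fields on "[" and "]": field 2 is the timestamp, field 4 the
# severity. Count entries per severity and print the totals.
awk -F'[][]' '{ count[$4]++ } END { for (s in count) print s, count[s] }' error.log | sort
# -> error 1
#    warn 1
```

Run against a real day's log, the severity whose count jumps is usually where to start reading.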

FromBelgium
msg:4548631
2:06 pm on Feb 25, 2013 (gmt 0)

Thanks for your answers!

I have two sites on the shared host (Enom!): one is Windows-hosted and the other Linux-hosted. Only the Linux one is giving this problem.

Info from the log files (I changed the domain name):

Access.log
Hundreds of lines like this:
109.129.210.8 - - [25/Feb/2013:03:25:26 -0800] "GET /stylesheet.css HTTP/1.1" 200 1678 "http://www.mydomain.com/" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)"

Error.log
Only one line:
[Mon Feb 25 04:44:32 2013] [error] [client 84.199.79.50] File does not exist: /var/www/vhosts/mydomain.com/httpdocs/page.php


Traceroute times out at hop 8. But when I refresh the page (or open another page), it loads instantly. Only the first request fails. I also get the error when the first file I request is a non-PHP file, such as robots.txt.

This is the message in Google Webmaster Tools: "Over the last 24 hours, Googlebot encountered 46 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 9.2%."
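As a sanity check on those Webmaster Tools numbers: 46 errors at a 9.2% error rate implies Googlebot tried robots.txt about 500 times in that 24-hour window, so roughly one request in eleven failed; that is consistent with an intermittent first-request failure rather than a total outage. A one-line check of the arithmetic:

```shell
# 46 failed fetches / 0.092 error rate = total fetch attempts in 24h.
awk 'BEGIN { errors = 46; rate = 0.092; printf "%.0f total fetches\n", errors / rate }'
# -> 500 total fetches
```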

FromBelgium
msg:4548903
7:26 am on Feb 26, 2013 (gmt 0)

Still not working. No reply from Enom or its reseller!

But I got more errors:

[Mon Feb 25 04:44:32 2013] [error] [client 84.199.79.50] File does not exist: /var/www/vhosts/mydomain.com/httpdocs/page.php
[Mon Feb 25 05:53:02 2013] [error] [client 81.84.200.80] ModSecurity: [file "/etc/httpd/modsecurity.d/10_asl_antimalware.conf"] [line "60"] [id "360000"] [rev "5"] [msg "Atomicorp.com Malware Blocklist: Malware Site detected in URL/Argument (AE)"] [data "http:/"] [severity "CRITICAL"] Access denied with code 403 (phase 2). Matched phrase "fdman.fortunecity.com/" at REQUEST_URI. [hostname "www.mydomain.com"] [uri "/page.php"] [unique_id "3A8wXgoHRioAACE0tdsAAAAI"]
[Mon Feb 25 11:36:14 2013] [error] [client 157.55.33.19] File does not exist: /var/www/vhosts/mydomain.com/httpdocs/ToPcK
[Mon Feb 25 13:18:51 2013] [error] [client 91.207.4.186] ModSecurity: [file "/etc/httpd/modsecurity.d/20_asl_useragents.conf"] [line "293"] [id "330131"] [rev "2"] [msg "Atomicorp.com WAF Rules: Fake Mozilla User Agent String Detected"] [severity "CRITICAL"] Access denied with code 403 (phase 2). Pattern match "(?:$mozilla^|mozilla/[45]\\\\.[1-9])" at REQUEST_HEADERS:User-Agent. [hostname "www.mydomain.com"] [uri "/page.php"] [unique_id "FmXKgwoHRioAAZGRt6sAAAAF"]
[Mon Feb 25 17:48:27 2013] [warn] [client 62.146.124.12] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 19:15:57 2013] [warn] [client 157.56.93.202] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 20:12:16 2013] [warn] [client 94.224.90.73] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 21:40:34 2013] [warn] [client 65.55.24.243] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 21:52:59 2013] [warn] [client 199.21.99.76] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 22:45:06 2013] [warn] [client 65.55.24.243] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 22:45:06 2013] [error] [client 65.55.24.243] (70007)The timeout specified has expired: ap_content_length_filter: apr_bucket_read() failed
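One way to weigh the "local routing issue" theory against this log is to count how many distinct client IPs hit the CGI timeout: many unrelated visitors failing the same way points at the server, not at any one visitor's route. A sketch run against a pasted excerpt of the lines above (the file name is an assumption):

```shell
# Paste a few of the timeout lines from the thread into a sample log.
cat > error.log <<'EOF'
[Mon Feb 25 17:48:27 2013] [warn] [client 62.146.124.12] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 19:15:57 2013] [warn] [client 157.56.93.202] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 21:40:34 2013] [warn] [client 65.55.24.243] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
[Mon Feb 25 22:45:06 2013] [warn] [client 65.55.24.243] Timeout waiting for output from CGI script /var/www/cgi-bin/cgi_wrapper/cgi_wrapper
EOF
# Extract the IP from each "[client x.x.x.x]" field, deduplicate, count.
grep 'Timeout waiting' error.log \
  | sed 's/.*\[client \([0-9.]*\)\].*/\1/' \
  | sort -u | wc -l
```

On this excerpt the count is three distinct IPs for four timeouts; the full log in the thread shows even more, spread across several networks.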

lucy24
msg:4549163
7:49 pm on Feb 26, 2013 (gmt 0)

You may need to spend a little time learning how to read logs. Practice with logs from the site that is working as intended. You need to distinguish between

#1 intended errors:* anything intercepted by mod_security is a very evil robot trying something BAD. If mod_security is preventing your ordinary code from functioning, you need to either change a setting (not ideal) or fix the code so it will run without punching holes in your security. The same goes for anything that got a 403. If it was supposed to get the 403, everything is fine.

and

#2 unintended errors: "file does not exist" (404/410 message) may be just that, or it can mean that you made a mistake in a link somewhere.

In your case, all those timeouts from assorted IPs-- including three attempts from what looks like the bingbot-- are the root of the problem. Or rather, they point to where the root of the problem is. You need to find out why your script is timing out. Is your time limit simply too low or is there a problem with the script file?


* People who have their own servers may need to be reminded that error logs in shared hosting generally list all 403s and 404s. Log levels aren't set separately for each user so you have to compromise.

FromBelgium
msg:4549188
8:50 pm on Feb 26, 2013 (gmt 0)

Thanks Lucy for your help. I don't use CGI scripts. With FTP I can access the "cgi-bin" folder, but it is empty.

And I also get this timeout problem with simple text files, such as robots.txt.

No useful reply from Enom; they say that it is a local routing issue.

lucy24
msg:4549243
10:42 pm on Feb 26, 2013 (gmt 0)

If the bingbot is timing out repeatedly and the problem is at their end, we are all in trouble :) But the IPs in the sample you posted aren't even from the same continent-- unless by weird coincidence they are all using proxies based in Redmond-- so I tend to suspect that your host is talking through its hat.

FromBelgium
msg:4550599
7:27 am on Mar 3, 2013 (gmt 0)

Moved to new host. Problem solved.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved