Msg#: 4548392 posted 2:06 pm on Feb 25, 2013 (gmt 0)
Thanks for your answers!
I have two sites on shared hosting (Enom!): one on a Windows server and one on Linux. Only the Linux one is giving this problem.
Info from the log files (I changed the domain name):

Access.log -- hundreds of lines like this:
18.104.22.168 - - [25/Feb/2013:03:25:26 -0800] "GET /stylesheet.css HTTP/1.1" 200 1678 "http://www.mydomain.com/" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)"

Error.log -- only one line:
[Mon Feb 25 04:44:32 2013] [error] [client 22.214.171.124] File does not exist: /var/www/vhosts/mydomain.com/httpdocs/page.php
Traceroute times out at hop 8. But when I refresh the page (or open another page) it loads instantly; it is only the first request that fails. I also get the error when the first file I open is a non-PHP file, such as robots.txt.
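For reference, this is roughly how I've been reproducing it from another machine -- just a quick PHP sketch, with the domain a placeholder like in the logs above:

<?php
// Time two back-to-back requests to the same URL to show the first-hit delay.
// www.mydomain.com is a placeholder, as in the logs above.
function fetch_timed($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 60);
    $start = microtime(true);
    $ok = (curl_exec($ch) !== false);
    $elapsed = microtime(true) - $start;
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return array($ok, $code, $elapsed);
}

foreach (array('first', 'second') as $label) {
    list($ok, $code, $t) = fetch_timed('http://www.mydomain.com/robots.txt');
    printf("%s request: %s, status %d, %.2f s\n", $label, $ok ? 'ok' : 'FAILED', $code, $t);
}

For me the first request hangs or times out and the second comes back almost instantly, exactly as described above.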
This is the message in Google Webmaster Tools: "Over the last 24 hours, Googlebot encountered 46 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 9.2%."
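(For scale: assuming the 9.2% rate covers the same requests, that works out to roughly 46 / 0.092 ≈ 500 robots.txt fetch attempts, of which the 46 failed.)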
Msg#: 4548392 posted 7:49 pm on Feb 26, 2013 (gmt 0)
You may need to spend a little time learning how to read logs. Practice with logs from the site that is working as intended. You need to distinguish between two kinds of errors:
#1 intended errors:* anything intercepted by mod_security is a very evil robot trying something BAD. If mod_security is preventing your ordinary code from functioning, you need to either change a setting (not ideal) or fix the code so it runs without punching holes in your security. The same goes for anything that got a 403: if it was supposed to get the 403, everything is fine.
#2 unintended errors: "file does not exist" (a 404/410 message) may be just that, or it can mean that you made a mistake in a link somewhere. A rough tally, like the sketch below, will show you at a glance how much of each kind you're getting.
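Only a sketch, in PHP since that's what your site runs; the log path is a placeholder, because every shared host keeps logs somewhere different:

<?php
// Tally status codes in a combined-format access log (the format shown above:
// IP - - [date] "request" status bytes "referer" "user-agent").
// /path/to/access.log is a placeholder -- use wherever your host keeps yours.
$log = fopen('/path/to/access.log', 'r');
$counts = array();
while (($line = fgets($log)) !== false) {
    // The first '" NNN ' in the line is the closing quote of the request
    // followed by the three-digit status code.
    if (preg_match('/" (\d{3}) /', $line, $m)) {
        $status = $m[1];
        $counts[$status] = isset($counts[$status]) ? $counts[$status] + 1 : 1;
    }
}
fclose($log);
arsort($counts);
foreach ($counts as $status => $n) {
    echo $status, ": ", $n, "\n";
}

Once you can see how many 200s, 403s and 404s you're actually serving, the two categories are much easier to tell apart.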
In your case, all those timeouts from assorted IPs-- including three attempts from what looks like the bingbot-- are the root of the problem. Or rather, they point to where the root of the problem is. You need to find out why your script is timing out: is your time limit simply too low, or is there a problem with the script file itself?
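You can at least see what limit PHP itself thinks it's working with -- a quick sketch, with the caveat that the web server's own timeout sits on top of PHP's limit and shared hosts often lock both down:

<?php
// PHP's own execution limit, in seconds (0 means unlimited).
echo "max_execution_time: ", ini_get('max_execution_time'), "\n";

// Try to raise it for this one script; many shared hosts silently forbid this,
// so read the setting back to see whether it actually changed.
@set_time_limit(120);
echo "after set_time_limit(120): ", ini_get('max_execution_time'), "\n";

If the number doesn't budge, the limit is your host's decision, not yours.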
* People who have their own servers may need to be reminded that error logs on shared hosting generally list all 403s and 404s. Log levels aren't set separately for each user, so you have to compromise.
Msg#: 4548392 posted 10:42 pm on Feb 26, 2013 (gmt 0)
If the bingbot is timing out repeatedly and the problem is at their end, we are all in trouble :) But the IPs in the sample you posted aren't even from the same continent-- unless by weird coincidence they are all using proxies based in Redmond-- so I tend to suspect that your host is talking through its hat.
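If you want to know for certain whether those "bingbot" hits are the real thing, forward-confirmed reverse DNS is the standard test: look the IP up, check the hostname, then resolve the hostname back. A sketch (the IP is a placeholder -- substitute one from your own logs):

<?php
// Forward-confirmed reverse DNS for a claimed bingbot/msnbot visit.
// The IP is a placeholder -- substitute one from your own logs.
$ip = '157.55.39.1';

$host = gethostbyaddr($ip);   // returns the IP unchanged if the lookup fails
$ips  = ($host && $host !== $ip) ? gethostbynamel($host) : false;

$genuine = $host
    && substr($host, -15) === '.search.msn.com'  // genuine hosts end in this
    && $ips && in_array($ip, $ips);               // and resolve back to the IP

echo $genuine ? "verified bingbot\n" : "not verified -- treat it as an impostor\n";

A genuine bingbot IP reverses to something like msnbot-157-55-39-1.search.msn.com and resolves straight back to the same address; IPs scattered across continents won't pass.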