|Webmaster Tools - DNS lookup timeout on broken links|
| 4:22 am on Jul 1, 2009 (gmt 0)|
Some time ago, a single link from some site created a "sea" of broken links reported in Google Webmaster Tools.
The problem was the link in this form:
The trailing slash was like a nuke - that is how it felt to me - because Google started crawling endless URLs with multiplied subfolders in them. Over 3,000 404s!
I fixed it by creating a redirect in .htaccess that pointed to the right page.
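Something along these lines did the trick - just a sketch, with made-up placeholder paths rather than my real ones:

```apache
# Collapse two or more repeated "section/" folders back to the one
# real page; "section" and "page.html" stand in for the actual paths.
RedirectMatch 301 ^/(?:section/){2,}page\.html$ /section/page.html
```

One rule like this catches every depth of the multiplied subfolders in a single pass.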
Anyhow, that was back in January 2009, and it took me no less than three months of begging via email before they finally fixed it. I believe it was a case of ignorance, not any kind of spam issue.
Now, I just saw 17 entries described as "DNS lookup timeout" under "time out" section.
Again, broken links that eventually redirect to the same page from the above, based on my redirect from January.
But why the timeout, and where would those links come from? There are no reports under 404 or anywhere else where I could find the source of the problem (i.e. a referring link).
On another note, still inside Webmaster Tools, I see that links from Google AdWords carrying that "gclid" parameter have become a common thing.
Why? Do you think it's a bug (mistake) or...
| 4:31 am on Jul 1, 2009 (gmt 0)|
If the DNS is timing out, then the googlebot request never even gets to your domain's actual server. So your server logs can't show anything at all - your website's server never even received the request.
The technical problem, if there is one at all, would rest with your DNS settings. And if the problem only shows up for URLs that would get a 404 from your server anyway, then it shouldn't hurt your legitimate URLs' rankings at all.
But I cannot easily see how only incorrect URLs would get a timeout in DNS - I assume that the domain name itself is accurate, right? Otherwise the URL couldn't even be associated with your WMT account. To me, this sounds more like a bug in WMT reporting.
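To see the distinction for yourself, the resolution step can be sketched in a few lines of Python - the hostnames below are placeholders, not your actual domain:

```python
import socket

def resolve(hostname):
    """Return the IPs a hostname resolves to, or None on DNS failure.

    A failure here happens before any HTTP request is ever made, which
    is why the web server's own logs show nothing at all.
    """
    try:
        infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return None

print(resolve("localhost"))             # resolves via the hosts file
print(resolve("no-such-host.invalid"))  # None - resolution failed
```

If resolution itself fails or hangs, nothing ever reaches the web server - exactly the situation a "DNS lookup timeout" report describes.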
| 5:26 am on Jul 1, 2009 (gmt 0)|
Thanks Tedster. I just saw you're at over 25k posts. Man...
The domain name is right, and of the 17 URLs, 16 were non-existent (with scrambled internal paths) and only one was a regular page.
No other files have been reported - the site has 40+ static pages.
I can only guess that a really brief DNS problem happened at the moment Google's bot was passing through.
Since the bot couldn't reach the server, my .htaccess never got the chance to 301 the wrong links - that's why those awful URLs showed up (again).
The regular link is actually one that is disallowed via robots, but again, since the server couldn't be reached, the link was reported as "timed out".
That's the only good explanation I could come up with - thanks to your help.
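For what it's worth, the robots rule itself is easy to sanity-check offline with Python's urllib.robotparser - the paths below are placeholders for my real ones:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed the rules directly instead of fetching robots.txt over the
# network; "/private-page.html" is a placeholder path.
rp.parse([
    "User-agent: *",
    "Disallow: /private-page.html",
])

print(rp.can_fetch("Googlebot", "/private-page.html"))  # False
print(rp.can_fetch("Googlebot", "/index.html"))         # True
```

Of course, a disallow only stops the crawl request itself - it can't stop GWT from reporting a DNS failure, since that happens before robots.txt could even be fetched.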
| 5:03 pm on Jul 1, 2009 (gmt 0)|
I saw this problem in Webmaster Tools yesterday too.
I wonder if this is our problem or Google's.
I am with a UK host <snip>. How about you?
If we are with different hosts, then it would suggest the problem is at Google's end.
[edited by: Robert_Charlton at 6:16 am (utc) on July 3, 2009]
| 5:29 pm on Jul 1, 2009 (gmt 0)|
< US host >
[edited by: Robert_Charlton at 6:17 am (utc) on July 3, 2009]
[edit reason] removed specifics [/edit]
| 7:26 pm on Jul 1, 2009 (gmt 0)|
Maybe it is Google then. I just looked now and the DNS problems are gone from GWT.
| 11:45 pm on Jul 1, 2009 (gmt 0)|
Gone from GWT on my side, too.