
"Fetch as googlebot" is getting stuck in an auto-refresh

2:02 pm on July 4, 2011 (gmt 0)

Full Member

5+ Year Member

joined:Dec 27, 2006
posts:341
votes: 0


I don't know if anybody else is having this problem, but I just checked one of my URLs using "Fetch as Googlebot" in WMT. Instead of the page loading with a new status like normal, it is stuck in an auto-refresh. I have logged out and logged back in (in different browsers) and I get the same problem: the page just keeps auto-refreshing. I have never seen this before. Has anybody else seen it?
3:33 pm on July 4, 2011 (gmt 0)

Full Member

5+ Year Member

joined:Dec 27, 2006
posts:341
votes: 0


After 45 minutes I went back in, and a red X with "failed" next to it was shown.
6:58 pm on July 4, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:Apr 9, 2011
posts:12719
votes: 244


Is it just the one page or your whole site? Does it show up normally if you browse it yourself? I just tried a couple pages of mine with no issues. (And what do they mean "May take a few minutes"? All the googlebots are on break but they'll get to it when they come back?) Can you look at the HEAD by itself? I thought there was a way to get at it in GWT, but can't find it now.


Er, google, those are not "duplicate title tags", they are the same page and you know it because the old page has been 301'd from the moment I made the change. Not one second of overlap. Sheesh.
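On the question of looking at the HEAD by itself: you can do that outside GWT entirely with a short script that issues a HEAD request and returns the status line and response headers. This is a minimal sketch in Python under my own assumptions (the URL is a placeholder, and it shows what your server sends to *this* script, not necessarily what it sends to Googlebot if responses vary by user agent):

```python
import http.client
from urllib.parse import urlparse

def head(url):
    """Issue a HEAD request and return (status_code, headers) with no body."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.hostname, parts.port or 80,
                                      timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()  # HEAD responses carry headers only
        return resp.status, dict(resp.getheaders())
    finally:
        conn.close()

# Example usage (placeholder URL, assumes network access):
#   status, headers = head("http://www.example.com/some-page")
#   print(status, headers.get("Content-Type"))
```

Plain HTTP only here; for an https URL you would swap in `http.client.HTTPSConnection` and port 443.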
9:12 pm on July 4, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 9, 2001
posts:1307
votes: 0


I'm also having a problem with Fetch as Googlebot, but not the same one as yours. In my case, it says I've exceeded my weekly quota, even though I haven't used it for at least two or three weeks. Looks like something is buggy.

10:06 pm on July 4, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


There's a load of glitches in WMT.

One that comes up regularly is a report that some URLs are blocked by robots.txt. The URLs were blocked many weeks ago, but are not blocked any longer. Google should know that, as they pull the robots.txt file every few hours; indeed, WMT says they last fetched it 5 hours ago.

The problem that bugs me the most at the moment is the list of URLs that are supposedly returning 404 Not Found. No. They. Are. NOT. Most of the URLs in that list actually return either a 410 Gone or 301 Moved Permanently response. Why can't Google report the correct response here?
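If you want to double-check what a URL really returns, independent of what the WMT report claims, you can request it without following redirects, so a 301 surfaces as a 301 rather than as the status of the destination page. A minimal sketch in Python (the URLs in the usage comment are placeholders):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects, so 3xx responses surface as-is."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None leaves the 3xx response unhandled

def real_status(url):
    """Return the raw HTTP status code the server sends for this URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code  # 301, 404, 410, etc. all land here

# Example usage (placeholder URLs):
#   for url in ["http://www.example.com/old-page",
#               "http://www.example.com/removed-page"]:
#       print(url, real_status(url))
```

Run that over the URLs in the "Not Found" list and you can see for yourself whether they really return 404 or, as above, 410 Gone or 301 Moved Permanently.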