"Fetch as googlebot" is getting stuck in an auto-refresh


c41lum

2:02 pm on Jul 4, 2011 (gmt 0)

5+ Year Member



I don't know if anybody else is having this problem, but I just checked one of my URLs using "Fetch as Googlebot" in WMT. Instead of the page loading with a new status like normal, it is stuck in an auto-refresh. I have logged out and logged back in (in different browsers) and I get the same problem: the page just keeps auto-refreshing. I have never seen this problem before; has anybody else seen this?

c41lum

3:33 pm on Jul 4, 2011 (gmt 0)

5+ Year Member



After 45 minutes I went back in, and a red X with "failed" next to it was shown.

lucy24

6:58 pm on Jul 4, 2011 (gmt 0)

WebmasterWorld Senior Member, Top Contributor of All Time, Top Contributor of the Month



Is it just the one page, or your whole site? Does it show up normally if you browse it yourself? I just tried a couple of pages of mine with no issues. (And what do they mean, "may take a few minutes"? All the googlebots are on break but they'll get to it when they come back?) Can you look at the HEAD by itself? I thought there was a way to get at it in GWT, but I can't find it now.
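As an aside: one way to look at just the headers outside GWT is a plain HTTP HEAD request. A minimal Python sketch, using the standard library; the URL is a placeholder:

```python
# Minimal sketch: send an HTTP HEAD request and print the status line
# and response headers. The URL below is a placeholder.
import urllib.request

req = urllib.request.Request("http://example.com/page.html", method="HEAD")
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.reason)
    for name, value in resp.getheaders():
        print(f"{name}: {value}")
```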


Er, google, those are not "duplicate title tags"; they are the same page, and you know it, because the old page has been 301'd from the moment I made the change. Not one second of overlap. Sheesh.

Reno

9:12 pm on Jul 4, 2011 (gmt 0)

WebmasterWorld Senior Member, 10+ Year Member



I am also having a problem with Fetch as GB, but not the same as yours. In my case, it says I've exceeded my weekly quota, even though I haven't used it for at least two or three weeks. Looks like something is buggy.


g1smd

10:06 pm on Jul 4, 2011 (gmt 0)

WebmasterWorld Senior Member, 10+ Year Member, Top Contributor of All Time, Top Contributor of the Month



There's a load of glitches in WMT.

One that comes up regularly is a report that some URLs are blocked by robots.txt. The URLs were blocked many weeks ago, but they are not blocked any longer. Google should know that, as they pull the robots.txt file every few hours; indeed, WMT says they last fetched it 5 hours ago.
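For anyone who wants to verify what the live robots.txt actually allows, here is a quick sketch using Python's standard-library robotparser (example.com and the path are placeholders):

```python
# Sketch: ask the *current* robots.txt whether Googlebot may fetch a
# URL, independent of whatever WMT has cached. example.com and the
# path below are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser("http://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

print(rp.can_fetch("Googlebot", "http://example.com/some/page"))
```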

The problem that bugs me the most at the moment is the list of URLs that are supposedly returning 404 Not Found. No. They. Are. NOT. Most of the URLs in that list actually return either a 410 Gone or 301 Moved Permanently response. Why can't Google report the correct response here?
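To see the real status code for yourself, request the URL without following redirects, so a 301 or 410 is reported as itself rather than as whatever the redirect target returns. A rough Python sketch; the URLs are placeholders:

```python
# Sketch: print the real HTTP status of URLs that WMT claims are 404,
# without following redirects. The URLs below are placeholders.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None stops urllib from following 3xx

opener = urllib.request.build_opener(NoRedirect)

for url in ("http://example.com/old-page", "http://example.com/gone-page"):
    try:
        print(url, opener.open(url).status)
    except urllib.error.HTTPError as e:
        # Redirects and errors both land here; e.code is the raw status.
        print(url, e.code)
```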
 
