I don't know if anybody else is having this problem, but I just checked one of my URLs using "Fetch as Googlebot" in WMT. Instead of the page loading with a new status like normal, it is stuck in an auto-refresh loop. I have logged out and logged back in (in different browsers) and I get the same problem: the page just keeps auto-refreshing. I have never seen this problem before. Has anybody else seen this?
Is it just the one page or your whole site? Does it show up normally if you browse it yourself? I just tried a couple of pages of mine with no issues. (And what do they mean by "May take a few minutes"? All the googlebots are on break but they'll get to it when they come back?) Can you look at the HEAD by itself? I thought there was a way to get at it in GWT, but I can't find it now.
Er, google, those are not "duplicate title tags", they are the same page and you know it because the old page has been 301'd from the moment I made the change. Not one second of overlap. Sheesh.
I am also having a problem with Fetch as GB, but not the same as you. In my case, they are saying that I've exceeded my weekly quota, even though I've not used it for at least 2 or 3 weeks. Looks like something is buggy.
One that comes up regularly is a report that some URLs are blocked by robots.txt. The URLs were previously blocked, many weeks ago, but are not blocked any longer. Google should know that, as they pull the robots.txt file every few hours. Indeed WMT says they last had it 5 hours ago.
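If you want to double-check what your current robots.txt actually blocks, rather than relying on what WMT reports, Python's standard-library robot parser can test a URL against the rules directly. This is just a minimal sketch; the rules and paths below are made up for illustration, so substitute your own robots.txt and URLs.

```python
# Check what the *current* robots.txt allows, independently of WMT's
# (possibly stale) report. The rules and paths here are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A path under /private/ is blocked; everything else is allowed.
print(rp.can_fetch("Googlebot", "/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "/blog/post.html"))     # True
```

In practice you would call `rp.set_url("https://example.com/robots.txt")` and `rp.read()` to fetch the live file, which tells you what a crawler honoring it *right now* should see.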
The problem that bugs me the most at the moment is the list of URLs that are supposedly returning 404 Not Found. No. They. Are. NOT. Most of the URLs in that list actually return either a 410 Gone or 301 Moved Permanently response. Why can't Google report the correct response here?
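For anyone who wants to confirm the real status code a URL returns (410 vs. 404 vs. 301) without trusting any third-party report, here is a rough sketch using only Python's standard library. The little local server and its paths are hypothetical stand-ins for a site with retired URLs; the key part is the HEAD request with redirects disabled, so you see the 301 itself instead of the page it points to.

```python
# Verify the actual HTTP status a URL returns. The demo server and its
# /gone and /old paths are hypothetical; point check_status() at your
# own URLs in practice.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class DemoHandler(BaseHTTPRequestHandler):
    # Stand-in for a site whose old URLs were deliberately retired.
    def do_HEAD(self):
        if self.path == "/gone":
            self.send_response(410)              # 410 Gone, not 404
        elif self.path == "/old":
            self.send_response(301)              # 301 Moved Permanently
            self.send_header("Location", "/new")
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):                # silence request logging
        pass

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Refuse to follow 3xx, so the redirect status itself surfaces.
    def redirect_request(self, *args, **kwargs):
        return None

def check_status(url):
    """Return the status code of a HEAD request, redirects not followed."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, method="HEAD")
    try:
        return opener.open(req).status
    except urllib.error.HTTPError as e:
        return e.code                            # 3xx/4xx/5xx land here

server = ThreadingHTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

print(check_status(f"{base}/gone"))  # 410
print(check_status(f"{base}/old"))   # 301
server.shutdown()
```

Something like `check_status("https://example.com/some-old-page")` on each URL in the report would show whether the server is really sending 404 or, as described above, 410 or 301.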