| 3:01 am on Jan 20, 2012 (gmt 0)|
I was about to make a post about it; a pleasant change. Link data was updated too, both external and internal. New (old) weird errors in WMT, which as usual I'd guess means a new data set.
| 6:19 am on Jan 20, 2012 (gmt 0)|
What's a bit annoying is that it always defaults back to 10 rows of data.
| 9:48 am on Jan 20, 2012 (gmt 0)|
I'm going to have problems with that new arrow. Every time I see it I think I'm looking at the Unicode Replacement Character-- the black one with the question mark-- and I think they're saying they couldn't find my favicon. Doesn't help that the arrows periodically disappear, either. (They did while I was composing this post.)
Defaulting to 10 rows is nothing new. The "search queries" area never remembers your previous settings either. Good thing the search engine itself still does ;)
Wish they'd say where all those unfound or roboted-out pages come from. There are things on my list that aren't linked from anywhere.
| 4:14 am on Jan 24, 2012 (gmt 0)|
Just an update: This layout/style seems to have been rolled out to other parts of Webmaster Tools including "Search Queries". The Search section also now defaults to only 10 rows.
| 10:33 pm on Jan 25, 2012 (gmt 0)|
I really don't like having to tab through all the sitemaps. We broke our site down into smaller sections for Google, but now we have to page through to check their status. I liked seeing them in a list.
| 10:56 pm on Jan 26, 2012 (gmt 0)|
There has also been an update to the sitemap section under site configuration with graphs being added for the number of URLs submitted versus indexed.
| 7:07 am on Jan 31, 2012 (gmt 0)|
I am a bit annoyed at Google. The Webmaster Tools team modified the Sitemaps section, which is great and it's all a little neater, but what did they do with my data?
My sitemap.xml.gz file has 14,000 child sitemaps to cover our entire website. However, Google is only showing 200. What happened to the rest? They were there last week.
Google has somehow capped it at 200, and I can't browse beyond that.
Anyone else experience this problem?
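For anyone unfamiliar with the setup being described: a sitemap index is just an XML file listing the URLs of the child sitemaps, which is how thousands of them hang off one `sitemap.xml.gz`. A minimal sketch of generating one (filenames and domain are hypothetical, not from the post above):

```python
# Build a sitemap index that references many child sitemaps,
# per the sitemaps.org protocol. Example URLs are made up.
from xml.sax.saxutils import escape

def build_sitemap_index(child_urls):
    """Return sitemap index XML listing each child sitemap URL."""
    entries = "\n".join(
        "  <sitemap><loc>%s</loc></sitemap>" % escape(url)
        for url in child_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</sitemapindex>\n"
    )

# e.g. 14,000 gzipped child sitemaps referenced from one index file
children = ["http://www.example.com/sitemap-%05d.xml.gz" % i
            for i in range(1, 14001)]
index_xml = build_sitemap_index(children)
print(index_xml.count("<sitemap>"))  # 14000
```

All 14,000 entries live in the index file itself, so a 200-entry cap in the WMT display would be purely a UI limit, not anything in the sitemap.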
| 6:40 pm on Jan 31, 2012 (gmt 0)|
The Fetch as Googlebot option is kind of interesting; I don't remember it having "Fetches remaining" and "URLs remaining" counts before. It looks like a promise to index/reindex a URL, and the pages linked from it, if they can fetch it, with a relatively low limit (500 fetches / 10 URLs).
Then they show a page I 301'd away a couple of days ago as having been fetched and indexed, along with the date and time they did it, even though I didn't request it.