
Google tweaking Webmaster Tools

   
2:43 pm on Feb 14, 2012 (gmt 0)




They've changed the graphs, and the number of keywords shown under Search Queries has gone from 10 to 25. The graph looks rather unprofessional, especially where it reaches a peak: the top of the peak is simply cut off.

Functionality in this section has been dumbed down: you now have to switch from the "Basic" listing to the "With change" listing to get the data you had before. As with other Google services, advanced features have either disappeared altogether or been moved.

Is this the same change Analytics has gone through? I stopped using that service 8 months ago.
7:22 pm on Feb 14, 2012 (gmt 0)

tedster



I can tell you that Analytics has gone through a serious reorganization. I'm still getting my "sea legs", but my early impression is that on the surface it seems dumbed down, but there are new possibilities that, once learned, will make it more useful.
8:54 pm on Feb 14, 2012 (gmt 0)

tedster



I just learned about one of the new Webmaster Tools features. After using "Fetch as googlebot" you can choose to submit the newly fetched page directly to Google's index. That can help when an urgent change really needs to replace the previously indexed version of the content. You can also ask Google to update all the pages that the crawled page links to!

Video - Submit to index via Webmaster Tools [youtube.com]
Help page - Ask Google to crawl a page or site [support.google.com]
9:07 pm on Feb 14, 2012 (gmt 0)

g1smd



That feature has been there a while, perhaps a couple of months or so.

What does frustrate me is having to wait a whole day for Google to catch up. I recently made an error in a robots.txt file and fixed it a few minutes later, after testing in WMT, but I still had to wait 18 hours or more for Google to re-read the file. There was no quick way to say "hey, I fixed that problem, go check the file again".
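The post doesn't say what the actual mistake was, but as a hypothetical illustration of how small a robots.txt error can be while still blocking an entire site until Google re-reads the file:

```
# Broken: a stray trailing slash-only rule disallows everything
User-agent: *
Disallow: /

# Intended: only keep crawlers out of one directory
User-agent: *
Disallow: /private/
```

Testing the file in WMT catches this immediately, but as noted above, Googlebot may not fetch the corrected file again for many hours.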

Saving reports from WMT is a pain in the neck. Some files are called sitename-date-reporttype while others are sitename-reporttype-date or reporttype-date-sitename and so on. There's little or no consistency.
9:31 pm on Feb 14, 2012 (gmt 0)




@g1smd, you beat me to it; more than a couple of months, I think.

What is really missing is a way to exclude files like www.example.com/mypage.htm?something=x from the index by just excluding mypage.htm.
Or have I missed something ....
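For what it's worth, one thing that may cover this case: Googlebot honours wildcard patterns in robots.txt, so crawling of the query-string variants can be blocked while the base page stays crawlable. (Note this blocks crawling only; it doesn't remove URLs already in the index, and mypage.htm is an example name from the post above.)

```
User-agent: Googlebot
# Blocks /mypage.htm?something=x and any other
# query-string variant, but not /mypage.htm itself
Disallow: /mypage.htm?
```

The "URL parameters" settings in Webmaster Tools are another route for telling Google to ignore particular parameters site-wide.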
10:05 pm on Feb 14, 2012 (gmt 0)

lucy24



Do you mean all pages named "mypage.htm" wherever they happen to occur? Or just one specific page?

:: quick detour to wmt ::

Whew. I thought you meant the "url removals" tab under "crawler access" under "site configuration" was gone.

Now let's see what happens to my Keyword list if I nudge them to re-index nnn & nnn, formerly one of my fattest pages.
 
