Forum Moderators: Robert Charlton & goodroi


Why I Recommend Google Webmaster Tools


tedster

6:16 pm on Jun 3, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



When Google first offered Webmaster Tools, my reaction was "no way - they've already got too much data on me." But as moderator of this forum, I felt I needed to be familiar with what other members here were talking about. So I verified a small site just to test the water. Over time, I began to see real value, and eventually I began to recommend it to clients.

With Google owning so much of the search market, knowing exactly how Google sees a website can be critical for success in organic search. Standard SEO analysis can only go so far without the data Google freely offers from its own experience indexing a site. WMT is really a service, in the best sense of the word.

  1. There is no longer any such thing as a single ranking position on Google - as we know, Google customizes the SERPs for different users, even within the same city and at the same moment. Google is also suspected of returning false, "red herring" data to automated rank checking software. Then again, automated rank checking is against their guidelines, so fair warning given, I suppose.

    Only with Webmaster Tools can we see the full range of ranking positions for any keyword - and that report is amazing in its freely shared detail, especially when you drill down.

  2. WMT also shows click-through rates (CTR) for different ranking positions. Without seeing Google's data, there is no way to know how many impressions a site's URLs get in organic search.

    Yes, we can see clicks in our server logs, but not total impressions. And fixing a low CTR can be essential for getting the most out of any ranking. This information alone has convinced many a reluctant webmaster to open a Webmaster Tools account with Google.

  3. Google uses site speed as a ranking factor - and that data is gathered from Google toolbar users. There is no good way to gather that precise data (or to prioritize improvements) except by using Google Webmaster Tools.

  4. If there are any crawl errors (and every site has some), Google Webmaster Tools highlights what they are and presents them in an easy, highly usable way.

  5. Many sites today have been deviously hacked, and their owners can't see it. That is because the parasite payload of the hack is "cloaked" and only visible to googlebot. Such hacking is epidemic today -- an unfortunate reality of the web's aggressively lawless dark side. Webmaster Tools allows us to "fetch as googlebot" - to see any page exactly as it is presented to Google's own spider. (A rough do-it-yourself version of this check is sketched just after this list.)

  6. Google has their own specific criteria for length and uniqueness of page titles and meta descriptions - and Webmaster Tools gives feedback about any problems in this area. Especially with large sites, an independent spidering and analysis of these factors can be quite problematic. Google makes it easy to know exactly what to fix and in what priority.

  7. The links that point to any given page, both within our own site and from external sites, are a critical ranking factor - and knowing what they are is essential. While we can pay for third-party spidering data, the free information in WMT can be key in cutting through the clutter.

  8. Webmaster Tools will also show links that point to non-existent URLs on a site. If the URL doesn't exist, that link no longer helps in ranking. But if we know about it, we can take action - typically a 301 redirect to a relevant live page - to "reclaim" that ranking power.
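Following up on point 5: a rough do-it-yourself approximation of "fetch as googlebot" is to request the same page with two different User-Agent strings and compare the responses. This is only a sketch - the URL and user-agent strings below are placeholders, and it will only catch cloaking keyed to the user-agent, not cloaking keyed to Google's IP ranges. The real "Fetch as Googlebot" inside Webmaster Tools remains the authoritative check, since it comes from Google's own crawler.

# Rough check for user-agent based cloaking: fetch the same URL once with a
# browser User-Agent and once with Googlebot's, then diff the two responses.
# Note: this will NOT catch cloaking keyed to Google's IP addresses.
import difflib
import urllib.request

URL = "http://www.example.com/"  # placeholder - put the page you want to check here

BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1; rv:2.0) Gecko/20100101 Firefox/4.0"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url, user_agent):
    """Return the response body for url, requested with the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

browser_html = fetch(URL, BROWSER_UA)
googlebot_html = fetch(URL, GOOGLEBOT_UA)

if browser_html == googlebot_html:
    print("Both responses are identical - no user-agent cloaking detected.")
else:
    # Print a unified diff so any injected links or scripts stand out.
    diff = difflib.unified_diff(browser_html.splitlines(),
                                googlebot_html.splitlines(),
                                fromfile="browser", tofile="googlebot", lineterm="")
    for line in list(diff)[:40]:
        print(line)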

These are some of the more powerful reasons that any SEO analysis of a website benefits from access to Webmaster Tools. Some of the data Google offers can be critical and is not available any other way. Some of the data is potentially available through other avenues, but only at the cost of a major investment of resources.

As I've mentioned earlier, a Webmaster Tools account is qualitatively different from an Analytics or AdWords account. Webmaster Tools data is maintained by Google at all times as part of operating their search engine, and so it needs no time for a history to accumulate. It is a very helpful gesture from Google to share their data in this way - and it is Google's own data, not the individual website's server data or anything along those lines.

Especially when a site migrates from a legacy version to a new launch, monitoring Webmaster Tools for early warning signals of technical trouble can go a long way toward ensuring that the migration goes smoothly and that traffic loss is minimized. Fixing the legacy site's issues creates a solid platform for the migration, and fixing the new site's issues removes barriers quickly, before they become widespread problems.
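As one concrete example of that early monitoring - and this is my own illustration, not something Webmaster Tools hands you directly - if you keep a mapping of legacy URLs to their new locations, a small script can confirm that each old URL answers with a permanent (301) redirect to the right new page. The URL pairs below are placeholders:

# Minimal migration sanity check: confirm each legacy URL answers with a
# 301 redirect pointing at its intended new URL. The mapping is only an
# illustration - substitute your own old -> new URL pairs.
import urllib.error
import urllib.request

REDIRECT_MAP = {
    "http://www.example.com/old-page.html": "http://www.example.com/new-page/",
    "http://www.example.com/old-section/": "http://www.example.com/new-section/",
}

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the first response can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise HTTPError carrying the redirect code

opener = urllib.request.build_opener(NoRedirect)

for old_url, expected in REDIRECT_MAP.items():
    try:
        resp = opener.open(old_url)
        print(f"{old_url}: got {resp.status}, expected a 301 redirect")
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        if err.code == 301 and location == expected:
            print(f"{old_url}: OK (301 -> {location})")
        else:
            print(f"{old_url}: {err.code} -> {location or 'no Location header'} - check this one")

Anything that turns up as a 404 or a temporary 302 in a check like this tends to show up later as crawl errors and lost rankings, so catching it early is the whole point.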

gouri

6:31 pm on Jun 3, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I wanted to say thank you for taking the time to discuss the excellent features that Google Webmaster Tools offers.

I find the data that is available from GWT to be very helpful.

londrum

7:14 pm on Jun 3, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



it would be better if it was accurate. half of it seems to be a few months out of date. when mayday hit i had a few hundred crawling errors from pages which hadn't existed for a year. it was like they resurrected errors from 12 months ago.
the speed test bit seems to update very slowly too. it suggested once that i remove or combine a javascript, so i removed it, ages ago, but it's still showing as a suggestion now, even though it no longer exists. so how do you know whether the speed info graph is accurate? it can't be. plus you've got all the scripts from google's own stuff like adsense and site-search which it flags up as slow, when apparently that's an error because they're really gzipped. another thing that's wrong.

if both of those are out of date, then how do you know all the keyword and link info is accurate? they might be a few months out of date as well.
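
(for what it's worth, one quick way to double-check whether a script really is served gzipped is to ask for gzip and look at the content-encoding header that comes back - the url below is just a placeholder:)

# quick check of whether a script is really served gzipped: request it with
# an Accept-Encoding: gzip header and print the Content-Encoding response header.
# the url is only a placeholder - put the script you are curious about here.
import urllib.request

SCRIPT_URL = "http://www.example.com/"

req = urllib.request.Request(SCRIPT_URL, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    print("Content-Encoding:", resp.headers.get("Content-Encoding", "(none)"))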

[edited by: londrum at 7:22 pm (utc) on Jun 3, 2010]

amythepoet

7:20 pm on Jun 3, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How do I find crawling errors?

thank you

tedster

7:31 pm on Jun 3, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



it would be better if it was accurate

Yup!

How do I find crawling errors?

The section is very logically named "Crawl Errors". There's a summary on the Dashboard for any verified site, and a detailed drill-down in the +Diagnostics area.

----

I'm currently having problems with sites where someone else owns and has verified the site and then authorized me. In short, I still get no access. This is a relatively new feature and a VERY good idea. I hope the bugs are worked out soon.