When Google first offered Webmaster Tools, my reaction was "no way - they've already got too much data on me." But as moderator of this forum, I felt I needed to be familiar with what other members here were talking about. So I verified a small site just to test the water. Over time, I began to see real value, and eventually I began to recommend it to clients.
With Google owning so much of the search market, knowing exactly how Google sees a website can be critical for success in organic search. Standard SEO analysis can reveal only so much without the data that Google freely offers from its experience indexing a site. WMT is really a service, in the best sense of the word.
- There is no longer any such thing as a single ranking position on Google - as we know, Google customizes the SERPs for different users, even within the same city and at the same time. Google is also suspected of returning false, "red herring" results to automated rank-checking software. Then again, such software is against their guidelines, so fair warning given, I suppose.
Only with Webmaster Tools can we see the full range of ranking positions for any keyword - and that report is amazing in its freely shared detail, especially when you drill down.
- WMT also shows click-through rates (CTR) for different ranking positions. Without seeing Google's data, there is no way to know how many impressions a site's URLs get in organic search.
Yes, we can see clicks in our server logs, but not total impressions. And fixing a low CTR can be essential for getting the most out of any ranking. This information alone has convinced many reluctant webmasters to open a Webmaster Tools account with Google.
- Google uses site speed as a ranking factor - and that data is gathered from Google toolbar users. There is no good way to gather that precise data (or to prioritize improvements) except by using Google Webmaster Tools.
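As a rough cross-check between WMT reports, raw server response time can be spot-checked with a short script - though it is no substitute for WMT's data, which reflects full page loads (images, scripts, rendering) as real toolbar users experience them. This is only a sketch, and the function names are my own:

```python
import time
import urllib.request

def response_time(url, tries=3, fetch=None):
    """Median time (in seconds) to download a URL's raw HTML over
    several tries. Note: this measures server response only --
    Google's site-speed data covers the full page load as
    experienced by real users."""
    if fetch is None:
        # Default fetcher; a custom one can be injected for testing.
        fetch = lambda u: urllib.request.urlopen(u, timeout=30).read()
    samples = []
    for _ in range(tries):
        start = time.perf_counter()
        fetch(url)
        samples.append(time.perf_counter() - start)
    # Median is less noisy than a single measurement.
    return sorted(samples)[len(samples) // 2]
```

Running this before and after a change is a quick way to confirm that a server-side fix actually moved the needle, before waiting for WMT's charts to update.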
- If there are any crawl errors (and every site has them) Google Webmaster Tools highlights what they are, and presents them in an easy and highly usable way.
- Many sites today have been deviously hacked, but their owners can't see it, because the parasite payload of the hack is "cloaked" - visible only to googlebot. Such hacking is epidemic today -- an unfortunate reality of the web's aggressively lawless dark side. Webmaster Tools allows us to "fetch as googlebot" - to see any page exactly as it is presented to Google's own spider.
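A crude version of this check can be scripted, though it only catches the simplest cases - sophisticated hacks verify Googlebot's IP range, not just the User-Agent header, which is exactly why WMT's own fetch-as-googlebot remains the authoritative view. The function names below are my own; a minimal sketch in Python:

```python
import re
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch a page while claiming a particular User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def suspicious_links(browser_html, bot_html):
    """Return outbound links served only to the bot -- a common
    sign of a cloaked spam payload."""
    link = re.compile(r'href=["\'](https?://[^"\']+)', re.I)
    browser_links = set(link.findall(browser_html))
    bot_links = set(link.findall(bot_html))
    return bot_links - browser_links

# Hypothetical usage:
# diff = suspicious_links(fetch("http://example.com/", BROWSER_UA),
#                         fetch("http://example.com/", GOOGLEBOT_UA))
```

Any link that appears only in the googlebot version of the page is worth investigating immediately.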
- Google has their own specific criteria for length and uniqueness of page titles and meta descriptions - and Webmaster Tools gives feedback about any problems in this area. Especially with large sites, an independent spidering and analysis of these factors can be quite problematic. Google makes it easy to know exactly what to fix and in what priority.
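For a small site, a rough local check of the same factors can be scripted - bearing in mind that Google actually truncates titles and descriptions by pixel width, not character count, so the character cutoffs below are only my approximations, and the names are my own:

```python
from html.parser import HTMLParser

# Rough cutoffs -- Google truncates by pixel width, so these
# character counts are only an approximation.
TITLE_MAX = 60
DESC_MAX = 155

class TitleMetaParser(HTMLParser):
    """Collect the <title> text and the meta description of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "description":
                self.description = a.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of title/description problems for one page."""
    p = TitleMetaParser()
    p.feed(html)
    problems = []
    if not p.title.strip():
        problems.append("missing title")
    elif len(p.title) > TITLE_MAX:
        problems.append("title may be truncated")
    if not p.description.strip():
        problems.append("missing meta description")
    elif len(p.description) > DESC_MAX:
        problems.append("description may be truncated")
    return problems
```

Run over every page, this flags the same broad categories WMT reports - but for a large site, letting Google do the crawling is far more practical.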
- The links that point to any given page, both from within our own site and from external sites, are a critical ranking factor - so knowing what they are matters. While we can pay for private spidering data, WMT's free information can be key to cutting through that clutter.
- Webmaster Tools will also show links that point to non-existent URLs on a site. If the URL doesn't exist, that link no longer helps in ranking. But if we know about it, we can take action to "reclaim" that ranking power.
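Acting on that report can be partially automated: take the link targets exported from WMT, check which ones now return 404, and 301-redirect each to the closest live page. A minimal sketch (the function names are my own):

```python
import urllib.request
import urllib.error

def status_of(url):
    """Return the HTTP status code for a URL."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise HTTPError; the code is on the exception.
        return e.code

def reclaimable(link_targets, status_fn=status_of):
    """Link targets that now 404 -- candidates for a 301 redirect
    to the closest live page, reclaiming their link value."""
    return [u for u in link_targets if status_fn(u) == 404]
```

Each URL this returns is an opportunity: a 301 redirect to the most relevant live page recovers link value that is otherwise simply lost.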
These are some of the more powerful reasons that any SEO analysis of a website benefits from access to Webmaster Tools. Some of the data Google offers can be critical and is not available in any other way. Some of the data is potentially available through other avenues, but only at the expense of a major investment in resources for the study.
As I've mentioned earlier, a Webmaster Tools account is qualitatively different from an Analytics or AdWords account. Webmaster Tools data is maintained by Google as part of operating their search engine, so it needs no time for a history to accumulate. It is a very helpful gesture from Google to share their data in this way - and it is Google's own data, not the individual website's server data or anything along those lines.
Especially when a site migrates from a legacy version to a new launch, monitoring Webmaster Tools for early warning signs of technical trouble can go a long way toward ensuring that the migration progresses smoothly and traffic loss is minimized. Fixing the legacy site's issues creates a solid platform for the migration, and fixing the new site's issues removes barriers quickly, before they become widespread problems for the new site.