On the other hand, Google Webmaster Tools shows that the site has 1000 pages indexed. This has been relatively steady through the same period.
Overall Google traffic is down, while ranking on key terms is the same, which is why I'm concerned the site: command might be right.
Which should I believe, Google Webmaster Tools or the site: command? Or is there an even better way to determine whether my pages are gone?
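One more grounded check than either number is your own sitemap: count the URLs you actually publish, then spot-check a sample of them against the index by hand. A minimal sketch, with an invented sitemap snippet standing in for your real sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Invented sitemap snippet; in practice, read your real sitemap.xml instead.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/page-1.html</loc></url>
  <url><loc>http://www.example.com/page-2.html</loc></url>
</urlset>"""

# The sitemaps.org namespace used by standard sitemap files.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs declared in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
print(len(urls))  # the number of pages you actually publish
```

That published-URL count is the only number you fully control; both the `site:` count and the Webmaster Tools count can then be judged against it rather than against each other.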
The site: count is flaky and changes daily. I have been losing pages for the past year with absolutely no change in my Google referrals. It started with 9,000 showing in regular Google and is now down to 1,950.
Could this be a new method of throttling? I only ask because I see NO changes in my referral stats from Google, either on my stat server or in Analytics: the same number of visitors from Google searches, the same pages hit via Google searches, same everything. Meanwhile, overall monthly visits from all sources have grown from around 60,000 uniques to 100,000 over the past year, and I average 3 new users signing up per day. Mmmmmm, not sure what that's all about. Google keeps sending me the same traffic every month, yet overall traffic grew by 40% without any help from G. The ol' pie chart on my stat server shows G as a smaller slice every month. For me, Bing has really been picking things up.
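That shrinking-pie-chart observation can be reproduced straight from raw referrer logs rather than eyeballed from a chart. A rough sketch, with invented referrer URLs standing in for a month of log entries:

```python
from collections import Counter
from urllib.parse import urlparse

# Invented referrer URLs standing in for a month of log entries.
referrers = [
    "http://www.google.com/search?q=widgets",
    "http://www.bing.com/search?q=widgets",
    "http://www.google.com/search?q=blue+widgets",
    "http://twitter.com/somebody/status/1",
]

def search_share(referrers, engines=("google", "bing")):
    """Percentage of total referrals attributable to each search engine."""
    hits = Counter()
    for ref in referrers:
        host = urlparse(ref).hostname or ""
        for engine in engines:
            if engine in host:
                hits[engine] += 1
    total = len(referrers)
    return {e: 100.0 * hits[e] / total for e in engines}

print(search_share(referrers))  # {'google': 50.0, 'bing': 25.0}
```

Run monthly over real logs, a falling `google` percentage alongside a flat absolute Google count would confirm exactly the pattern described above: Google holding steady while the rest of the pie grows.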
All the datacenters show either 22,000 or 23,000 pages.
So obviously a huge increase today.
No change in positions for key terms.
We'll see how it translates into traffic.
Every day or two we seem to lose more pages (maybe a few hundred at a time) from the site: operator count, while Webmaster Tools stays pretty steady; in fact, we have gained some pages there.
We are not sure which to trust here.
>> Should I believe site: command or Webmaster Tools?
I trust neither. The site: operator shows you the currently "relevant" results for that particular search query. Webmaster Tools shows data for various reports that (I'm convinced) is processed in an entirely different way from that used for organic search results.
In the old days, operators like site:, link: and even general keyword searches showed you everything that matched a particular query. Now, they return only the results Google thinks are relevant to that query.
This is of particular significance if you attempt to use numbers returned in single Google searches for statistical purposes. Unless you understand the context, the data is totally unreliable.
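If you log these counts daily anyway, one way to get something trend-like out of them is to treat any single day's figure as noise and smooth over a window. A sketch of a rolling median over invented daily `site:` counts (the window size of 3 is an arbitrary choice):

```python
from statistics import median

# Invented daily result counts from a site: query; the 9,000 is a
# one-day glitch of the kind the thread describes.
daily_counts = [23000, 22000, 9000, 23000, 22000, 21000, 22000]

def rolling_median(values, window=3):
    """Median over a sliding window; damps one-day glitches."""
    return [median(values[i:i + window])
            for i in range(len(values) - window + 1)]

print(rolling_median(daily_counts))  # [22000, 22000, 22000, 22000, 22000]
```

The median absorbs the one-day collapse to 9,000 entirely, which is the point: a trend read this way only moves when the count stays down for most of a window, not when a single refresh returns a strange number.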
I don't like seeing the site: operator count drop, but it's not like we haven't seen it before. Sometimes I think it's just a period when Google refreshes the pages on a site; after a while sites get messy, and I wonder if it's simply a clean-up. Typically, after several weeks of losing pages, the count starts to climb again. I certainly hope that will be the case this time, but comparing the two numbers always makes me wonder and confuses me.
I like your points; there is usually something deeper going on than just saying one number is higher than the other, so the other must be broken.
Personally, while I like the data Google provides in Webmaster Tools, I take it with an extremely large pinch of salt.
In my experience, unless you can combine Google's data with another source to qualify it, the reliability is just not sufficient for reading statistical trends, even in aggregate and over a long period.
(and the new Google interface with the blue buttons doesn't even display a count of URLs - a sign of things to come?)
The WMT count is way too low. I think all those tools are broken.