Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

Should I believe site: command or Webmaster Tools?

 10:18 am on Oct 28, 2009 (gmt 0)

Over the last month, I have seen the number of pages returned using the site: command drop from 1200 to 210. The drop has been steady...I lose a handful of pages each day. They are not going supplemental, they are just gone.

On the other hand, Google Webmaster Tools shows that the site has 1000 pages indexed. This has been relatively steady through the same period.

Overall Google traffic is down, while ranking on key terms is the same, which is why I'm concerned the site: command might be right.

Which should I believe, Google Webmaster Tools or the site: command? Or is there an even better way to determine whether my pages are gone?



 3:22 pm on Oct 28, 2009 (gmt 0)

I have lost trust in the site: command. As I have continued to add pages, traffic has been up and rankings have improved, but the site: command shows fewer pages each year. Yet if I search for content on my website, I can find each and every article ever written. I think the site: command is an approximation and may not always be accurate.


 6:42 pm on Oct 29, 2009 (gmt 0)

I don't look at the site: command at all anymore. On one site, the site: command in Google shows 1,950 pages, Webmaster Tools shows 4,900 pages, and all the datacenters show either 22,000 or 23,000 pages, which is about how many pages there really are. I think the datacenters hold all the data and that data gets filtered down to regular Google. Google knows all the pages are there; it just chooses which pages are deemed more important. This always bugged me in the past, until I noticed that a lot of unique-phrase searches will show a page that is not part of the 1,950 that the site: operator returns.

The site: command is flaky and changes daily. I have been losing pages for the past year with absolutely no change in my Google referrals. Started with 9,000 showing in regular Google and am now down to 1,950.

Could this be a new method of throttling? The only reason I ask is because I see NO changes in my stats for referrals from Google... from my stat server or analytics... same number of visitors from Google searches, same pages hit by Google searches, same everything, though the overall monthly visits from all sources have gone from around 60,000 uniques per month to 100,000. This has been going on for the past year, yet I get an average of 3 new users signing up per day. Mmmmmm, not sure what that's all about. Google keeps sending me the same stuff every month, yet the overall traffic grew by 40% without any help from G. The 'ol pie chart on my stat server shows G as a smaller slice every month. For me, Bing has really been picking things up.


 6:58 pm on Oct 29, 2009 (gmt 0)

Interesting insight, webdude. Thanks.

all the datacenters show either 22,000 or 23,000 pages

How are you searching your pages by datacenter?

Bing has really been picking things up.

Great news. Bing tends to like my sites a lot more than Google. I spend a lot of effort on on-page SEO and I'm...well...pretty lazy when it comes to link building. So maybe Bing puts less importance on links.


 2:59 pm on Oct 30, 2009 (gmt 0)

I am using one of several online tools that let you search multiple datacenters all at once. Search for "google datacenter search" to get a couple of links.


 12:40 pm on Nov 12, 2009 (gmt 0)

Mmmmmm... interesting. I did a site: search on the site mentioned above this morning...

Results 1 - 100 of about 22,400 from sadfasdff (0.05 seconds)

Tried a bunch of datacenters too with the same results.
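
For anyone logging these estimates over time, the count can be pulled out of that results line with a few lines of Python. This is just a minimal sketch -- the sample line uses a placeholder domain, and the function name is made up for illustration:

```python
import re

def parse_result_count(results_line):
    """Extract Google's estimated result count from a results summary line,
    e.g. 'Results 1 - 100 of about 22,400 from example.com (0.05 seconds)'.
    Returns None if no 'of about N' estimate is present."""
    match = re.search(r"of about ([\d,]+)", results_line)
    if match:
        return int(match.group(1).replace(",", ""))
    return None

line = "Results 1 - 100 of about 22,400 from example.com (0.05 seconds)"
print(parse_result_count(line))  # 22400
```

Logging that number daily is about the only way to see the flux for yourself rather than trusting any single day's figure.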


 2:43 pm on Nov 12, 2009 (gmt 0)

One of my sites was showing approximately 200 pages if I did a site: search, with the number dropping weekly. It had been doing that for a long time. I just checked it today because of this post, and it took a huge jump at some point in the past 24 hours -- it suddenly says I have 1,500 or so, which is just about the number of pages on the site in total. No traffic change either way, though.


 3:35 pm on Nov 12, 2009 (gmt 0)

I had a similar jump as well today but results are bouncing around from high to low randomly.


 3:58 pm on Nov 12, 2009 (gmt 0)

Since most of the datacenters were/are showing the correct number of pages, I am going to assume that some data updates are being made.

Anyone here remember flux?


 4:22 pm on Nov 12, 2009 (gmt 0)

Here are the changes I have been keeping track of -- # of indexed pages, shown as Webmaster Tools / site: operator:

Oct 28:    3,123 /  1,300
Oct 29:    3,123 /  1,260
Oct 30:    3,322 /  1,250
Nov 2-3:   3,310 /  1,240
Nov 4:     3,297 /  1,280
Nov 5:     3,297 /  1,290
Nov 6:     3,297 /  1,300
Nov 9-11:  3,382 /  1,320
Nov 12:    3,382 / 10,500

So obviously a huge increase today.
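
If you're keeping a series like this, a tiny script can flag the day the site: count jumps. A rough sketch using the numbers from the table above (the function and threshold are my own invention, not anything Google provides):

```python
# (date, Webmaster Tools count, site: operator count) from the log above
history = [
    ("Oct 28", 3123, 1300),
    ("Oct 29", 3123, 1260),
    ("Oct 30", 3322, 1250),
    ("Nov 3",  3310, 1240),
    ("Nov 4",  3297, 1280),
    ("Nov 5",  3297, 1290),
    ("Nov 6",  3297, 1300),
    ("Nov 11", 3382, 1320),
    ("Nov 12", 3382, 10500),
]

def flag_jumps(series, threshold=2.0):
    """Return the dates where the site: count changed by more than
    `threshold`x versus the previous entry -- a crude flux detector."""
    jumps = []
    for (_, _, s0), (d1, _, s1) in zip(series, series[1:]):
        if s0 and max(s1 / s0, s0 / s1) > threshold:
            jumps.append(d1)
    return jumps

print(flag_jumps(history))  # ['Nov 12']
```

Day-to-day wobble of a few percent gets ignored; only an order-of-magnitude swing like Nov 12 trips the flag.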


 4:51 pm on Nov 12, 2009 (gmt 0)

Funny...I noticed a huge increase today as well and came here to find out what's going on. I went from 250 pages to 8500 pages, a large percentage of which are not supplemental. And my competitors had similarly large increases today.

No change in positions for key terms.

We'll see how it translates into traffic.


 8:33 pm on Nov 25, 2009 (gmt 0)

We have approximately 54k pages submitted via sitemap. Google Webmaster Tools says that approx 40k pages are indexed, yet the site: operator currently says 25k.

Every day or two we seem to lose a few hundred more pages via the site: operator, while the Webmaster Tools count stays pretty steady. In fact, we have gained some pages there.

We are not sure which to trust here.

Receptional Andy

 8:46 pm on Nov 25, 2009 (gmt 0)

In terms of the original question:

>> Should I believe site: command or Webmaster Tools?

I trust neither. The site: operator shows you the currently "relevant" results for that particular search query. Webmaster Tools shows data for various reports that (I'm convinced) is processed in an entirely different way from that used for organic search results.

In the old days, operators like site:, link: and even general keyword searches showed you everything that matched a particular query. Now, they show the results Google thinks are relevant to that particular search query.

This is of particular significance if you attempt to use numbers returned in single Google searches for statistical purposes. Unless you understand the context, the data is totally unreliable.


 10:00 pm on Nov 25, 2009 (gmt 0)

Good points..

I don't like seeing the site: operator drop the number of pages, but it's not like we haven't seen it before. Sometimes I think it's just a period when Google kind of refreshes the pages on a site. After a while sites get messy, and I wonder if it's just a clean-up. Typically, after several weeks of losing pages, the numbers start to climb again. I certainly hope that will be the case again this time, but it always makes me wonder, and it confuses me when I compare the two results.

I like your points -- there is usually something deeper going on than just saying one has more than the other, so one must be broken.

Receptional Andy

 10:07 pm on Nov 25, 2009 (gmt 0)

If you're curious about site: operator results, KrisE, I thoroughly recommend you look into them further. Try comparing the numbers for site: searches with the number for recursive site queries for subdirectories within a site.

Personally, while I like the data Google provide in Webmaster Tools, I take it with an extremely large pinch of salt.

In my experience, unless you are able to combine Google's data with another source to qualify the data, the reliability is just not sufficient to look at statistical trends, even in aggregate and over a large period.
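
The subdirectory comparison suggested above is easy to sketch in code. The counts and paths here are hypothetical placeholders -- in practice you'd collect them by hand from separate site: searches (site:example.com, site:example.com/blog/, and so on):

```python
# Hypothetical estimates gathered from individual site: searches
overall = 22400
subdirectory_counts = {
    "/blog/": 9800,
    "/products/": 12100,
    "/support/": 3400,
}

def subdirectory_gap(overall_count, per_dir):
    """Compare the whole-site site: estimate with the sum of the
    per-subdirectory estimates. If these were real index counts the gap
    would be ~0; a large mismatch shows they are per-query estimates."""
    return sum(per_dir.values()) - overall_count

print(subdirectory_gap(overall, subdirectory_counts))  # 2900
```

With these placeholder numbers the subdirectories add up to 25,300 against an overall 22,400 -- exactly the kind of impossible arithmetic KrisE reports below, which is the point of the exercise.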


 2:09 am on Nov 26, 2009 (gmt 0)

Spent the rest of the day really comparing against overall vs subdirectories.

The numbers are amazing... if I add up all the subdirectories, the total comes out higher than the overall count.

I see what you are saying... Best to take it with "an extremely large pinch of salt". ;)

Thanks for the advice.


 2:31 am on Nov 26, 2009 (gmt 0)

I wouldn't (don't) trust either number, and my tongue-in-cheek theory on the dropping page count from the site: operator is that they're getting closer to giving you 'the one right answer' -- eventually it'll just say 'yes' or 'no'...


 2:45 am on Nov 26, 2009 (gmt 0)

"eventually it'll just say 'yes' or 'no'..."

Oh man... I really needed that chuckle... Scary at the same time.


 3:59 am on Dec 2, 2009 (gmt 0)

For weeks now, the site: operator count when checked from home has been about 14M pages. From work (a different top-level ISP) it's about 9M pages. So that's the first annoyance -- different ISPs always connecting to different Google datacenters.

(and the new Google interface with the blue buttons doesn't even display a count of URLs - a sign of things to come?)

The WMT count is way too low. I think all those tools are broken.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved