So today I get 860 results indexed for site:example.com.
Now I check site:www.example.com and get over 7,000 results, which is much closer to the actual number.
All my URLs are of the form www.example.com/whatever; none of them is missing the www subdomain.
How is this possible? How come the site: command returns fewer results without the www subdomain? What should this tell me?
[edited by: tedster at 1:15 am (utc) on July 25, 2009]
[edit reason] switch to example.com - it cannot be owned [/edit]
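Before reading too much into either site: count, it helps to confirm from your own data how many URLs you actually publish on the www host. This is a minimal, stdlib-only Python sketch (not from the thread; the sitemap content below is an inline placeholder standing in for your real sitemap.xml) that counts total URLs and how many live on the www subdomain:

```python
import xml.etree.ElementTree as ET

# Placeholder sitemap content; in practice, load your site's real sitemap.xml.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/whatever</loc></url>
  <url><loc>http://www.example.com/other</loc></url>
</urlset>"""

# Sitemaps use a default XML namespace, so queries need an explicit prefix.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_hosts(sitemap_xml):
    """Return (total URL count, count of URLs on the www host)."""
    root = ET.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.findall(".//sm:loc", NS)]
    www = sum(1 for u in locs
              if u.startswith(("http://www.", "https://www.")))
    return len(locs), www

total, www = count_hosts(SITEMAP)
print(f"{www} of {total} sitemap URLs are on the www host")
```

If every URL in your sitemap is on www (as the poster says), then the low site:example.com count reflects something on Google's side rather than a problem with your URLs.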
Google is doing a better job with the private Webmaster Tools data, at least a good bit of the time. That reporting is behind a log-in, where only the validated site owner can have (or intentionally share) access - safely tucked away from competitors and anyone with malicious intentions.
However, even Webmaster Tools reporting gets buggy when Google makes significant changes to their back-end infrastructure (which seems to have happened this month).
I think this particular bug is fascinating - it might even tell us something about Google's back-end data storage. But we have so little detail about that infrastructure that it's almost impossible to extrapolate a useful insight as things stand.
My guess would be that a chunk of information became inaccessible to the site: operator when they changed something on the back end. It's even possible that a chunk of data went missing while they were moving it around between data centers - that's happened before. What kind of data that is would be interesting to know, but I can only make wild guesses, so I won't.