site:example.com returns 80 results
site:example.com/ returns 80 results
site:www.example.com returns 78 results
site:www.example.com/ returns 83 results
site:http://www.example.com returns 81 results
site:http://www.example.com/ returns 80 results
Each set of results always showed a "results have been filtered" message on the last page. Now if I add "&filter=0" to the Google URL I get:
site:example.com returns 80 results, but on the last SERPS page states 79
site:example.com/ returns 20 results
site:www.example.com returns 78 results, but on the last SERPS page states 76
site:www.example.com/ returns 80 results, but on the last SERPS page states 78
site:http://www.example.com returns 78 results, but on the last SERPS page states 76
site:http://www.example.com/ returns 80 results, but on the last SERPS page states 79
and if you try checking folders they vary even more. But ALL of the URLs showing in the results have the www. at the start. I use an .htaccess 301 redirect from the non-www to the www version.
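For reference, a non-www to www 301 redirect in .htaccess typically looks something like the following (a minimal sketch using Apache's mod_rewrite; the domain is a placeholder and your actual rules may differ):

```apache
# Redirect all non-www requests to the www host with a permanent (301) redirect.
# Requires mod_rewrite to be enabled; example.com is a placeholder domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```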
I used my smallest site as an example, as I noticed the issue when doing a site:webmasterworld.com search. The variances for WW are huge, but I didn't know if they were just Google's guesstimate variances in the SERPs. At such a low number of pages I was able to see every page that Google was showing.
Any thoughts as to whether it's a bug or I've missed something would be appreciated.
It still concerns me, as just today I noticed the search with the prefixed space returns 150 results, whereas without the space returns over 5000 results (which is what I expect).
I do perform this search with the space quite a bit as it's a bookmark of mine, so maybe there's some sort of a penalty occurring.
When you are getting an estimated number, that is indicated by the word "about". The site: search is a reporting function, and not as precise as it once was, by the way. For instance site:example.com/directory/ can return urls that are not returned from site:example.com
zaqwsx3, that's an interesting quirk - I never noticed it before and I can't reproduce it. I'd say it's just a technical quirk and you should fix the bookmark. There's no reason I can see for Google to apply a penalty just to a reporting function such as the site: operator.
The 301 redirect should mean that all your content is indexed using the www in the URL as the canonical domain.
Do a site:domain.com -inurl:www search to see if there are any stray non-www pages still indexed. If there are, Google will eventually delist them anyway. However, while they still exist in the SERPs they will still rank, and deliver visitors (and your redirect will silently bounce them over to the correct URL for the content).
Returned zero results.
The 301 has been up for two months and before that only the main page was indexed because there were no other pages.
All pages have <base> tags with the full www. version of the URL.
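A `<base>` tag pointing at the www host would look something like this (a hypothetical snippet; the domain is a placeholder):

```html
<head>
  <!-- Absolute base URL using the www host, so relative links resolve there -->
  <base href="http://www.example.com/">
</head>
```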
When you are getting an estimated number, that is indicated by the word "about". The site: search is a reporting function, and not as precise as it once was, by the way.
That is why I test it on a site with only just over a hundred pages. It's not really an estimate at that level: you can manually count the links, and Google says "about 79 pages" when there are exactly 79 pages showing in the results.