I had 20,300 pages showing for a site:www.example.com search yesterday and for the past month. Today it dropped to 509, but my traffic is still pretty constant. I normally get around 4,500-5,000 visitors to that site per day, and today I've already got 4,000.
So either Google doesn't account for even a small percentage of my traffic (which I doubt), or the way Google stores information about my site has changed; i.e. the 20,300 pages are still there, but Google will only tell me about 509 of them. As far as I can tell, the other pages have gone supplemental.
That resonated with something that I was talking about with the crawl/index team. internetheaven, was that post about the site in your profile, or a different site? Your post aligns exactly with one thing I've seen in a couple of ways. It would align even more if you were talking about a different site than the one in your profile. :) If you were talking about a different site, would you mind sending the site name to bostonpubcon2006 [at] gmail.com with the subject line of "crawlpages" and the name of your site, plus the handle "internetheaven"? I'd like to check the theory.
Just to give folks an update, we've been going through the feedback and noticed one thing. We've been refreshing some (but not all) of the supplemental results. One part of the supplemental indexing system didn't return any results for [site:domain.com] (that is, a site: search with no additional terms). So that would match with fewer results being reported for site: queries but traffic not changing much. The pages are available for queries matching the supplemental results, but just adding a term or stopword to site: wouldn't automatically access those supplemental results.
I'm checking with the crawl/index folks on whether this might factor into what people are seeing, and I should hear back later today or tomorrow. In the meantime, interested folks might want to check whether their search traffic has gone up or down by a major amount, and see if there are fewer or more supplemental results for a site: search on their domain. Since folks outside Google couldn't force the supplemental results to show up for site: queries, it took a crawl/index person to notice that fact based on the feedback we've gotten.
Anyone that wants to send more info along those lines to bostonpubcon2006 [at] gmail.com with the subject line "crawlpages" is welcome to. So you might send something like "I originally wrote about domain.com. I looked at my logs and haven't seen a major decrease in traffic; my traffic is about the same. I used to have about X% supplemental results, and now I hardly see any supplemental results with a site:domain.com query."
I've still got someone reading the bostonpubcon email alias, and I've worked with the Sitemaps team to exclude that as a factor. The crawl/index folks are reading portions of the feedback too; if there's more that I notice, I'll stop by to let you know.
[edited by: Brett_Tabke at 8:07 pm (utc) on May 8, 2006]
May as well optimise the site even more, I can hardly lose any more pages!
I have made some changes to the site to make sure printer-friendly pages are not cached and that I have a 1 URL = 1 page situation. I estimate that if Google were to crawl my site and list every page, it should be around 10,000 pages. January 2009 and I should be fully indexed.
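For anyone trying the same cleanup, one common way to keep printer-friendly duplicates out of the index and out of Google's cache is a robots meta tag on those pages. This is a sketch, not necessarily what the poster above did, and the /print/ path is hypothetical:

```html
<!-- On each printer-friendly page (hypothetical path: /print/article-123.html) -->
<!-- "noindex" keeps the page out of the index; "noarchive" suppresses the cached copy -->
<meta name="robots" content="noindex, noarchive">
```

Alternatively, if all the printer-friendly versions live under a single directory, a Disallow line for that directory in robots.txt keeps them from being crawled at all.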
Last night I visited the url from earlier in this thread
I sat and went through a fair few of the sites that people had mentioned and posted URLs for, just to have a look.
I had several reasons for doing this.
One was to see if there was anything obvious or easy to spot that they all had in common.
Out of the random ones I picked, over 80% did: a problem (IMO) that my site had until recently.
I posted in another thread that I was set to tinker with several sections of my site and see how it went. I did, and I've seen improvement.
Now, that could be coincidence, I know that. I could have changed parts and struck lucky at the same time by getting reindexed. Certain parts of my site are still the same as they were when those pages went missing, but a lot have changed; the percentage of changed pages is greater than the percentage of unchanged ones.
This was a "look" I took, not a study. I never took notes either but perhaps it's worth taking a look and comparing those mentioned sites with your own.
The one problem with this is that we all optimise our sites differently, and what I saw, others may not.
Another thing I must add: I'm not suggesting Google doesn't have issues, but at a time when sites and pages are MIA, anything is worth a look, and if Google does have problems, no one knows when they will be resolved.
Sorry I can't offer more or better advice, I wish I could.
So we can sum up your post like this:
"I noticed what might be causing problems for 80% of the sites"
"But I'm not going to share it with you so bad luck"
Sorry if it seemed that way; it wasn't meant to be. I was just suggesting others could take a look and see if they spot any common traits, as I did.
I was only sharing a thought that may benefit others.
Perhaps in future I should keep my thoughts to myself.
I really didn't mean the post to read like "I'm alright, Jack..."
The website that is unaffected has thousands of pages and went online in 2001. However, I have not updated it on a regular basis for several months. Its format is very basic. It is still #1 for our keyword, and when I do a site: search, all 10 pages of results are non-supplemental. This has been the case for many months; there was no improvement or change at all in its positioning or indexing.
My other website has been drastically affected. It went online in November of 2005. I update it several times a week. I have been experimenting with .css and am not yet good at it; therefore its coding is not as clean. Pre-Big Daddy, its position for our keyword had reached #12. Now, it radically changes daily, up and down, and completely unpredictably - it doesn't rise every day or fall every day. One day it is #33, one day it is #72, the next day it is #55 - you get the idea. Initially after Big Daddy, 4 pages were non-supplemental out of over 600 total. After a couple weeks, that had gone up to 12 - all of which were older versions (one of which had actually been deleted). A site: check today reveals that the site now has only 7 pages that are non-supplemental. Again, this is completely unpredictable, and I have done nothing differently on the pages that are indexed than I have on the pages that are supplemental.
I'm afraid I have no answers, just these strikingly different examples to contribute to the pot.
[edited by: jatar_k at 3:40 pm (utc) on May 16, 2006]
P.S. The older sites that I monitor are still crammed with pointless, repeating keywords ... and they are still getting #1 listings for massive keywords (250 million+ competing results).
Oh well, whaddya gonna do?
Another thought: if Google is having problems indexing a site's pages, wouldn't that affect the SERPs? A drop in the number of indexed pages, plus lost backlinks, would thereby cause a SERPs change.
My theory is that if you made major changes to your site during last year's Google update madness, then you're likely to get hit with supplementals etc. If your site has remained unchanged for many years, no matter how much spam is in it, it will probably stay where it was.
I disagree with this, and have proof. I completely redid the URLs on two sites at the beginning of March, using the same format on both. One of the sites has been re-indexed correctly and is getting more traffic. The second has dropped off, is mostly supplemental, and gets about 20% of the traffic it used to.
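When URLs are redone wholesale like this, one thing worth ruling out is whether every old URL 301-redirects to its new counterpart: a site where they do tends to carry its indexing and backlinks across, while one where the old URLs simply 404 can slide into supplemental results. A minimal Apache .htaccess sketch follows; the URL pattern is hypothetical, not the poster's actual format:

```apache
# Hypothetical example: permanently redirect /page.php?id=123 to /page-123.html
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^page\.php$ /page-%1.html? [R=301,L]
```

The trailing ? on the target strips the old query string, and R=301 tells crawlers the move is permanent rather than temporary.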