Forum Moderators: Robert Charlton & goodroi
Our traffic has also dropped, but the time of year is not good in our industry, so we sort of expected it. However, I have now noticed that some of our high-ranking pages are dropping from the index.
Some of our pages were excessively heavy (250 KB), so we have now addressed this, because we considered that Google may be allocating space in its coffers according to the PageRank of the site. If our pages were bloated and used up our allocation, then tough luck for having heavy pages.
Is this theory a possibility?
We still have lots of well-ranked pages which are versions of the same page (each with a different product).
Our pages are roughly 60 KB now.
Thanks for any help
One thing to keep in mind is that Google often shuffles its back end as it moves data around, preparing new algo factors and new infrastructure configurations. The site: numbers are almost never accurate for sites of any size - they say "about", and the numbers can shift dramatically as you drill into the deeper results pages of a site: operator query.
Another factor you can notice even on some small sites is that Deeper Site: Queries Can Return More URLs [webmasterworld.com].
One thing I'm hypothesising with our site is that the pages are too similar, so we're working on making each one as unique as possible.
Google still crawls the pages happily enough, and will index up to 30,000 or so, but regularly drops a load so the number indexed goes back to less than 1000 sometimes.
We've already added 'noindex' to the pages we don't consider unique enough for Google to like at present. We rely on user-added content for many pages, and sometimes they just don't add enough! Fingers crossed this is going to work - it only went in this week.
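For anyone following along, the 'noindex' mentioned above is a robots meta tag placed in each page's head section. This is a generic illustration of the tag, not the poster's exact markup:

```html
<!-- Tells compliant crawlers not to include this page in their index,
     while still allowing them to follow the links on it -->
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" (rather than "noindex, nofollow") keeps the thin pages out of the index while still letting link equity flow through them to the rest of the site.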
You say you have similar pages which show different products - is the rest of the content on those pages the same, apart from the product content? Because that's a little like our pages which get dropped.
The problem was that the pages were too similar (or possibly that there were just too many of them).
To solve it we've added 'noindex' to all pages we do not consider unique enough - i.e. those that do not have enough unique content. I've also blocked two sub-domains completely using robots.txt for the time being, while we're working on making those pages more unique.
When we free up those sub-domains again we will never let as many pages be available for indexing as we had before (200,000), only those which have sufficient user-generated content.
Having put the changes in and submitted a reconsideration request to Google, our homepage and our Christmas page (thank you, Google!) are back in the number 1 position for our required search term, but we're still working on the other pages and other search terms.
BBonanza - you mention that you code your page titles, meta data and headings to be different for each page, but so did we. I'm guessing your products still sit within what is effectively a template, with the same menu, website heading etc. on each page? Maybe you have to find some way to increase the unique content on your pages.