Hi All. I have a site (a WordPress blog) that is about 3 months old. From the start it was set up with robots.txt and the usual recommended SEO to prevent duplicate content etc. Nothing black-hat.
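For context, the kind of robots.txt typically recommended for WordPress blogs to block duplicate archive and feed URLs looks something like this (the paths below are an illustration of the usual advice, not the poster's actual file):

```
User-agent: *
# Block admin and core files
Disallow: /wp-admin/
Disallow: /wp-includes/
# Block feeds and trackbacks, which duplicate post content
Disallow: /feed/
Disallow: /trackback/
# Block date/category/tag archives, which repeat post excerpts
Disallow: /category/
Disallow: /tag/
```

The idea is that each post should be reachable at one canonical URL, so the archive pages that repeat the same content get excluded from crawling.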
No doubt it is still in the Google sandbox, and before the supplemental tags were removed last week, all but about 50 of the 330 indexed pages were supplemental.
I've noticed that for some reason, Google won't index more than 330 pages (there are 400 posts on the blog so far). In fact, it indexes the new pages daily and lets them sit in the main index for a few days before dropping them back into the supplemental index.
That's not so unusual in itself for a new site, but what gets me is that it maintains 325-330 pages indexed, so as it indexes a new page it drops one of the previously indexed pages into a black hole... The way it's going suggests that I could have 1,000 posts but still only have 330-odd indexed.
I've seen something similar on a directory-type website of mine, with lots of pages that had only a single link to them. Google started indexing all the pages (after a while) when we added newsletters we'd sent out to the site. I suppose the extra links and context in the newsletter pages convinced Google.
Thanks for the feedback. The inbound links are building, and there's no doubting that the pages that aren't supplemental have more links pointing to them.
The bit I still don't understand is why, when I post a new article, Google will ditch an old one and refuse to take the total indexed pages (normal or supplemental) higher than 330. I would have thought it would keep building the indexed count even if most of the pages were supplemental, but it just won't go past 330...