Forum Moderators: open
So... is the following a possible workaround?
Say you needed to introduce 2,000 new pages to a site all at once. Could you post all the pages at once, *but* put, say, 1,700 of them in a directory that is not spiderable? That way Google would only see 300 new pages at first. Then you could "roll out" the other pages systematically over time by moving them into a spiderable directory.
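For what it's worth, the usual way to make a directory non-spiderable is a Disallow rule in robots.txt at the site root. A minimal sketch, assuming the held-back pages sit under a directory called /staging/ (the directory name is just an example, not anything from the site in question):

```
# robots.txt at the site root
# Well-behaved crawlers (Googlebot included) should skip anything under /staging/
User-agent: *
Disallow: /staging/
```

As pages are "rolled out," you would move them out of /staging/ into a directory that robots.txt does not block, so spiders can reach them through normal links.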
Alternatively, would it be better just to use JavaScript links to these pages so Google wouldn't find them?
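The idea there, as I understand it, is that a link written out by script doesn't appear as a plain `<a href>` in the raw HTML, so a spider that doesn't execute JavaScript won't follow it. A rough sketch of what that might look like (the page name is just a placeholder):

```html
<!-- The link only exists after the script runs; a crawler reading the raw
     HTML source sees no <a href> to follow -->
<script type="text/javascript">
  document.write('<a href="/pages/example-page.html">Example page</a>');
</script>
```

Of course this only "hides" the pages to the extent that spiders don't run scripts, and it hides them from non-JavaScript visitors too, which the robots.txt approach doesn't.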
I know it would probably be best not to introduce so many pages at once, period, but it's all or nothing with this project. The pages are all somewhat interdependent, and having only a portion of them available to the surfer at any one time is pretty much useless.
Hopefully I have explained it well enough to get some input :-/
Thanks all.
Therefore, it strikes me that a better course of action for you is to dump all the pages at once, realizing that it will be three months or so before Google may remove those pages from the sandbox. The overall traffic effect will probably be the same as trickling them out a few pages at a time.
More of an educated guess than sound advice, so take it for whatever the other posters here deem it to be worth.