I have a site which is fairly old (three years) and am considering adding a large new section (around 100k content pages), but I'm currently holding off on doing so for fear that Google may put the entire site in the sandbox along with the new pages.
Has anyone done the same and seen a positive result?
Does it sound like this second section is sandboxed?
Yes, that is what this sounds like. I've seen the same thing happen on other sites just recently.
COULD be, and let's only hope, that they have simply gone backwards in time and need to re-update their index.
False, and it has been false for months and months. The sandbox effect can sometimes be seen on established domains, and this could be yet another example of it.
It looks to me like when Googlebot encounters a group of new pages, or a new section, it does something like phoning home and then has to wait for permission to start spidering it. The last time I added a chunk of pages, I watched Googlebot bounce against the new section for a few weeks without spidering it; finally it seemed to get the go-ahead from HQ and spidered it all in a day.
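If you want to watch for this kind of pattern yourself, you can tally Googlebot requests per site section from your server access logs. A minimal sketch, assuming a combined-format log and a crude user-agent string match (the URL prefixes and sample log lines here are hypothetical, not from the site discussed above):

```python
# Count Googlebot hits under each URL prefix, to see whether a new
# section is actually being spidered or merely "bounced against".
import re
from collections import Counter

# Matches the request line inside a combined-format access log entry.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+"')

def googlebot_hits_by_section(lines, prefixes):
    """Tally Googlebot requests falling under each URL prefix."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:  # crude user-agent check
            continue
        m = LOG_LINE.search(line)
        if not m:
            continue
        path = m.group("path")
        for prefix in prefixes:
            if path.startswith(prefix):
                counts[prefix] += 1
                break
    return counts

# Hypothetical sample log lines for illustration only.
sample = [
    '66.249.66.1 - - [10/Oct/2005:13:55:36 -0700] "GET /new-section/page1.html HTTP/1.1" 200 2326 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2005:13:55:40 -0700] "GET /old/index.html HTTP/1.1" 200 1200 "-" "Googlebot/2.1"',
    '192.0.2.5 - - [10/Oct/2005:13:56:00 -0700] "GET /new-section/page2.html HTTP/1.1" 200 900 "-" "Mozilla/5.0"',
]
print(googlebot_hits_by_section(sample, ["/new-section/", "/old/"]))
```

Run daily against the previous day's log and a flat count for the new section, followed by a sudden jump, would match the "waiting for the go-ahead" behaviour described above.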
This is one reason I believe that Google still has a primary index with capacity issues, and that those issues have not yet been resolved. It's also why I think Google is playing a dangerous game: it is not in fact delivering fresh content consistently. Some sites can get their new blocks of pages spidered quickly, but it's not consistent, unlike a few years ago, when all new content could get in very quickly. That's what made Google a market leader so quickly .... keep that in mind, Google; this is how you lose the game, you aren't immune to the rules.