| 6:19 pm on Dec 28, 2004 (gmt 0)|
Google can't be random in choosing which sections/pages to sandbox or ignore. You mentioned you added an affiliate section; have you made sure it has unique content that is not the same as the site you are an affiliate of? Adding &filter=0 to the URL may be interesting to see.
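For anyone wanting to try the &filter=0 trick, here is a minimal sketch of building a Google results URL with and without that parameter. The query string is hypothetical; substitute your own search terms.

```python
# Sketch: build a Google search URL, optionally appending the
# "filter=0" parameter mentioned above to disable duplicate filtering.
from urllib.parse import urlencode

def google_url(query, filtered=True):
    """Return a Google search URL; filter=0 shows otherwise-filtered results."""
    params = {"q": query}
    if not filtered:
        params["filter"] = "0"  # the parameter discussed in this thread
    return "https://www.google.com/search?" + urlencode(params)

print(google_url("widget reviews", filtered=False))
# -> https://www.google.com/search?q=widget+reviews&filter=0
```

Comparing the filtered and unfiltered result sets can hint at whether pages are being suppressed as duplicates rather than sandboxed.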
| 6:50 pm on Dec 28, 2004 (gmt 0)|
The funny thing is that the "section" that is not unique (there are quite a few of these floating around the internet) ranked really well within 2 weeks, whereas this second "section", which is completely unique, seems to be sandboxed.
| 6:52 pm on Dec 28, 2004 (gmt 0)|
I have a site which is pretty old (3 years) and am considering adding a large new section (around 100k content pages) to it, but am currently holding off on doing so, fearing Google may put the entire site in the sandbox along with the new pages.
Has anyone done the same and found any positive result?
| 7:02 pm on Dec 28, 2004 (gmt 0)|
|Does it sound like this second section is sandboxed? |
Yes that is what this sounds like. I've seen the same thing happen on other sites (just recently).
COULD be, and let's only hope, that they have simply gone backwards in time and need to re-update.
| 7:37 pm on Dec 28, 2004 (gmt 0)|
The whole site in the sandbox! Yikes! That sounds somewhat counter-productive for a search engine, unless they were planning on dumping organic results altogether and heading down the PPC route. The way I figure it (although unfortunately it's not the case), a 2-year-old domain that keeps getting more and more Google traffic should be viewed as offering a service to surfers when new content is added, regardless of its intent (money). Then again, that's just my wishful thinking... Google, you have won; I will fork over the cash for PPC!
| 7:56 pm on Dec 28, 2004 (gmt 0)|
The 'Sand Box' effect is only seen on new domains - Not pages - No sub-directories and Not sub-domains
| 8:02 pm on Dec 28, 2004 (gmt 0)|
"The 'Sand Box' effect is only seen on new domains - Not pages - No sub-directories and Not sub-domains"
False, and it has been false for months and months. The sandbox effect can be seen on established domains sometimes, and here could be yet another example.
| 8:08 pm on Dec 28, 2004 (gmt 0)|
I don't think the lag in adding new pages is the sandbox, although it may have some relation to it. I've seen this before: the lag comes from how long it takes the pages to get indexed, especially when there are a lot of them, more than a few hundred.
It looks to me like when Googlebot encounters a group of new pages, or a new section, it does something like phoning home and then has to wait for permission to start spidering it. The last time I added a chunk of pages, I watched Googlebot bounce against the new section for a few weeks without spidering it; finally it seemed to get the go-ahead from HQ and spidered it all in a day.
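One way to watch for that "bounce, then full spider" pattern yourself is to tally Googlebot hits per day against the new section in your access logs. The sketch below assumes a common combined-log-format line and a hypothetical "/newsection/" path; adjust both for your own server.

```python
# Sketch: count Googlebot requests per day for a new site section,
# so a sudden jump in daily hits (the full spidering) stands out.
# The log lines and "/newsection/" path are illustrative assumptions.
import re
from collections import Counter

sample_log = """\
66.249.64.1 - - [20/Dec/2004:10:02:11 +0000] "GET /newsection/page1.html HTTP/1.1" 200 5120 "-" "Googlebot/2.1"
66.249.64.1 - - [21/Dec/2004:11:15:42 +0000] "GET /newsection/page1.html HTTP/1.1" 200 5120 "-" "Googlebot/2.1"
66.249.64.2 - - [27/Dec/2004:09:01:05 +0000] "GET /newsection/page2.html HTTP/1.1" 200 4096 "-" "Googlebot/2.1"
66.249.64.2 - - [27/Dec/2004:09:01:09 +0000] "GET /newsection/page3.html HTTP/1.1" 200 4096 "-" "Googlebot/2.1"
"""

# Capture the date (dd/Mon/yyyy) and any request path under /newsection/.
pattern = re.compile(r'\[(\d{2}/\w{3}/\d{4}):.*?"GET (/newsection/\S*)')

hits = Counter()
for line in sample_log.splitlines():
    if "Googlebot" not in line:
        continue  # only count the Google crawler
    m = pattern.search(line)
    if m:
        hits[m.group(1)] += 1

for day, count in sorted(hits.items()):
    print(day, count)
```

Run against real logs (replace `sample_log` with the file contents), a long run of one-or-two-hit days followed by a spike would match the behavior described above.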
This is one reason I believe that Google still has a primary index with capacity issues, and that those issues have not yet been resolved. It's also why I think Google is playing a dangerous game: it is not, in fact, delivering fresh content consistently. Some sites can get their new blocks of pages spidered quickly, but it's not consistent, unlike a few years ago, when all new content could get in very quickly. That's what made Google a market leader so quickly... keep that in mind, Google: this is how you lose the game, you aren't immune to the rules.