
Google News Archive Forum

    
Sandbox For a New Section On a Domain?
georgiek50
10+ Year Member
Msg#: 27250 posted 1:14 am on Dec 28, 2004 (gmt 0)

I have a website that does very well in Google for my target keywords. It's a content site, so a new page goes up almost every day and ranks very well within a week. Now I have put up a whole new, strictly affiliate section; it got indexed, but I have checked 25 pages of Google results and found nothing for my target keywords. Last month I did the same (a new site section) and Google is loving it and sending tons of traffic. Does it sound like this second section is sandboxed?
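A quick way to separate "indexed but not ranking" from "not crawled at all" is to compare a site: query scoped to the new section against the plain keyword query. Below is a minimal sketch (Python) that just builds the two search URLs for manual checking; the domain, section path, and keyword string are placeholder assumptions, not details from this thread.

from urllib.parse import urlencode

# Placeholder values -- substitute your own domain, section path, and keywords.
DOMAIN = "example.com"
NEW_SECTION = "/affiliate/"
KEYWORDS = "target keyword phrase"

def indexation_check_url() -> str:
    """site: query limited to the new section. Results here, but none for
    the keyword query, suggest the pages are indexed but not ranking."""
    return "http://www.google.com/search?" + urlencode({"q": "site:%s%s" % (DOMAIN, NEW_SECTION)})

def keyword_check_url() -> str:
    """The plain keyword query, for comparison."""
    return "http://www.google.com/search?" + urlencode({"q": KEYWORDS})

print("Indexation check:", indexation_check_url())
print("Keyword check:   ", keyword_check_url())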

 

McMohan
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27250 posted 6:19 pm on Dec 28, 2004 (gmt 0)

Google can't be random in choosing which sections/pages to sandbox/ignore. You mentioned you added an affiliate section; have you made sure you have unique content that is not the same as the site you are an affiliate of? It may also be interesting to add &filter=0 to the results URL and see what shows up.
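For reference, the &filter=0 suggestion means appending that parameter to a Google results URL so that duplicate/omitted-results filtering is switched off. A minimal sketch of building such a URL, assuming a plain google.com/search query string (the keyword text is a placeholder):

from urllib.parse import urlencode

def unfiltered_search_url(keywords: str) -> str:
    """Build a Google results URL with filter=0, i.e. duplicate-result
    filtering disabled. If the new section shows up only with filter=0,
    it is likely being suppressed as duplicate content rather than sandboxed."""
    return "http://www.google.com/search?" + urlencode({"q": keywords, "filter": "0"})

print(unfiltered_search_url("your target keywords"))  # placeholder query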

Mc

georgiek50
10+ Year Member
Msg#: 27250 posted 6:50 pm on Dec 28, 2004 (gmt 0)

The funny thing is that the "section" that is not unique (there are quite a few of these floating around the internet) ranked really well within 2 weeks, whereas this second "section", which is completely unique, seems to be sandboxed.

Imaster
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27250 posted 6:52 pm on Dec 28, 2004 (gmt 0)

georgiek50,

I have a site which is pretty old (3 years), and I am considering adding a large new section (around 100k content pages) to it, but I am currently holding off, fearing Google may put the entire site in the sandbox along with the new pages.

Has anyone done the same and found any positive result?

HayMeadows
10+ Year Member
Msg#: 27250 posted 7:02 pm on Dec 28, 2004 (gmt 0)

Does it sound like this second section is sandboxed?

Yes, that is what this sounds like. I've seen the same thing happen on other sites (just recently).

It COULD be, and let's only hope, that they have simply gone backwards in time and need to re-update.

georgiek50
10+ Year Member
Msg#: 27250 posted 7:37 pm on Dec 28, 2004 (gmt 0)

The whole site in the sandbox! Yikes! That sounds somewhat counter-productive for a search engine, unless they were planning on dumping organic results altogether and going the PPC route. The way I figure it (although unfortunately it's not the case), a 2-year-old domain that keeps getting more and more Google traffic should be viewed as offering a service to surfers when new content is added, regardless of its intent (money). Then again, that's just my wishful thinking... Google, you have won, I will fork over the cash for PPC!

conor
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27250 posted 7:56 pm on Dec 28, 2004 (gmt 0)

The 'sandbox' effect is only seen on new domains - not pages, not sub-directories, and not sub-domains.

steveb
WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 27250 posted 8:02 pm on Dec 28, 2004 (gmt 0)

"The 'Sand Box' effect is only seen on new domains - Not pages - No sub-directories and Not sub-domains"

False, and it has been false for months and months. The sandbox effect can be seen on established domains sometimes, and here could be yet another example.

lizardx
10+ Year Member
Msg#: 27250 posted 8:08 pm on Dec 28, 2004 (gmt 0)

I don't think the lag in adding new pages is the sandbox, although it may have some relation to it. I've seen this before; the lag comes from how long it takes the pages to get indexed, especially when there are a lot of them, more than a few hundred.

It looks to me like when Googlebot encounters a group of new pages, or a new section, it does something like phoning home and then has to wait for permission to start spidering it. The last time I added a chunk of pages, I watched Googlebot bounce against the new section for a few weeks without spidering it; finally it seemed to get the go-ahead from HQ and spidered it all in a day.
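That kind of pattern can be checked in ordinary server access logs by counting Googlebot requests that touch the new section, day by day. A minimal sketch, assuming an Apache-style combined log format and a hypothetical /new-section/ path; a serious check would also verify Googlebot via reverse DNS rather than trusting the user-agent string:

import re
from collections import Counter

LOG_FILE = "access.log"          # assumed Apache combined log format
NEW_SECTION = "/new-section/"    # hypothetical path of the new section

# host ident user [time] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE_RE = re.compile(r'\S+ \S+ \S+ \[([^\]]+)\] "\S+ (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

hits_per_day = Counter()
with open(LOG_FILE) as fh:
    for line in fh:
        m = LINE_RE.match(line)
        if not m:
            continue
        time_str, path, _status, user_agent = m.groups()
        # Count only Googlebot requests that hit the new section.
        if "Googlebot" in user_agent and path.startswith(NEW_SECTION):
            day = time_str.split(":", 1)[0]   # e.g. "28/Dec/2004"
            hits_per_day[day] += 1

# Simple per-day tally; a sudden jump marks the day spidering actually began.
for day, count in sorted(hits_per_day.items()):
    print(day, count)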

This is one reason I believe that Google still has a primary index with capacity issues, and that those issues have not yet been resolved. It's also why I think Google is playing a dangerous game: it is not, in fact, delivering fresh content consistently. Some sites can get their new blocks of pages spidered quickly, but it's not consistent, unlike a few years ago when all new content could get in very quickly. That's what made Google a market leader so fast... keep that in mind, Google; this is how you lose the game. You aren't immune to the rules.
