Sandboxed Sites - Back Together?

Do they come out together or one by one?


McMohan

10:09 am on Nov 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the new sites that I work with are still in the sandbox. I was just curious to know: do all the sandboxed sites come out of the sandbox during one fine major update, or one by one, over the rolling updates?

That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)

Thanks

Mc

Small Website Guy

2:59 pm on Nov 23, 2004 (gmt 0)

10+ Year Member



Our experience is that new pages on existing sites do not exhibit the so-called sandbox behavior.

This is absolutely correct. I've added new pages to a site that a day later pulled in hundreds of hits a day. (Easy to do with good page rank plus a current events topic.)

BeeDeeDubbleU

4:12 pm on Nov 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Existing spammers were filtered by the C-class IP address penalty or were sorted out manually, I suppose. The typical spammer built a network of sites on generic domain names, cross-linked them like crazy, enjoyed the traffic as long as possible, got caught, was penalized, moved on and started from scratch somewhere else.

I was not just talking about spammers. Any legit site (like my main site) can add new pages and get them ranked within a few days. You can see that from the posts above, and I am working on another new page right now.

Sorry, but I just cannot subscribe to any suggestion that this lag is deliberate. It makes no sense at all to ban all new sites from the SERPs. Had this been a spam prevention measure, some comment from Google would have leaked out by now. Believe me, we'll only start to find out what's going on when the press get hold of it.

steveb

5:39 pm on Nov 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"New pages on old sites get ranked quickly. This is a fact AFAIAC."

This is not a fact. Try to create a new page to rank for a search term where you already have a page ranking decently. It is difficult to get a new page to outrank a more mature page, even if the new page obviously should, like a page about Portland being outranked by an Oregon page for a "Portland" search, where the new page has more accurate/better anchor text, etc.

Spine

5:49 pm on Nov 23, 2004 (gmt 0)

10+ Year Member



I was able to add a couple of pages every few days and have them ranking within a few more days on an established site, until the site disappeared.

dvduval

5:53 pm on Nov 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, but if I create a new page on an old site and a new site, the old site has the advantage every time.

I almost see the battle between conservatism and progressivism on Google! But one day someone will come along and say, out with the old and in with the new! Google is no longer the New New, and the sandbox just further enhances this notion.

Scarecrow

5:58 pm on Nov 23, 2004 (gmt 0)

10+ Year Member



If a page has never had a PageRank before, it can be defined as a new page.

If it is not a root page, then flag it and defer the ranking until the root page for that domain has been ranked. After the root page is ranked, give the new page a PageRank of root page minus one or two.

If the root page itself has never had a PageRank before, start it out with a "new root page" PageRank that seems reasonable, but is independent of its backlinks. The next time around it won't be new, and can start growing its "natural" PageRank if it has sufficient backlinks.

This isn't so exotic. In the old days Google used to assign a PageRank of root minus one (according to the toolbar) for every directory level deep at which a new page was found on an old domain. That would work between updates. Then at the next monthly update it would acquire a more accurate PageRank.
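Scarecrow's three rules could be sketched roughly like this. This is purely illustrative: the `Page` type, the seed value of 4, and the penalty of 1 are my assumptions, not anything Google published.

```python
from dataclasses import dataclass

# Illustrative sketch of Scarecrow's heuristic. The Page type and the
# constants below are assumptions for the sake of the example.

@dataclass
class Page:
    url: str        # full URL of the page
    root_url: str   # URL of the domain's root page
    is_root: bool   # True if this page *is* the root page

DEFAULT_NEW_ROOT_PR = 4  # assumed "reasonable" seed for a brand-new root page
NEW_PAGE_PENALTY = 1     # root PR minus one (or two) for new inner pages

def assign_pagerank(page, known_pr):
    """Return a provisional PageRank for page, or None to defer.

    known_pr maps URLs to PageRank values from the previous update;
    a URL absent from the map counts as "never ranked before", i.e. new.
    """
    if page.url in known_pr:            # not new: keep the computed PR
        return known_pr[page.url]
    if page.is_root:                    # new root page: fixed seed value,
        return DEFAULT_NEW_ROOT_PR      # independent of its backlinks
    root_pr = known_pr.get(page.root_url)
    if root_pr is None:                 # flag and defer until the root
        return None                     # page has been ranked
    return max(root_pr - NEW_PAGE_PENALTY, 0)
```

The key property is that a new page can never leapfrog its own root: its rank is capped by the root's established rank until it has earned a "natural" PageRank of its own at a later update.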

tomasz

6:23 pm on Nov 23, 2004 (gmt 0)

10+ Year Member



My theory is G is serving pages based on a six-month moving average of page PR, similar to the Alexa ranking..
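For what it's worth, that moving-average theory would produce sandbox-like behavior on its own. A sketch (illustrative only; the zero-padding for months before a page existed is my assumption about how a brand-new page would be handled):

```python
def moving_average_pr(pr_history, window=6):
    """Average the last `window` monthly PageRank snapshots.

    Months before the page existed count as zero, so a new page's
    score is dragged down by its short history even if its current
    PR is high -- which would look exactly like a "sandbox" delay
    that fades as the history fills in.
    """
    padded = ([0] * window + list(pr_history))[-window:]
    return sum(padded) / window
```

Under this model a page two months old with a solid PR of 5 would still score well under 2, while any page with six or more months of history scores its true average.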

randle

7:30 pm on Nov 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My fascination with the sandbox is how perfectly subtle it is. We constructed our sandboxed sites in the same manner as other successful ones; same linking methods, submissions to the same places and directories. They get indexed, are granted PR, receive regular spidering and have good fresh cache dates.

If you search for the sites using various commands like allintitle, etc., there they are, looking perfect and right near the top; title just right, snippet right on the money, fresh tag from two days ago. Everything is exactly as it should be, except they just can’t get anywhere for the keywords you are optimizing for. (I don’t know the definition of “competitive”, but our sites are chasing terms that produce results from 3 to 8 million.)

Now I know there are pundits on this board who say there is no way around this thing because it doesn’t exist in the first place. And there are other wise men who say they can get out of it with a little extra hard work and smarts. All that may be true, and my hat’s off to you, but it doesn’t change the fact that an awful lot of people launched sites eight months ago, in the same fashion they always have, that don’t even show on the radar screen for the terms they were designed for.

It is absolutely the most bizarre thing we have encountered in this business. The real problem with it is, what do you even say about it? “Gee whiz, we don’t rank for this competitive keyword as well as we should, darn engine must be broken, can’t be our fault.” Or how about, “sorry sir, but your site is due to break the first 1,000 places hopefully in about 10 months, uh, we think.” Other than the kind people on this board allowing sandbox sufferers like ourselves to rant a little, there’s no one to share this little problem at work with; “Hey, how’s everything at work?” “Uh, great, except for this thing they call the sandbox, it’s, uh... well it’s... oh never mind, everything’s great, and you?”

I am a fan of Google; always have been and probably always will be. We have done well in this business and Google has been a big part of that, so no bashing here. The thing that’s really beginning to bother me is a growing fear that when we find out what is causing this thing, it’s going to be something so incredibly obvious that we just won’t ever get over the fact we couldn’t figure it out.

wanna_learn

8:27 pm on Nov 23, 2004 (gmt 0)

10+ Year Member



Let Google estimate how badly the SEOs are pissed off, and the impact on Adwords revenues.

Talking of commercial websites...

The question that Google would have asked itself is: "why would someone (online/offline) build more than one website for the SAME business?"

Answer - Just to play with SEO and acquire top positions with a variety of websites in the portfolio attacking a variety of keywords.

Counter action - The genuine single-biz, single-website owners would also not be encouraged for a few months (along with the build-new-website-crazy webmasters), to see if both groups start getting used to Adwords.

Result - Still to come

dvduval

8:35 pm on Nov 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The idea that the sandbox helps boost Adwords revenue has been mentioned before, and I believe this is the most probable reason.

If it were a case of the search engine being broken, then why would they double the size of their index?

The fact that no Google rep will make a statement about it tells us a lot:
1) It must exist, because if it didn't, they could easily tell us.
2) The reason for its existence is not something they want to tell the public, because we would likely not like the answer.

Meanwhile, other engines are working on beating Google, and working on their image among webmasters. 2005 is going to be very interesting.

This 472 message thread spans 48 pages.