
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
What's the reasoning behind the sandbox?
Why would Google want to have it?
koen
msg:764956 · 1:30 pm on Feb 17, 2006 (gmt 0)

I've done some tests creating clean sites (with a few pages of content) and positioning them for keywords. Two sites, and one page from another site, have been sandboxed from day one. It is really annoying, and I can't see the reasoning behind it. If the point were to discourage spam sites, they should update their algo to handle spam sites, not clean sites. I remember Matt Cutts saying somewhere on his blog that they looked at the sandbox effect and liked what they saw. So they endorse it. But why? What are Google's benefits from sandboxing sites?

 

jadebox
msg:764957 · 3:35 pm on Feb 17, 2006 (gmt 0)

I don't think there's a "sandbox" as it's usually defined.

I think it's just a natural part of the algorithm Google uses. New pages tend to get a boost in ranking if they match popular search terms because they may be topical (such as news stories about a recent event). Then the pages drop in SERP rank quickly (as the "topical" boost is removed). If they are still of interest to surfers, they'll tend to rise again in the future due to incoming links and other things Google uses to determine relevance.

New pages which don't match popular search terms don't get the initial boost and still have to wait for incoming links and other things to improve their ranking.

-- Roger

koen
msg:764958 · 4:36 pm on Feb 17, 2006 (gmt 0)

In my case we are not talking about moving up in the rankings, but about getting a ranking in the first place. The two sites and my one page are indexed in Google, but when I search for their obvious keywords they are nowhere. I'd rather they were on the last page of the rankings than not there at all.

truezeta
msg:764959 · 5:12 pm on Feb 17, 2006 (gmt 0)

I have the same problem. I don't understand the whole "sandbox" deal. My site is all content, and the only keywords that currently rank are my site name and authors' names. The only way people find my site on Google is through AdWords. However, I have cut back on using them because I've noticed that most of my impressions come from parked domains. It is very frustrating and makes no sense whatsoever. I can see how they would want to deter spammy sites, but for sites that are strictly informational, you'd think they would cut us a break! I'm not profiting from my site. "Sandbox" the sites that are strictly for profit (MFA), but what is the point of doing this to everyone else?

Sorry for the rant.

MHes
msg:764960 · 6:11 pm on Feb 17, 2006 (gmt 0)

The 'sandbox effect', which Google has acknowledged exists, is probably due to several tweaks in the algo. These may include an extended time for some links to pass their full PR (applicable to old and new sites), plus deeper analysis of the type of inbound links. There may be other factors, such as usage data from the toolbar or Alexa, which could help determine the quality of a site. Another possibility is that to achieve high rankings for competitive searches, the body content of a page needs to be counted and contributing 'points' in the ranking algorithm; I suspect that new sites have their body content pretty much ignored until the site earns sufficient status. These and many other factors need to be processed and evaluated before a site gets good, relevant rankings. The emphasis seems to be on 'natural' growth which, once Google is satisfied, will deem the site worthy of ranking. Apparently, if you present the site in the right way this can be a relatively short period, but personally I have yet to see proof of a site ranking well for many keyword phrases within six months.

"what's the reasoning?"
Discouraging webmasters from putting up a new site every week may not have been the intention but it probably has achieved this. The lack of 'instant gratification' is a big dampener for producing endless new sites. The real reason is probably an attempt to evaluate sites and rank them according to considerably more factors than in the past, with no one factor having a major importance. In other words, you have to achieve optimisation in many areas, whereas 2 years ago a few links and a good title did wonders. Now the algo has 200+ requirements. Many older sites will already have clocked up the necessary 'requirements' and hold their position, but new sites need to earn them. Some old sites hang in there but will drop away as new sites qualify for ranking. This takes time, especially if a 'natural' linking and growth is being looked for. It is as if Google now assumes all new sites are potential spam and thus treats them with caution. In the longterm, this will produce high rankings for those who have 'qualified' on many levels and not on only a few.

Google algos are simplistic, so achieving good ranking for a new site for any term (even a unique company name) is difficult. However, it can be done for one or two keywords if all the right ingredients are there and the keyword is not very competitive. But to rank well for competitive phrases, the onpage factors need to count and I wonder if they don't until off page factors have been satisfied.

Get the off page factors right, allow time for Google to process them, and then the onpage factors will kick in and qualify you for decent rankings... I think.... :)
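The delayed-PR idea above can be sketched as a toy dampening function. To be clear, this is pure reader-side speculation: the function, its `ramp_days` parameter, and every number below are invented for illustration, not anything Google has published.

```python
# Toy illustration only: a time-dampened link score, in the spirit of
# the speculation that links pass less than their full weight until a
# site has "earned" status. All values here are made up.

def dampened_link_score(raw_link_score: float, site_age_days: int,
                        ramp_days: int = 270) -> float:
    """Scale a site's inbound-link score by how long the site has existed.

    A brand-new site keeps only a small fraction of its raw score; the
    full score is passed once the site is older than ramp_days.
    """
    age_factor = min(site_age_days / ramp_days, 1.0)
    return raw_link_score * age_factor

# A new site with strong inbound links still scores low...
print(dampened_link_score(100.0, site_age_days=30))    # ~11.1
# ...while the same links on a ten-month-old site count in full.
print(dampened_link_score(100.0, site_age_days=300))   # 100.0
```

The shape matches what posters in this thread report: a new site can be indexed yet rank nowhere until enough time (and 'natural' growth) has accumulated.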

flobaby
msg:764961 · 8:45 pm on Feb 17, 2006 (gmt 0)

What are Google's benefits from sandboxing sites?

Lots and lots of money. The sandbox forces thousands of webmasters to rely on Adwords until their sites finally get ranked.

ulysee
msg:764962 · 12:56 am on Feb 18, 2006 (gmt 0)

It's probably to get rid of the "little guy", mom & pop type sites and to destroy the new and last vessel for the American dream.

Jimmers
msg:764963 · 4:16 am on Feb 18, 2006 (gmt 0)

Many older sites will already have clocked up the necessary 'requirements' and hold their position, but new sites need to earn them.

I can't see that statement as relevant, for a basic reason.

A scraper site relies on nothing more than content from everyone else's websites. Many MFA sites I look at have hardly anything in the way of keywords, and any description is typically devoid of any SEO work. That would mean these sites already passed the sniff test and were deemed credible, when in fact all they have are snippets of material on each and every page. Yet they continue to remain indexed and rank well for competitive searches, while the white-hat webmaster disappears, or is sandboxed, from sight.

Why does the sandbox exist? Good question. One of my sites was sandboxed, or whatever you prefer to call it. I did "nothing" different in the next six months except create exclusive informational articles. In the next update my site was back. The following update, gone again. I grew tired of chasing after Google traffic and building what "they" wanted. These days I only look at Google traffic as a bonus and build the pages for myself and the visitors.

MHes
msg:764964 · 8:39 am on Feb 18, 2006 (gmt 0)

Jimmers

> Many MFA sites I look at have hardly anything in the way of keywords, and any description is typically devoid of any SEO work.

So how do they happen to rank well? There will be a reason. These sites will continue to rank well until better sites that have passed the probation period displace them. However, the fact that they show little obvious SEO work is perhaps helping them! They are probably quite old and have many inbound links, good anchor text, etc. Having said that, many of these types of sites only rank well for a few isolated phrases which you happen to notice.

>That would mean that these sites already passed the sniff test and would be deemed as credible..

Although you don't like them, they may be a good resource, despite being scraper sites. After all, Google is a scraper site and is found quite useful :)

>Following update gone again.

Sounds like you have an old-fashioned ranking problem rather than the sandbox.

>These days I only look at Google traffic as a bonus and build the pages for myself and the visitors.

I sympathise, but turning your back on the biggest traffic source in the world is a big decision. In theory, if your site is built for visitors and is good quality, then it is exactly what Google wants to find and rank well. If I were in your shoes, I would want to discover why this is not happening. The reason must be that your site either has a technical problem (not spiderable, server downtime, canonical URL problem, duplication, etc.), is not attracting the inbound links it naturally should, or lacks content that clearly signals which keywords you are targeting. A good site for visitors is also one they can find... otherwise it's not a good site for visitors.

>what's the reasoning

It has to be to improve the quality of the index. I don't buy the theory that fewer fresh organic results will attract more AdWords customers. Google knows that people visit Google on the strength of its organic results, and without that it will lose visitors and thus revenue. By subjecting new sites to deeper analysis, they prevent new spam. Perhaps the answer to the question can be addressed by looking at:

1) How many new spammy sites have achieved high rankings? - Probably close to none.
2) How many older spammy sites have recently been removed from the index? - Well... err... a lot of mine ;)
3) What is the quality of the new sites that have achieved good traffic after 6 months? I bet it's quite good.

If Google maintains this approach, the index will get better over a few years. If they are going to be around for many years, now is the time to apply this tactic. In the meantime, yes, we have old spam still there and new quality sites not yet showing. This is the 'sandbox effect', but in reality it is the consequence of stricter algos for ranking and better algos for detecting spam.

Sparkys_Dad
msg:764965 · 9:49 am on Feb 18, 2006 (gmt 0)

Defend it if you will, but any way you look at it, it's "guilty until proven innocent."

Essentially Google is saying, the algo can't tell the difference between spam and quality, so we'll just put everyone in a holding pen until we get a better handle on what the site is really about. Besides, we have a hard enough time crawling the entire Web as it is, who needs new sites?

Has anyone else noticed that MSN is two to three times as fast at crawling new content? Meanwhile, Google puts up a "Big Daddy" smokescreen to sucker the mooks who would otherwise be asking, #*$! is wrong with Google?

"Big Daddy?" Hah, don't believe the hype, it's a sequel.

MHes
msg:764966 · 10:02 am on Feb 18, 2006 (gmt 0)

>"Big Daddy?" Hah, don't believe the hype, it's a sequel.

Completely agree with that. I am amazed how Joe Public seems to know about Big Daddy and talks as if Google has made a great advance.... great PR!

>...we'll just put everyone in a holding pen until we get a better handle on what the site is really about.

Yes, but that seems reasonable to me.

Sparkys_Dad
msg:764967 · 10:29 am on Feb 18, 2006 (gmt 0)

Yes, but that seems reasonable to me.

The low points of civilization have often been marked by societies giving tacit approval to (and thereby legitimizing) otherwise indefensible policies.

donovanh
msg:764968 · 2:50 pm on Feb 18, 2006 (gmt 0)

While I do believe many people are affected by this, I have not been able to replicate the "sandbox" effect.

I launched a new site at the beginning of February, and it's gone from zero to around 200 uniques a day in three weeks, ranking for about ten different, relevant phrases. I've only launched a few sites in the last year, but even with very minor link building they've been turning up in searches within weeks rather than months or years.

What is it that makes some sites susceptible to being sandboxed, while others aren't?

larryhatch
msg:764969 · 3:11 pm on Feb 18, 2006 (gmt 0)

"What is the reasoning behind the sandbox?"

None at all. One might as well ask the reasoning behind a traffic jam on the highway.

Is there anyone here who honestly thinks the "sandbox" (call it a gravel pit if you prefer) is some intelligently designed construct or algorithm to keep new pages out of the SERPs? Isn't it more likely that it takes time for the engines to find and index new pages, when they can scarcely keep up with the existing ones? -Larry

jd01
msg:764970 · 4:18 pm on Feb 18, 2006 (gmt 0)

Essentially Google is saying, the algo can't tell the difference between spam and quality

OR they are saying 'to more effectively determine the difference between a quality site and spam, it is better to wait until a pattern emerges...'

Which would mean: the period of time spent in the 'box' will vary from site to site, and until each site's pattern (my guess is 'as compared to other sites in the same niche') has enough validity to establish where a site should rank, it will remain lower in the index.

Some sites will establish a 'better' OR 'higher quality' pattern (than other closely related sites) and rank in a few days/weeks; others will take months/years to establish a level of 'acceptability' (as compared to other sites in a niche).

Justin

And yes, I'm silly enough to think they're doin it on purpose, cause the whole google thing is really pretty much just some math... I can see the logic of waiting for a pattern from here, and i aint too edju-muh-kated or even very smart.
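The wait-for-a-pattern idea in this post can also be sketched as a toy. Again, this is a hypothetical illustration, not Google's code: `min_obs`, `max_spread`, and the notion of a per-site list of quality "signals" are all invented for the sketch.

```python
# Toy sketch of "rank only once a pattern has emerged": hold a site
# back until it has enough quality observations AND they are consistent.
# All thresholds here are invented for illustration.

from statistics import mean, stdev

def pattern_established(signals: list[float], min_obs: int = 8,
                        max_spread: float = 0.15) -> bool:
    """Return True once there are enough observations and their relative
    spread (stdev / mean) is low, i.e. a stable pattern has emerged."""
    if len(signals) < min_obs:
        return False
    return stdev(signals) / (mean(signals) or 1.0) <= max_spread

# Too few observations: still "in the box".
print(pattern_established([0.9, 0.2, 0.8]))   # False
# Plenty of observations, but erratic: still held back.
print(pattern_established([0.1, 0.9] * 5))    # False
# A steady record over time: pattern established.
print(pattern_established([0.8] * 10))        # True
```

On this model the time spent sandboxed naturally varies from site to site, exactly as the post describes: sites with consistent signals qualify quickly, erratic ones wait.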

steveb
msg:764971 · 8:12 pm on Feb 18, 2006 (gmt 0)

The world has always had stereotypes. It is easier not to rent a furnished apartment to someone with kids than to consider whether those particular kids are responsible.

The majority of sites put online on any given day are garbage/template/duplicates. In theory it makes some sense to block all of them and then let the genuine sites in, but this process should take a month or so, not over a year.

Marval
msg:764972 · 8:24 pm on Feb 18, 2006 (gmt 0)

First off - I would love to see an actual reference to where Google ever acknowledged the existence of a sandbox, other than the one they have in AdWords for keywords.

Second - for those who weren't around when this "effect" started occurring, the best guess was that it was a domain-registration dampening, based on the huge number of expired DMOZ domains that were being bought back then to get automatic ranking and relevance. When it first started occurring it looked to us, and was discussed pretty extensively, like a small depression in a domain's ability to rank if it was bought after having been owned by someone else and expired. The kicker was that Google decided to apply to become a registrar around the same time (possibly so that they could legally use automated whois queries?), which kinda added credence to that theory.

As for most of the people I talk to on a regular basis who frequently launch new sites on new domains - they all agree that there is no effect seen with a clean, never-owned domain, and it's something I can back up with my own experience across at least 150 domains in the last year.

jdhuk
msg:764973 · 9:15 pm on Feb 18, 2006 (gmt 0)

As for most of the people I talk to on a regular basis who frequently launch new sites on new domains - they all agree that there is no effect seen with a clean, never-owned domain, and it's something I can back up with my own experience across at least 150 domains in the last year.

Marval, so you're saying that you and your friends/associates have not noticed any filtering effects for brand-new domains? If that is the case, then they are not targeting very competitive keywords.

Rollo
msg:764974 · 10:18 pm on Feb 18, 2006 (gmt 0)

Whether there is a "sandbox" or not is really irrelevant... it does seem to be taking longer and longer for new sites to rank in Google. It used to be three months... then six... now it's about a year.

Sparkys_Dad
msg:764975 · 4:23 am on Feb 19, 2006 (gmt 0)

First off - I would love to see an actual reference to where Google ever acknowledged the existence of a sandbox, other than the one they have in AdWords for keywords.

You are out of the loop. Matt Cutts has discussed it on more than one occasion (see the Webmaster Radio interview with Todd Friesen and Greg Boser), although he refers to it as an "effect." Same difference in my book. He will, no doubt, be asked about it in NY later this month.

Second - for those who weren't around when this "effect" started occurring, the best guess was that it was a domain-registration dampening, based on the huge number of expired DMOZ domains that were being bought back then to get automatic ranking and relevance.

Completely immaterial. The sandbox is most likely to affect newly registered domains.

As for most of the people I talk to on a regular basis who frequently launch new sites on new domains - they all agree that there is no effect seen with a clean, never-owned domain, and it's something I can back up with my own experience across at least 150 domains in the last year.

150 domains per year? That's a pretty tough number to swallow. Please sticky me the list. I would love to see 150 newly launched sites that evaded the box. I am certain many others would be interested in the list as well. Keep in mind that if they are not primarily English-language sites, the sandbox will not apply.

The world has always had stereotypes.

Yes. Much to our chagrin.

The majority of sites put online any day are garbage/template/duplicates.

Please quote your source for this. And please don't say it's obvious, because it is not. You are applying some sort of highly distorted perspective here.

If I had to guess (excluding blogs and based on the market for Web design work), I would say that the majority of new sites are small business brochure sites.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved