Forum Moderators: Robert Charlton & goodroi


Duration of Sandbox

I think I am finally out!

         

BeeDeeDubbleU

2:55 pm on Feb 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Background
----------
This thread [webmasterworld.com...] from the back end of last year discussed the duration of the Google Sandbox. I mentioned a site in this thread (and others) that had been in the sandbox for about one year. Several people disagreed, saying that the site was probably just badly optimised, etc.

Today the site finally seems to have come out. This has happened pretty much without my help because I have done very little to trigger its release. It now has respectable rankings for several terms. The site is about 15 months old. I thought I would point this out because to me it seems like more evidence in favour of a sandbox filter.

koen

9:18 pm on Feb 27, 2006 (gmt 0)

10+ Year Member



"When a domain begins to display in the search results is very specific to when that domain shows Google enough positive signs"

This implies that a site can be sandboxed for a given search phrase forever? Suppose I create a one-page site and get one link to that page, so Google finds it, indexes it, and sandboxes it for my main key phrase. I never do anything on the site, and the page with the link to it never changes. How can things then evolve so that positive signs emerge (for Google) without there being a time factor?

randle

10:01 pm on Feb 27, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This has happened pretty much without my help because I have done very little to trigger its release.

When a domain begins to display in the search results is very specific to when that domain shows Google enough positive signs

It’s out, but he did nothing to it; so why now?

koen

10:09 pm on Feb 27, 2006 (gmt 0)

10+ Year Member



Another remark: when a site of mine was sandboxed, it got no visits from Google at all. Apparently, the things that must change are external to the site.

This all points to time factors to me.

Iguana

11:23 pm on Feb 27, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Time for the site or time for the link to rank? That's an important question.

tedster

11:33 pm on Feb 27, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Apparently, the things that must change are external to the site.

Many of them are - links for example, especially those obviously independent links. Some people have even noticed that traffic seems to be a factor -- though exactly where Google is getting traffic numbers is still a matter of conjecture. You can see some obvious possibilities, though.

So certainly some time must pass for these "signs of quality" to be in place, but that doesn't mean there's a set duration. I have seen that there isn't. I've seen anywhere from 4 weeks to 10 months, but I can't claim to have isolated all the factors involved. I just know for sure that it's not a fixed time period.

RichTC

1:13 am on Feb 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would agree that certain keywords will take longer.

Once out of the sandbox you start ranking; with time and quality content you can improve.

A PR7 site can add new content that may rank well for lower-rated keywords fairly quickly. The moment it hits a high-value keyword that it hasn't ranked on before, it still takes further time.

CainIV

7:53 am on Feb 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tedster is pretty bang on there. It's not a definite amount of time (for high-competition keywords); it all depends on factors.

Using the one-link example, the one link you have acquired is 'fresh'. So are all of your site's pages. Google measures the length of time of all of this, including domain age. Does it use this in its sandbox filtering? My guess is yes, but no one can say for sure except the engineers at Google, and some others privy to that golden knowledge.

I do know that some things have 'seemed' to help my sites get out quicker:

No linking home with keywords
Respectable amount of links acquired (I know this is ambiguous) and a mix of the link types.

Hope this helps

alphacooler

5:06 pm on Feb 28, 2006 (gmt 0)

10+ Year Member



I don't think Tedster is correct here.

You say everything is site-specific and depends on showing enough "positive" signs to Google, but then how do you explain the fact that there are periods of time when large groupings of sites are "released" (i.e. usually during updates)? This evidence is anecdotal no doubt, but it seems pretty consistent.

CainIV

7:54 pm on Feb 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have about 6 sites that have made it out of the filtering stage. None of them corresponded with any known Google update.

I really do believe it's a matter of positive factors as well, including, but not limited to the ones I mentioned previously.

When enough of the 'cards' are right, your site is released.

You say everything is site-specific and depends on showing enough "positive" signs to Google, but then how do you explain the fact that there are periods of time when large groupings of sites are "released" (i.e. usually during updates)? This evidence is anecdotal no doubt, but it seems pretty consistent.

Because the 'positive signs' are checked by Google, not you. If all of a sudden Google made every new link on the Internet the same value as the old, sites that are new would make more impact, and more would 'have it right' to start. A slight change in any algo by Google could trigger the release of sites that were previously being held back for one reason or another.

concepthue

9:25 pm on Feb 28, 2006 (gmt 0)

10+ Year Member



How do you tell if your site is in the sandbox?

I can search for my site's domain name and it comes up... just the home page.

It also comes up for various different search terms.

Though MSN seems to be better for me than Google. I get better results there.

Google doesn't even crawl beyond my home page either, while msnbot and exabot do.

What's up with that? Is it in the sandbox? How can one tell? ...and why in the world would Google penalize your site and keep it in holding for a year? It's HURTING MY BUSINESS! I have a legit site. It's targeted to a local market too, not competing with country-wide sites. A-hole Google.

also --

Because the 'positive signs' are checked by Google not you. If all of a sudden Google made every new link on the Internet the same value as the old, sites that are new would make more impact, and more would 'have it right' to start.

Uh... yeah, that makes perfect sense. It makes sure that sites are updated... but I think Google should (if it doesn't already) factor in (to a degree) link hits. Sometimes content doesn't need to update, but if people are going to the site, it's probably something worthwhile.

But I'd really expect new sites not to end up at the end of a search result... That's just as bad as if they appear at the top of the results.

JoeHouse

9:53 pm on Feb 28, 2006 (gmt 0)

10+ Year Member



Question

Built a site and submitted it in April 2005. After 7 months the site, built in ColdFusion, was not performing well, so I decided to completely change the format.

My question is regarding the sandbox. Will I be starting over because I literally changed everything but the domain name? Or because the domain is aged (since January 2005) will I be OK?

Anybody have an experience with this?

concepthue

10:14 pm on Feb 28, 2006 (gmt 0)

10+ Year Member



I just got done reading an article saying that completely redesigning your site can have that effect.

"Google likes a natural web site with natural changes and progressions" -- what some moron over at Google didn't realize is that it's quite natural to completely redesign a site because we are unhappy with the look.

Also, we generally don't put a new design up half at a time... we make it work offline, then put it all up when it's done, making one big, quick change.

That IS natural. Google is UNnatural lol

Tinus

10:18 pm on Feb 28, 2006 (gmt 0)

10+ Year Member



This thread is about this subject.
[webmasterworld.com...]

CainIV

7:44 am on Mar 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I believe that newer pages will now not rank as well because Google believes them to be brand new (as judged by the URL), even though they have been there for quite a while.

However, all logic would point to there being no reason why the entire domain would be set back, except in the case where most of your links were to those inner pages.

One thing for sure: make sure you are comfortable with the new URLs and plan NOT to change them for quite a while.

BeeDeeDubbleU

9:08 am on Mar 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My question is regarding the sandbox. Will I be starting over because I literally changed everything but the domain name? Or because the domain is aged (since January 2005) will I be OK?

I did this with three or four sites last year. All resulted in a positive effect in the SERPs. My changes were as a result of recreating sites with CSS design. The most important thing to remember is that the URLs/Page names must not change.

koen

3:59 pm on Mar 1, 2006 (gmt 0)

10+ Year Member



I set up a new one-page test site a few days ago (I love setting up test sites :)) with a very spammy subject, with info scrambled together from various informational sources so the content looks unique and very informative. I was aiming for a "keyw1 keyw2" phrase (though I used it very sparingly in the text because I wanted it to be really informative). I am sandboxed for "keyw1 keyw2" but show up for "keyw1keyw2" in place 10, with only 9 first-class spam sites before me. At least I am somewhere for a search phrase interesting enough to see what actions have what results. I will try some small things (adding some links to it, changing some content, adding a page, etc.) and see what effect they have. Then I'll leave the page alone.

All sites ranking better than me have the same as I do: keyw1keyw2 as part of the URL (except one site with only keyw1), and keyw1 keyw2 in the title (actually, most sites after me meet these same two conditions as well).

2 higher than me are unreachable.
5 are "search" sites that give "results" (3 sites are the same)
1 site has a suspended account notice

So these are the kind of sites that do better than me.

econman

4:52 pm on Mar 1, 2006 (gmt 0)

10+ Year Member



My experience is consistent with Tedster's. All of our sites are fairly new (6-18 months old) and all have started getting traffic from MSN almost immediately, followed by Google. It never takes more than a few months to get some traffic, but the volume is very small compared to what we are shooting for.

These are all information-intensive sites in moderately competitive niches designed to provide information to users and, hopefully, generate revenue from advertising.

The pattern I've seen is that the sites initially only rank for (and start to get traffic from) "tail" search phrases, where we happen to have content that precisely matches the specific combination of words used in a multi-word search.

Then the sites gradually start to rank for and get traffic for less obscure searches (fewer words, more popular/important combinations). None of these sites are yet ranking for any highly competitive searches except for the two words that are used in the domain name.

In our particular situation I've not seen any evidence of sudden removal of filters or movement out of a "sandbox," but lots of evidence that Google waits for enough positive signs to accumulate. Hard to know for sure what "signs" we are gradually benefiting from, but that concept seems to fit the pattern I've seen.

I'm guessing that we are benefiting from the slow accumulation of inbound links, particularly a handful from sites that Google "trusts" for some reason, and perhaps some sort of positive traffic patterns (e.g. something about the way users are reacting when they reach these sites in response to an obscure "tail" search).

Certainly, the passage of time could be a factor, or the "aging" of links, which could contribute to both the pattern I've observed and the pattern seen by some -- where a site suddenly starts appearing in the SERPs with no apparent triggering event.

I accept the fact that other sites seem to behave differently, including ones that suddenly start ranking after long delays. Trying to reconcile these very different patterns, I wonder if there might be things occurring in the background -- such as the observation or aging of one or two "trusted" links -- that cause some sites to reach a "tipping point" where Google finally concludes there are enough "signals of quality" that it is willing to include the site in certain SERPs.
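The "tipping point" hypothesis above can be sketched as a toy scoring model. Everything here, the signal names, weights, and threshold, is invented purely for illustration; it only shows the shape of the idea that accumulated signals eventually cross a line, not anything Google is known to do.

```python
# Toy model of the "tipping point" idea: each quality signal
# (trusted links, aged links, traffic patterns) contributes some
# weight, and the site appears in competitive SERPs only once the
# accumulated score crosses a threshold. All values are invented.

SIGNAL_WEIGHTS = {
    "trusted_link": 3.0,
    "aged_link": 1.0,      # an ordinary link that has matured
    "organic_traffic": 0.5,
}

def past_tipping_point(signals, threshold=10.0):
    score = sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in signals)
    return score >= threshold

# A young site: a few links, a little traffic.
young = ["aged_link", "aged_link", "organic_traffic"]
print(past_tipping_point(young))   # False (score 2.5)

# Months later: trusted links and aged links have accumulated.
mature = ["trusted_link"] * 2 + ["aged_link"] * 3 + ["organic_traffic"] * 4
print(past_tipping_point(mature))  # True (score 11.0)
```

Under this toy model, a site with no visible change can still "suddenly" appear in the SERPs the day one slow-moving signal pushes the score over the threshold, which would reconcile the two patterns described.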

alphacooler

6:26 pm on Mar 1, 2006 (gmt 0)

10+ Year Member



Econman,

Interesting points. It seems like collectively we could all combine our anecdotal experience on launching new sites to figure out these conceptual "positive signs".

I certainly don't think this is an issue of trusted links right off the bat. I've had an informational site in a very uncompetitive space get a DMOZ listing and Yahoo dir listing as well as a couple of other very trusted links all with varying anchors and acquired very slowly. This site of course doesn't rank for much of anything except tail kw combos.

I think Matt Cutts has given us some very big hints about this filter.

He has noted that it was not intentionally created as a single filter. (He had to go back and look at the algo changes to see what was causing this effect, according to him).

He has also stated that certain sites get caught in this filter and others don't.

These two facts combined obviously point to the ability to launch a new site and have it rank well for its main kw phrase (i.e. bypassing the filter). So despite what many people have noted, I truly believe this thing is avoidable.

We should honestly be able to figure this out. Google created this filter ostensibly to break the pattern of spammers creating a flurry of spam sites, getting them banned, then doing it all over again, pulling economic rents in between. This was very profitable and it was causing noticeable degradation of the SERPs.

With this in mind, one can posit that when launching new sites we should strive to distinguish our sites from even the hint of spam. This would lead us to look at what factors are consistent in spammers' pages.

> skeleton pages (far more code than content)
> scraped content
> static (once up, the content doesn't change much)
> a burst of links right out of the gate, which is statistically improbable when looking at the link graphs of all websites on average
> this cluster of links looks automated (identical anchors)
> all links coming to the front page
> very few 'trusted' links
> lack of actual search traffic and click-throughs
> etc.

Sounds good, right? Well, I launch a lot of sites, and for the past 6 months I have been incredibly mindful of these factors. And still! I get 'boxed.

I'd love to hear people's input on this. What "positive" signs can we show Google? Is it as simple as the age of IBLs? In which case there isn't much we can do (doubtful).

eyezshine

8:33 pm on Mar 1, 2006 (gmt 0)

10+ Year Member



I think a lot of it comes down to the toolbar data, which could make or break a site with a few clicks of the smiley or frowny faces on the toolbar.

What if all it took was 5 clicks on the smiley face from 5 different IPs to bring your site out of the sandbox? This could answer the question of why some sites come out faster than others.

What if the smiley/frowny faces were counted like PR, as votes, and the more votes your site got the higher you rank? This would only take one number in the database that gets added to or subtracted from, depending on what the surfer clicked.

Remember that there are a lot of people out there who think they are helping Google when they click those happy faces, and you can guarantee that people are doing it often.
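The "one number in the database" voting scheme speculated about above is easy to sketch. Everything in it, the class, the one-vote-per-IP rule, the plus/minus scoring, is hypothetical, matching the post's guess rather than any known Google mechanism.

```python
# Hypothetical sketch of the toolbar-vote idea: smiley clicks add 1
# to a per-site score, frowny clicks subtract 1, and each IP gets
# one vote per site. Purely illustrative of the post's speculation.

class ToolbarVotes:
    def __init__(self):
        self.score = {}    # site -> running vote total
        self.seen = set()  # (site, ip) pairs already counted

    def vote(self, site, ip, smiley):
        if (site, ip) in self.seen:
            return  # one vote per IP, as the post imagines
        self.seen.add((site, ip))
        self.score[site] = self.score.get(site, 0) + (1 if smiley else -1)

    def total(self, site):
        return self.score.get(site, 0)

votes = ToolbarVotes()
# The post's scenario: 5 smiley clicks from 5 different IPs.
for ip in ("1.1.1.1", "2.2.2.2", "3.3.3.3", "4.4.4.4", "5.5.5.5"):
    votes.vote("example.com", ip, smiley=True)
votes.vote("example.com", "1.1.1.1", smiley=False)  # repeat IP, ignored
print(votes.total("example.com"))  # 5
```

The obvious weakness, which the thread itself hints at, is that anything this simple would be trivial to game with a handful of IP addresses.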

What if Google was penalizing sites when someone searches for them using the site: command or the link: command? Really, if you think about it, that info is mostly of interest to webmasters.

Maybe Google is penalizing for over-optimization, which I think is true. It would be easy to figure out what a page is trying to rank for simply by calculating the weight of the keyword density, then filtering that page for its highest-density keywords, causing it to rank only for obscure keywords.

The only way to solve that problem is not to optimize at all. By the way, I analyzed the top 20 sites for a search on Google, and the average density of the keyword in the title was 20%. Some sites did not even have the keyword in the title.
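One plausible way to compute the keyword-density figure mentioned above, treating "20% density" as the keyword's share of the title's words. This is an assumption about what was measured, not a known formula, and the example title is made up.

```python
# Sketch of a keyword-density calculation: the fraction of words in
# a piece of text (here, a title) that exactly match the keyword.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# 1 match out of 5 words = the 20% figure reported in the post.
print(keyword_density("Widgets buying guide for beginners", "widgets"))  # 0.2
```

A real over-optimization check would likely also weigh placement (title vs. body), phrase matches, and stemming, but the core arithmetic is this simple ratio.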

stever

8:55 pm on Mar 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree with a lot of what econman says above. I've launched two sites recently and neither (IMO) shows the effects that others refer to as a sandbox.

One was an old domain name which had been sitting on a server for a while with no content and one was completely new.

One has had no content added post-launch and one has had regular content added.

If I were to identify differences between them and "normal" WebmasterWorld sites, the only points I could pick up on are:

Both link out freely without heed of PageRank hoarding. The links are to relevant sites for their subject. (They aren't directories, however.)

They both received links from good (but not great) sites in their subject area (or above it). In one case, the link was very deliberately from another of my sites. In the other, the links were completely unconnected.

In both cases, MSN picked up the sites almost immediately, Google with a delay of about a week, and Yahoo is just starting with one and ignores the other. I would judge them both to behave the same way a site would have done before any mention of a sandbox. (Incidentally, if I were concerned about a "sandbox" it would be with Yahoo and not with Google...)

Neither has had any further "link campaign" work done on it, and thus both rank how I would currently expect with Google (well for low-to-medium terms, average for top terms).

Note: it would be quite possible to judge them as being in "uncompetitive areas" compared to the internet's three Ps. However, to be fair, they would appear to be in substantially more competitive areas than the site referred to at the beginning of this thread.

BeeDeeDubbleU

12:29 pm on Mar 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Incidentally, if I were concerned about a "sandbox" it would be with Yahoo and not with Google.

Even though Matt Cutts has acknowledged the sandbox effect?

stever

7:57 pm on Mar 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would rather rely on what I understand Matt Cutts's words to mean (which is remarkably similar to tedster's message #30) than your interpretation, with all due respect, BeeDeeDubbleU.

randle

8:15 pm on Mar 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would rather rely on what I understand Matt Cutts's words to mean

Sort of like interpreting scripture?

stever

8:19 pm on Mar 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Sort of like interpreting scripture?

Apparently so...

Seriously, there are a lot of different theories about what works and what doesn't when it comes to websites and my opinion is that the only yardstick can be what works (or doesn't work) for you.

CainIV

8:58 pm on Mar 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One was an old domain name which had been sitting on a server for a while with no content and one was completely new.

This would explain the first one, as domain age is a big factor IMHO. All sites that I seasoned (got indexed in Google before the site was put together) fared much better with respect to sandboxing when the other factors were considered. This may point to domain age being one factor in the filtering effect, as mentioned earlier.

Keep in mind that newer sites often do rank for up to a couple of weeks before dropping considerably. This may provide evidence that Google is evaluating the site and calculating its proper spot.

koen

12:05 am on Mar 3, 2006 (gmt 0)

10+ Year Member



Some more anecdotal experience:

One page of a site of mine got sandboxed. Today I checked my stats and saw a search phrase close to what I was aiming at with this page (not the site), but the visitor came from google.nl. And sure enough, I was in google.nl for the search phrase I was aiming at. I don't know whether I was previously in google.nl, though. The page is less than 2 months old.

This 56-message thread spans 2 pages.