No more sandbox crap talk again, please. New sites actually get a boost in Google's new ranking algorithm, but if they can't live up to it, they fall deep down in the SERPs.
Then, acting like an editor-in-chief for the entire web, they froze it to preserve its quality, and you now need the right type of links, acquired responsibly, to break in.
What's unfair about this (if I'm even roughly right) is that older sites continue to rank according to more classical Google methods. Old links into a site continue to pass their gold easily; new links are put under an electron microscope.
Whether this is ethical is another discussion, but I think it's fair to say that Google is ultra-conservative these days.
(Think of Google's April Fools' jokes and their function, beyond the obvious fun. They keep people talking, perhaps divert some of that energy away from other things Google is doing, spread the buzz about Google, and at the same time send messages to the smart ones who can decode them.)
I am not sure how to say this diplomatically, but you have simply got it all WRONG about PageRank. "Half a dozen monkeys just randomly throwing banana peels at a keyboard" sounds so funny to me! That can NEVER be how PageRank really works.
But as usual, I'm sure the hardliners will come popping up asking me for "hard" and "authoritative" proof of my claims. Well, to those I say there is no need for proof on this one. It is obvious. "Monkeys", oh my! I never knew that hypothesizing could one day go to THAT extent!
No dear, there ain't no monkeys in the Google PageRank algo, and my confirmation is backed by hard (and official) evidence this time, from Google themselves: no monkeys involved, it's just a bunch of pigeons.
(And they're pecking at the keyboard. The idea of throwing bananas is out of the question, for it would have made the Googleplex too messy.)
Sometimes they do, but overall this forum and others like it are a thorn in their side. Amongst all the weird and wonderful speculation, nuggets of information emerge which give an advantage to those who spot them. This opens the door to spamming their index, which is not what they want. Over the last few years this forum has given out very valuable information, allowing a select few to dominate SERPs in many sectors, which is not always helpful to the quality of Google's results.
I would guess that on balance they would prefer that collective analysis of their methods did not happen, but their importance dictates it is an unavoidable consequence.
Hahahah! I doubt they use monkeys for anything (messy, smelly, full of mischief..)
If G wants to add a random factor to their SERPs or PR, there are cleaner ways. -Larry
..However, if your site gets a cache date that is never more than 10 days old (regularly crawled), the title, snippet and URL appear just as you intended when you search for your URL, and the site comes up within the first few pages when you run searches with the allinanchor:, allintext: and allintitle: commands, BUT you're not within the first 1,000 places when you search for the main keyword the site was designed for, then you could throw some sort of label on that.
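That checklist can be read as a set of diagnostic queries to run by hand. A minimal sketch, where `example.com` and `beautiful widgets` are made-up placeholders and the operator list is just the one named above:

```python
# If the site ranks for the operator searches but is nowhere for the plain
# keyword search, something is filtering it rather than simply not indexing it.

def diagnostic_queries(site: str, keyword: str) -> list[str]:
    return [
        site,                        # search your URL: title/snippet/URL as intended?
        f"allinanchor:{keyword}",    # site comes up within the first few pages here...
        f"allintext:{keyword}",
        f"allintitle:{keyword}",
        keyword,                     # ...but not within the first 1,000 places here
    ]

for query in diagnostic_queries("example.com", "beautiful widgets"):
    print(query)
```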
OK, but how about this:
I launched a new site last January. It got spidered and indexed within a week. The site indeed got that starting boost immediately (and has kept its status in the SERPs ever since, with only minor changes). Three months later, with about a hundred backlinks, the site had gained PR6 from an initial PR0. I've continually added fresh content and kept tweaking old pages. Google responded positively to the changes, and those pages even improved their positions in the SERPs.
Now, as my site deals with beautiful widgets, ugly widgets and advice on how to make widgets prettier, the strange thing was that while my site was indeed nowhere to be found in the SERPs for my #1 main keyword (beautiful widgets), it instantly gained, and kept, great SERP positions for my #2 main keyword (ugly widgets) and the rest of my keywords.
I think it's important to note here that a search for beautiful widgets produces about 5 million results, which obviously indicates a very competitive area, while my second main keyword, ugly widgets, has only 1 million result pages.
Now, seven months after the launch, my site has been very well positioned for some of my main keywords from the beginning, but for my #1 keyword, in that pretty competitive area, the site can't be found even in the top 1,000.
So my best guess on the sandbox phenomenon would be, as discussed earlier here and around the net, that not all sites get sandboxed wholesale: a site can be filtered out for particular keywords while staying present in the SERPs for the rest. In my case, it seems that only the more popular keyword got filtered, while the others, from less competitive areas, had no problem in the SERPs from the very start.
Now, as my site deals with beautiful widgets, ugly widgets and advice on how to make widgets prettier,
I love restrictions; they make us more creative and produce more interesting things. (After all, it's ages of time, enormous heat and massive pressure that turn coal into diamonds, though both are made of the same chemical element: carbon. It's all in how the molecules are arranged next to each other.)
(1) The sandbox is a binary switch. You're in or you're out. There's no gradually going into or coming out of the sandbox.
(2) The sandbox seems to trump all other ranking factors. It doesn't matter what your PR is, what the on-page factors are or how competitive the search phrase is. As long as you're in the sandbox, you don't rank anywhere near the first page, not even for your own company or site name.
(3) Being in the sandbox is an attribute of a domain--not a link, site, or page.
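The three observations can be collapsed into a toy model. To be clear, this is just guesswork restated as code to make the claims precise, not anything Google has published:

```python
def toy_serp_score(page_score: float, domain_sandboxed: bool) -> float:
    """Toy model of the three claims: the sandbox is a boolean attribute of
    the domain (1, 3), and while it is set it trumps PR, on-page factors and
    phrase competitiveness entirely (2)."""
    if domain_sandboxed:
        return 0.0   # buried regardless of merit, even for your own site name
    return page_score

# Same page, same merit; only the hypothetical domain flag differs.
print(toy_serp_score(0.9, False))  # ranks on its merits
print(toy_serp_score(0.9, True))   # nowhere near the first page
```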
I had a site with a TBPR of 5 that ranked very well for some obscure phrases for years. I moved the content into another domain and set up 301 redirects to the new location. The PR transferred almost immediately and the content didn't change to any significant degree.
The site disappeared completely from the rankings. About two weeks after the domain hit the one-year anniversary of its registration (about 5 months after the content moved), I checked the search results and found that the pages had magically reappeared at almost exactly the same positions they had originally held. New pages were ranking normally, and all signs of the sandbox effect had gone away simultaneously and completely. When I had checked three weeks earlier, nothing had visibly changed since Google first noticed the site had changed domains. The inbound links, PR, and content of those pages saw no significant change during that time, which would tend to explain the nearly identical rankings pre- and post-sandbox.
Any attempt to deny the existence of the sandbox is going to have a hard time fitting the facts here. Any attempt to explain it as an issue of link age, or of bad SEO in terms of links and on-page criteria, will have a very hard time fitting the facts in this case as well. Denial is no substitute for data and reason.
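For context, the domain move described above is the standard server-side 301. A minimal Apache .htaccess sketch, with placeholder domain names (any real setup will differ in the details):

```apache
RewriteEngine On
# Send every request on the old domain to the same path on the new one,
# with a permanent (301) redirect so engines transfer the old URLs' credit.
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```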
The characteristics you describe are exactly the same as for sites that have been hit in the limited updates from Aug 16 onwards (including Bourbon). Sites have practically disappeared from the SERPs and then sometimes come back in their former glory.
Because of the similarity in effect, I am beginning to believe that the two phenomena are part of the same thing: in other words, whatever site-level (not page-level) filter gets applied and hits some long-established sites is the same as what causes the Sandbox. If this were true, there would be no easy explanation such as link aging.
Whether it is some application of a Hilltop-like algo or something based on the pattern of links from other domains, I don't have much evidence. But it may be that the 'Sandbox' is a side effect of a serious anti-spam measure that has been applied.
Although I have not figured a way out of this so-called sandbox, I think I have figured out why it appears we are in one.
Shafaki, you were correct: the sandbox is a myth. It doesn't exist. Now wait, before all you sandbox believers jump down my throat, hear me out. You too are partially correct, but it's not a sandbox these new sites are in. The new name for this Google filter is "PAGERANK TURTLE".
Why "PageRank Turtle"? Because slow and steady wins the race. Let me explain:
My site is about 4 months old. I have been exchanging links with as many quality, relevant sites in my industry as I can find.
OK, all that's normal; all of us do it, right? The interesting thing is that I have exchanged links with sites ranging anywhere from PR 0 up to PR 8.
Now here comes the interesting part. Google is only displaying the PR 0 to PR 4 links I've obtained, and is holding back, or filtering, the PR 5 through PR 8 links I've also obtained, which are not being displayed.
What this tells us is that yes, there is indeed a filter: not a sandbox, but a "PageRank Turtle".
What that means is that yes, your keywords are there, but they are buried so far down that they can't even register. Why can't your major keywords be found? Because you are a new site, and according to Google, your indexed backlinks are PR 4 or less. GET IT? Low-PR exchanges, low keyword rankings!
Now, once the filter wears off and starts to count your higher-PR backlinks, your keywords will slowly make their way to the top.
Then, once you have done your time and you are completely unfiltered, it's off to the races for your keywords and your PageRank.
"PageRank Turtle" Slow and Steady wins the race.
(Quote from Go60Guy citing a Google Engineer) He did, however, openly acknowledge that they place new sites, regardless of their merit, or lack thereof...
Enough of this thread.
He did not mention, or rule out the monkeys though, so they may very well be real.
While people are fond of saying it's inbound links that are sandboxed, this doesn't adequately explain the behavior above, which I saw on sites with one link, 500 links, new links added, and no links added. Despite the lack of any real evidence, that particular theory still manages to convince some.
[edited by: 2by4 at 2:49 am (utc) on Aug. 11, 2005]
As always with link-based theories, empirical facts disprove them. I have sites ranking with no inbound links of any particularly great PR at all; they didn't have them when in the sandbox, and they don't have them now. But they do rank, sometimes for SERPs in the 30-million-plus-results range.
However, it's totally irrelevant to go much further; it's enough to note the presence of the phenomenon, the sandbox. Worrying about exactly what it is doesn't really help you get past it, so I don't bother anymore, though I'm fairly satisfied that my working theory is adequate to explain what I see.
However, you are exactly right: it is all there, just not where you want it to be. It's exactly where a flag places it in the SERPs, is my guess, and that's good enough for me. Flag gone, back to normal. There's nothing you can do that I know of to get rid of it, but I have read of some people who claim to be able to avoid it in the first place, though with no great reliability.
Out of all the posts I have read on this subject, your opinions most closely resemble my own beliefs.
Can we all agree on this point: at some point this flag... sandbox... PageRank Turtle, or whatever you want to call it, will come to an end for new sites, and the world will be right again?
As for the condition of the rest of the world after this happens... no comment.
I was at the Googleplex yesterday at 1 pm, and I wandered away from my host and I saw the monkeys. I saw them with my own eyes. People, it's monkeys and bananas.
martinibuster, I cannot refute that! After all, seeing is believing. (Unless, of course, a strong dose of martini has impaired your vision, your mental interpretation of your vision, or both.)
So, you're sure you did not notice any pigeons fluttering around? (... or at least any traces of feathers floating about?)
(Hey, by the way, what's the difference between dry martini and martini dry? Was an interview question for a barman.)
Is it 8 months or 4 months?
I have been hearing conflicting responses to this. Anybody have any facts to bring to the table on this subject?
That's what the patent says. So there is probably no simple way to determine it, but it's a relief to think that they indeed use a combination of the above (and perhaps other methods too) to determine the date of inception; otherwise a domain registered for 8 months without yet being 'discovered' by Google would suffer from an incorrect inception date.