Sandboxing sites is the cleverest move Google has made in a long time. Its effect on us has been to stop building new sites and instead develop and improve existing ones. I'm sure this is the case with others too, and it helps Google rather than swamping them with new domains all the time.
The effects of sandboxing as I see it:
1) Discourages one-minute spammy sites that fly and then die.
2) Allows google to monitor the growth of a site, including natural links in and new content, before they really start to rank it.
3) Removes the 'instant success' factor, which makes us all greedy and leads us to produce sites with little thought or effort.
In short, by removing the instant success factor, the incentive to set up new sites has been reduced and the incentive to improve older sites has increased.
Well done Google - smart move.
Let's say, for example, that your homepage is about Blue Widgets. Suddenly, Blue Widgets are no longer cool, and nobody cares about Blue Widgets anymore. So you decide to change the theme of your homepage to Red Widgets. Since you have a link to Blue Widgets on all 1000 pages of your site, all you have to do is change the link text to Red Widgets, right?
Wrong! You will be waiting months for Google to change with the times and realize that Blue Widgets are out and Red Widgets are in.
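If I had to guess at the mechanism, it might be something like the toy Python sketch below -- pure speculation on my part, with every name and the refresh cycle invented. The idea is that anchor text gets aggregated into a cached per-URL profile, and queries keep scoring against the stale profile until the next rebuild.

from collections import Counter

# Toy model: anchor text is aggregated per target URL into a cached
# profile, and the profile is only rebuilt on some periodic cycle.

live_links = {
    # target URL -> anchor texts as they exist on the web right now
    "example.com/widgets": ["red widgets"] * 1000,
}

cached_profiles = {
    # what the index still holds from the last aggregation pass
    "example.com/widgets": Counter({"blue widgets": 1000}),
}

def rebuild_profiles():
    # In this theory, this runs only once per update cycle, weeks apart.
    for url, anchors in live_links.items():
        cached_profiles[url] = Counter(anchors)

def anchor_score(url, query):
    # Queries score against the cached profile, not the live links.
    return cached_profiles[url][query]

print(anchor_score("example.com/widgets", "red widgets"))  # 0 until the rebuild
rebuild_profiles()
print(anchor_score("example.com/widgets", "red widgets"))  # 1000 afterwards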
This is not good at all...I hope I am wrong.
But to assume that a delay in applying the PR of new links would also delay anchor text changes is a real jump. Those are separate operations and should be investigated separately.
I also hope you are wrong, but I'm afraid you are right. I tend to believe that this Sandbox does actually exist and it is not meant for new sites, but for any new links, especially the "external" ones!
I also tend to believe that this sandbox effect dates back to the end of last year. Everyone saw it, but no one understood it at the time, because the atmosphere was too obsessed with the Florida and Brandy updates, as well as the missing index page issue.
My observation is this: in Dec 03, I tried to boost rankings for less competitive terms by adding anchor text links from different sites with good PR to different pages of one big, well-established site. After the G update in Jan 04, I was greatly surprised not to see any significant boost in rankings for those terms. It was not until March that I began to see a great boost in rankings across the board for those targeted secondary terms. I didn't understand the phenomenon until the word "sandbox" was coined.
It is for the new links, not for the new sites. The sandboxing of new sites is just a by-product of that effect.
With allenp et al, I think the likely solution has to do with the algorithm, rather than a Microsoftiavellian scheme to drive traffic to AdSense. And evidence is accumulating (not just in this thread) to persuade me that it has to do with new links rather than new sites per se. As usual, the exact details lie among the 101 factors of PageRank, and so may continue to elude observers.
Remember that underneath all of this, Google still calculates page rank -- a multi-week process. It may spider and index pages every day, but how is the initial page rank to be calculated?
Perhaps with Florida, Google allowed what in retrospect they realized was much too high an initial estimate for page rank, and they realized they were being snowed by the so-called "directories". So they went with a more conservative estimate (like zero?)
Still, under the hood, Google reruns page rank once a month (who knows, it might be once every six weeks now?). So a new link waits 1-4 weeks for the next page rank rollover, then another 4 weeks for the PR to be calculated and propagated to the servers.
I also wonder whether pages without pagerank get their links included in the PR recalc at all. If they don't, each new PR recalc would effectively spider one link deeper into a new site. And so you could easily imagine sites that took three or four full cycles to get a substantial number of their pages' links processed.
This algorithm would, I think, show many of the symptoms that have been reported.
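To make the cycle-by-cycle idea concrete, here is a minimal Python sketch of that last guess -- every name is invented, and the real recalc is obviously nothing this simple. If links from pages that have no PR yet are left out of the recalc, each cycle reaches exactly one hop deeper into a new site.

# page -> pages it links to (a brand-new four-level site)
links = {
    "home": ["sub1"],
    "sub1": ["sub2"],
    "sub2": ["sub3"],
    "sub3": [],
}

has_pr = {"home"}  # only the homepage got an initial PR estimate

def monthly_recalc():
    # Only links from pages that already hold PR are processed.
    newly_ranked = set()
    for page in has_pr:
        for target in links.get(page, []):
            if target not in has_pr:
                newly_ranked.add(target)
    has_pr.update(newly_ranked)
    return newly_ranked

for month in range(1, 4):
    print("cycle %d: PR reaches %s" % (month, sorted(monthly_recalc())))
# cycle 1: PR reaches ['sub1']
# cycle 2: PR reaches ['sub2']
# cycle 3: PR reaches ['sub3']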
It's easy to dislike the freshness of the results, but there is a fundamental fact of life: all kinds of outside references (search engines, directories, personal links) favor large, stable sites. Rather than cursing the darkness, design sites to fit reality: create sites with a stable core and stable default navigation -- then add 'fresh' content directly to the archives and immediately link to it, both with 'daily special' links and with permanent archival links. Seasonal content should stay up year-round but be seasonally featured with prominent 'daily special' links. (And create large integrated sites rather than tiny doorway domains.)
And if your business plan depends on SEOing your mom-and-pop pen-and-pencil shop past OfficeMax for the term "office supplies" -- that's just gambling. Expect to be wiped out 50% of the time the roulette wheel drops, and plan accordingly (that is, be prepared to weather a three-to-six-month dry spell anytime.)
So sandbox is about links in and not new sites.
What I can't get my head around is how this improves the SERPs. The only thought I have is that by delaying the high ranking of new sites/pages, Google buys time to properly vet those pages/sites with other spam filters. It also wants to see natural link growth over that period.
Is the following possibly correct....
A new site/page that gains high PR from a few links in will never rank well, because the site/page also needs to be substantiated by links in from numerous other sites, whatever their PR. This takes time to identify -- hence the sandbox.
This works on the assumption that if a PR6 site links to you (because you are a quality site), then logic dictates that at least 50 low-PR sites will also link to you.
Conclusion: Quantity of links in is required before quality of links in has an effect.
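As a toy illustration of that conclusion (the function and the 50-source threshold are just this post's numbers, not anything Google has confirmed):

def link_score(inbound_prs, min_distinct_sources=50):
    # inbound_prs: the PR of each inbound link, one entry per linking site.
    # Quality (the PR values) only counts once the quantity test is passed;
    # until then the page scores as if it had no notable links at all.
    if len(inbound_prs) < min_distinct_sources:
        return 0  # still sandboxed: not enough corroborating sources
    return sum(inbound_prs)

print(link_score([6]))             # one PR6 link alone -> 0
print(link_score([1] * 50 + [6]))  # 50 low-PR links plus the PR6 -> 56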
Boy, did they ever sell you people the "do no evil" motto... but good!
How many "ducks" do you know that actually get up in your face via their PR departments, posting here to say, "Hey guys... WE ARE A DUCK!"?
If I'd known it was so easy to build this kind of blind customer loyalty, I'd have spent more time studying programming in the seventies and built a search engine.
Waking up to the real world may not solve the problems some of us face with what Google is doing to maximise its profit, but at least there would be less cringe-making naivety. When I worked in PR and advertising a long time ago, people of the quality and caliber of those who post here were much, much harder to fool!
The theories put forward so far:
1) New Domain Sandboxing
2) Link Aging
3) External Link Aging
4) Delayed PR Application
I am convinced that, as far as new sites are concerned, (External) Link Aging has all the symptoms of New Domain Sandboxing. It also seems a more effective solution, because it takes care of more types of G spam.
I have read a lot of posts saying that new pages on existing, well-ranking sites rank well within days. Assuming that new pages require new internal links to them, that would mean internal links do not have to age (are not sandboxed). Consequently, link aging would only be applied to external links. Hence, External Link Aging is my favorite so far.
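A minimal sketch of what that would look like, assuming a simple linear ramp for external links (the 90-day figure is an arbitrary placeholder, not an observed value):

def link_weight(base_weight, age_days, is_external, ramp_days=90):
    # Internal links count at full weight immediately; external links
    # have to age in before they carry their full weight.
    if not is_external:
        return base_weight
    return base_weight * min(age_days / ramp_days, 1.0)

print(link_weight(5.0, age_days=10, is_external=True))   # ~0.56
print(link_weight(5.0, age_days=10, is_external=False))  # 5.0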
Nevertheless, External Link Aging does not explain the delayed application of anchor text changes. So, let me throw in another idea:
External Anchor Text Aging
It's basically the same as External Link Aging, except that the age is associated with each occurrence of a keyword in the anchor text of a link, instead of with the link itself. [Read the previous sentence one more time.] The higher the age, the more relevant the keyword occurrence. I don't know much about G's database model, but I think this would be easy to implement, or at least easier than associating the age with the link itself. Keep in mind that G optimizes its algorithms for fast search, and that the search is fed with words. If the associated age had to be fetched from a separate database table, the search would slow down significantly. But if the age is stored with the keyword occurrence, it can be used to weigh the matches with negligible overhead.
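For illustration, here is a rough Python sketch of that idea, with an invented posting format: the age rides along with each keyword occurrence in the anchor-text postings, so the weighting happens during the normal scan of the posting list, with no second lookup.

# Hypothetical inverted index: keyword -> list of (target URL, age in days)
anchor_index = {
    "widgets": [("example.com/a", 400), ("example.com/b", 5)],
}

def score(keyword, max_age_days=365):
    # Older occurrences count more, capped at max_age_days; the age is
    # already in the posting, so no separate table has to be consulted.
    results = {}
    for url, age in anchor_index.get(keyword, []):
        weight = min(age, max_age_days) / max_age_days
        results[url] = results.get(url, 0.0) + weight
    return results

print(score("widgets"))
# {'example.com/a': 1.0, 'example.com/b': 0.0136...}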
I have said this in the past: I truly believe that some of you are pushing the envelope too far on SEO and need to back off a little. Whether it's getting too many external links from the wrong places, without knowing that Google has penalized the websites you trade links with, or creating a bunch of pages that are only there for one reason {TO SPAM} -- not to sell another kind of product, but to get people to your website to sell the same product you are already selling on 10 other pages.
When you try to ride the edge of what the search engines do and don't allow, all they have to do is change the rules a tiny bit, and suddenly you look like you have crossed a line you are not allowed to cross. Then you get some sort of penalty and wonder why.
Sandbagging:
I believe sandbagging explains many of the problems with getting new websites listed. I also believe it is a necessary step to curb the problems created by webmasters who keep finding new ways to spam. It seems to be the lesser of two evils: make new websites wait a little longer than normal to see if they are on the up and up, and in the meantime try to weed out the established websites that need to be penalized for spamming. If you think about it, Google has a hard job ahead of them.
On internal anchor text:
On my main website, we added 50 new pages about 30 days ago. That doubled the size of the site. All of the new pages were picked up within the first two weeks. I made sure the appropriate amount of internal anchor text was used, not only to help my customers but also to show the search engines the importance of each page.
My website has been around for about four years, and that may be part of the answer, but we have no problems getting new pages listed, and for the most part they are in the top 10 for the keyword phrases we went after.
I just think some of you push the envelope too far and don't realize it. Remember Optimization 101: how many of you went a little beyond where you should have?
Wasn’t trying to offend anyone, but there’s my two cents.
CygnusX1
External Link Aging:
Some weeks ago I put up a link to a site, and within a few days it went into the top 10 for the anchor text I used. Has anybody had similar cases that could tell us something about a possible aging factor for external links?
"What I can't get my head around is how this improves the SERPs."
"We have no problems with getting new pages listed."
A while back, maybe even a year or so ago, Google did something to dampen the PR of some of us with older sites (I've had one of mine since 1996). I guess the thinking was that they were old and stale and unfairly holding on to old links. Looks like Google is going in the other direction now.
Is it a massive number of inbound links from one site (i.e. acquiring footer links) that triggers the sandbox?
Is it a substantial increase in the total number of inbounds (i.e. going from 50 inbounds to 100 inbounds in one month) that triggers it?
Or is it a combination of both? While it doesn't seem to be affecting 100% of people, it is affecting the majority.
Are your links in from "authority" websites? Approximately how many different sites do you have inbounds from?
Do you have any sitewide inbound links?
a> No, they are not.
b> Hundreds, but few are more than PR1 or PR2. They are from blogs (too specific?) -- no, not comment spam.
c> No, no sitewides.
Edit: it's possible that my ranking is due in part to the googlebomb effect.
I'm afraid I just plain don't believe in the sandbox. I have a new site that within two weeks was showing at #6 for a very competitive keyword, and it still shows a month later. The sandbox is a myth, imho.
I think you probably should have said that you don't believe the sandbox effect is a problem for your site. For most of us, the sandbox effect is a fact of life that we have had to live with for the last few months.
Bear in mind that the sandbox effect is triggered by the algorithm, so you may be lucky enough to have escaped it.
Those people who make one web site after another, one "fake directory" after another, one doorway page after another ... who cloak (the bad kind) ... who use ... hidden links, hidden text and every other trick in the book are pariahs. They refuse to work hard and honestly and follow the rules. They want the fast track to every get rich quick scheme there is.
Actually, they probably work very hard... not that I'm in league with them; I'm just saying it's not easy to pull off large-scale manipulations.
Anyway, either the sandbox rules have changed over the last two months or there DEFINITELY are triggers. One site spent about 10,000 on links per month and had a large amount of content -- a quality site overall -- yet it didn't rank in the top 1,000 for its money term for two months. It ranked pretty poorly for obscure terms too.
A second site got about 15 links, half of them from the owner's own sites on the same IP, and is ranking in the 20s to 300s just two days after the new domain propagated (domain, content, and links all two days old), with only 2 of its 30 pages indexed. The keywords are not very competitive, but it's still quick.
The interesting note is that the first site did not have any penalty; no changes were made, and after two months it shot up to the top 10 for an incredible number of competitive phrases.
Yes, I believe this theory... but has anybody actually had a site in the top rankings and then suddenly gone out on a massive link campaign? If so, how was that site affected? Did it lose or gain rankings, or stay the same?
It may also be designed to disrupt the SEO link sales industry. Imagine trying to price your links now...
Sandbox special on PR9 link! First 3 months for $29.95!
4th month is standard pricing --> $500 per month.
;)