In reply to a question from Brett Tabke, Matt said that there wasn't a sandbox, but the algorithm might affect some sites, under some circumstances, in a way that a webmaster would perceive as being sandboxed.
So, for some sites, in effect there IS a sandbox.
However, despite this agreement, I've seen sites come out of the sandbox with no AdWords at all; it's just something that happens. beedee, if I recall, has also seen this, as have others. One day you aren't ranking except for allin searches; the next day your traffic is up.
One possibility is simply Google tracking user behavior via click-throughs on links on google.com; an AdWords link may count in that game. I hadn't thought of that, but there's no reason why not: the searcher clicks on your AdWords ad, stays on your site, and Google decides your site is legitimate and relevant after all, despite having no real backlinks.
Your issue is lack of quality backlinks, pure and simple. Maybe adwords helped, maybe it didn't, but if you have no quality backlinks, google doesn't consider your site to be quality, that's very simple.
Authority or hub status only comes from authority or hub links, trustrank is the game, in competitive terms, get it and rank. Of course, now we'll have to sit back and watch all those guys whose sites don't have trustrank tell us it doesn't exist because they don't have it....
I removed the lines from the script, filed a reinclusion request, and explained as Matt suggested in his blog. He says that if the rankings fall drastically, one might have a spam penalty, and that can be removed manually. I'm hoping that things get solved soon; I have been hit hard financially, and this is supposed to be the best quarter. Being stupid does have consequences :)
Thank you for coming in here and talking sense. I generally lurk, too, and you have given me hope that there is someone sensible to discuss this with.
Graphs, norms, standard deviations. Perhaps different slopes and intercepts for different themes, sectors and regions. Whhhoooooossshhhhh. Just lost a lot of eyes, hmmm? Sorry, I will try to stay away from explaining my theories with integral equations.
Since I don't have anywhere close to the statistical sampling that G has, I have to just "visualize" the multiple variables of the algo (code, links, content, user behavior, age, on and on). The graphs in my mind are actually three-dimensional; they do not have a constant slope over time (initially a high slope, flattening over time), and the standard deviation widens with time (age). What I visualize in 3D (they don't call me cws3di for nothing) ends up looking like what many people would call a cornucopia.
So, at the beginning of your domain's life, there is a very narrow margin that your site must fit in (the tiny space at the butt of the cornucopia). If your site has a long history, it tends to stay within the margins more easily because of the big bell shape it can move around in. However, as we have all seen, even older sites can change content, lose links, or run afoul of new algo updates, and then fall out of the cornucopia norm.
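For anyone who'd rather see the cornucopia in code than in 3D, here is a toy sketch of that widening band. Every number in it is made up for illustration (the base width, the growth rate, and the logarithmic shape are my assumptions, not anything known about the algo):

```python
import math

# Toy model of the "cornucopia": the allowed deviation from the norm
# is narrow for a new domain and widens as the domain ages.
def allowed_deviation(age_months, base=0.5, growth=0.4):
    # Narrow at the butt of the cornucopia, widening roughly
    # logarithmically with age. Constants are invented.
    return base + growth * math.log1p(age_months)

def inside_the_norm(deviation_from_norm, age_months):
    return abs(deviation_from_norm) <= allowed_deviation(age_months)

# The same swing off the norm sinks a 1-month-old site
# but is tolerated on a 5-year-old one.
print(inside_the_norm(1.2, age_months=1))   # False
print(inside_the_norm(1.2, age_months=60))  # True
```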
It is also understandable that G would not ever have conceived of a term like "sandbox", since that sort of brings to mind cat poop, but a cornucopia is full of fruit ;-)
Thanks again Sparkys_Dad, for inspiring some courage in me to post this.
Authority or hub status only comes from authority or hub links, trustrank is the game, in competitive terms, get it and rank.
Bingo. It's all about the patent. It's there to give us clues as to what is effective, and much of it is already in use.
Bottom line is anyone can call it what they want, there is a penalty, discounting or sandbox - pick your term - on new sites for what are perceived to be heavy competitive terms. This could be evaluated by Google via click-throughs, AdWords, or a myriad of other methods to determine how competitive a phrase is...
One thing I have done that has seemed to really help me is to register domains long before I intend to use them. I have domains for many of my genres that are over 2 years old. Long before I use the actual domain I create some useful content on the site and get it indexed. I make sure it's not a ridiculous domain as well - like this-is-my-ridiculously-long-gift-site-domain.com
Take the time to scour and do some real inventive thinking. Still lots of great domain names to go around that you would be surprised to get...
I have found that domains over 2 years are indexed almost instantly in Google with about 5-10 inbound links. Domains recently registered are taking over one month at times. This could confirm the weight of longer term domain registration.
Food for thought...
Welcome aboard. Excellent post and I could not agree more.
I have a little sandbox anecdote for you from an earlier post that you might enjoy. I think it's funny, which tells you what kind of a person I am--awful.
Last year I rebuilt an old site that had always done well in MSN (#8) and Yahoo (#3) for its main search term, but could not seem to grab a single top ten in Google. I sold the rebuild (nearly 10G) based on my experience with achieving top tens in Google. The site was for an owner-operated brick and mortar retail concern that wanted to expand its customer base and add e-commerce. Before the rebuild it was about 30 pages, but after adding individual product pages and e-commerce it ballooned up to nearly 300.
As soon as the rebuild is launched it goes right into the sandbox. A complete faceplant. The Google rankings go from bad to worse. The guy is livid and repeatedly calls me to discuss what we should do. I tell him to do nothing. Needless to say he is not thrilled with that response.
As he is now out of a lot of money with little to show for it he stops calling me. I half expected to be served with a lawsuit. I continue to monitor his rankings and sales--both abysmal.
Sure enough, about five very long months later he's suddenly getting top 5 Google rankings for his main terms. His business goes through the roof. From grabbing his ankles to king of the world overnight! The crazy thing is that we made no changes after I did the rebuild. Nor did we add any inbound links.
Here's the best part--prior to my rebuild, he had accumulated more than two hundred completely off-topic links from a friend who owned a huge number of domains. I warned him that those links could eventually turn into liabilities, but as Google was counting them all as backlinks I did not insist that he remove them. I thought it best just to warn him.
Sure enough Jagger shows up and he's nowhere to be found again. Gone, as in outta here. Back to grabbing his ankles. Three weeks later he sells the business.
Since then the new owner has called me to see what we can do about the rankings. I sent him a proposal, but he has yet to commit with a deposit.
BeeDeeDubbleU, a couple of months ago you and I had an argument about whether the sb was a feature or a bug, remember? Now, according to MC it is a feature of the algo, deliberately implemented. Yet you keep bragging about being Mr. Sandbox, having known it from the beginning and such. I wish you wouldn't have such a short memory ...
No, I don't remember this; perhaps you could point me to the thread? Whether it is a bug or a feature is surely not the point now, and where did this "bragging" and "Mr. Sandbox" nonsense come from? I have nothing to "brag" about because I have not yet managed to beat the sandbox. What have you been smoking?
Contributions like this add nothing to the argument. I will admit that I get a little upset myself when people tell me there is no sandbox because I know from personal experience that there is. If I have a headache and someone tells me that I don't have a headache am I supposed to believe them? "Oh yeah! OK. I don't have a headache. Can you please pass the aspirin?"
The sandbox is not a good thing. As someone has already said it is an admission of failure on Google's part.
Cws3di, Sparkys_DAD and CainIV I am glad to see that you and several others see this in much the same light as I do. It gets so frustrating when people tell me that the nose on my face does not exist.
Bottom line is anyone can call it what they want, there is a penalty, discounting or sandbox - pick your term - on new sites for what are perceived to be heavy competitive terms.
I think the emphasis must be on the word "perceived" here. As I have already said, two of my earlier sandboxed sites are probably not in a competitive category. Having said that, if Google are satisfied with the collateral damage caused by the sandbox as a whole, the risk of wrongly categorising competitive/non-competitive sites is probably acceptable to them.
>>>can do a search for Web design country and I am in the number two position.<<<
Are you serious? I'm certainly not knocking your abilities, just your chosen example. Surely you must have a better example than that to give.
How many people come to your web site searching for 'web design country'? Overture shows 36 total searches...
It could be web design chile or web design canada or web design sri lanka or web design malta - see? Get it now?
Hope that helps. Think next time, OK?
"I was at the Q&A and listened to Matt's response. The part that I thought was interesting was that Matt said when they (Google) first started hearing about the "sandbox" as the term is used by webmasters they had to look at their algo to see what was causing it and then look at the sites it was affecting. Once they studied it, they decided they liked what it was doing." - Idaho
According to this statement the Sandbox effect is an unintended feature. I would conclude it is a bug.
I wouldn't conclude it's a bug. I'm sure the algo is complex and very deliberate. What I think Matt communicated, and what I tried to report, was that when they started noticing webmasters describing a certain effect of the algo (which we were calling the "sandbox"), they had to figure out what we were talking about and then identify what part(s) of their algo was causing it. (Judging from this thread, the hardest part of the above was just trying to figure out what we were talking about.)
I believe the term “sandbox” was originally used to describe the phenomenon of some new sites not ranking well until after a certain period of time had lapsed. Now, if we keep this narrow definition, it might be possible to identify a part of the algo that causes it. For example it could be that Google crafted a part of the algo to not give credit for certain “unnatural links” until those links had matured for a certain period of time. Eventually after several months, these links start counting and the site pops out of the sandbox. With a narrow enough definition of “sandbox” and with enough empirical evidence, we should be able to understand it.
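That link-maturing idea is easy to sketch. This is purely hypothetical - the 180-day window and the linear ramp are invented for illustration, not anything Google has confirmed:

```python
# Hypothetical sketch of "links don't count until they have matured".
# The maturity window and ramp shape are invented.
def effective_link_weight(link_age_days, raw_weight=1.0, maturity_days=180):
    """Ramp a link's weight from 0 up to its raw weight over a maturity window."""
    if link_age_days <= 0:
        return 0.0
    return raw_weight * min(1.0, link_age_days / maturity_days)

# Fifty brand-new links contribute almost nothing; the same fifty links
# contribute their full weight once they have aged past the window.
new_site = sum(effective_link_weight(age) for age in [10] * 50)
old_site = sum(effective_link_weight(age) for age in [400] * 50)
print(new_site, old_site)
```

Under a rule like this, the site "pops out" not because anything changed, but because the clock ran out on the discount.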
Unfortunately, the biggest problem in getting to the bottom of it is that when we try to discuss it, someone else broadens the definition of sandbox by saying that older, more established sites can also go into the sandbox; (even though this is likely caused by an entirely different part of the algo.) Someone else notices some other penalty and attributes it to the “sandbox.”
Now all of a sudden, the term “sandbox” is much broader and harder to define because there is simply too much conflicting evidence.
The first litmus test is that it MUST be ranking well on Yahoo and MSN and very poorly on Google.
If it is not ranking well on MSN and Yahoo, why should you expect it to rank well on Google? Further, if it is not ranking well on any of the big three, you need to bring your expectations into line with your abilities.
If it passes the first test, and you are not employing spammy techniques, and it is a new site you are in the box. End of story. And you will come out when the age and site characteristics fall into Google's model.
The second test applies to established sites only. Your Google rankings may be good, average or poor, but you do appear in the SERPs. Suddenly your rankings drop precipitously, but you are still in the index.
If you have recently added a substantial number of new pages OR inbound links in a very short period of time, you MAY be in the box. OR you may have incurred a penalty for what Google perceives as spammy tactics.
How do you know the difference? You don't.
If the site later reassumes or betters its previous Google rankings without substantive changes on your part, you were in the box until enough time had passed that the new pages and/or links could be attributed to natural growth and you once again fell into Google's model.
In other words, if you add 300 pages overnight, Google says, "Back off, seo dude. 300 new pages should take 6 months to accumulate. You must be trying to pull a fast one. Please step into the box, we'll talk again in 6 months."
Examples of situations where you are NOT in the box:
1. You are ranking on Google and a new algo--like Jagger--rolls into town and suddenly you are nowhere to be found in the rankings, but still in the index. You are not boxed; you have been penalized for what Google has suddenly deemed spammy tactics. You'd be better off in the box, because this penalty will not lift unless Google changes its algo or you remove the offending tactic--if you can figure out what it is.
2. You are dissatisfied with your Google rankings, so you try some dodgy technique to improve them. Suddenly you are nowhere to be found in the SERPs, but still in the index. You are not boxed; you have been penalized for what Google deems spammy tactics. Once again you'd be better off in the box, because this penalty will not lift unless Google changes its algo or you remove the offending tactic.
If you don't have to do anything but wait to alleviate the penalty--YOU ARE IN THE BOX.
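The tests above boil down to a simple decision procedure. This is just a restatement of this post as code - the labels and rules come from the post itself, not from Google:

```python
# Decision procedure restating the litmus tests above. All rules are
# the poster's heuristics, not anything published by Google.
def diagnose(new_site, ranks_on_msn_yahoo, ranks_on_google,
             spammy_tactics, sudden_drop_after_growth_spurt):
    if not ranks_on_msn_yahoo:
        return "adjust your expectations"      # fails the first litmus test
    if new_site and not ranks_on_google and not spammy_tactics:
        return "in the box"                    # wait it out
    if spammy_tactics:
        return "penalized"                     # must remove the tactic
    if sudden_drop_after_growth_spurt:
        return "maybe boxed, maybe penalized"  # you can't tell yet
    return "neither"

print(diagnose(new_site=True, ranks_on_msn_yahoo=True,
               ranks_on_google=False, spammy_tactics=False,
               sudden_drop_after_growth_spurt=False))  # in the box
```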
Hope that helps somebody out there.
Who knows? Who could possibly know? The Googletechs will, no doubt, test to find out what actually happens, rather than polling any number of people.
Proof there is no sandbox....
>"....and I now get 10-15 referals per day from google"
Why do we bother posting?!
It's interesting that the effect of some algos is now acknowledged by Google as having what we have labelled a 'sandbox' effect. I'm sure it's a mix of links taking time to mature, and those links having to be quality and provide a Hilltop influence for a search term. Searches where Hilltop is not applied (uncompetitive searches) will allow a new site to rank well for one or two phrases, although it baffles me when a completely unique word relevant to a site fails to bring that new site up. People often think a 'competitive search' is one where the number of results is high, but this is not the case. When a new site ranks high for such a term, it yields little traffic..... usually about "10-15 referals per day".
There is no quick way around the 'sandbox'. You just have to wait until the right boxes have been ticked by Google and they allow you to rank well for many search phrases. This is done by acquiring good natural links, good SEO, and a lot of patience.
"While I agree that the term is overused, I don't think it applies solely to new sites."
Yes it does. The "sandbox" only applies to new sites. New sites tend to have a time lag applied to them before they are allowed to rank normally.
Anything having to do with older sites is something else and off topic of the "sandbox".
Now I don't expect that saying this will do any good. There are still going to be people who will say:
sandbox = ranking poorly. But that is just useless smoke blowing.
If you want to talk about established sites ranking badly, for heaven's sake invent a new name instead of hijacking sandbox threads trying to reinvent words.
We could add more data to this list if we knew it and needed it, but I feel we don't need to.
Now, we only have to combine in valid ways two or more of the premises above, or the results of previous combinations, until we reach a conclusion that answers our question.
4.1 + 4.2 + 5 = Sandbox affects sites that pass from nothing to competitive in a short time. We'll name this statement 'A'.
A + 3 = The 'sandbox' effect is something else but not 'sandbox' OR google is lying. This will be 'B'
A + 2 = The 'sandbox' effect, whatever it is, is an answer or part of an answer from Google to a non-trusted (this is: most probably 'abusive' SEO) improvement of rankings. 'C'
C + B = 'sandbox' is a collateral effect of Google's anti-spam algorithm, OR Google is lying. 'D'
It's time to add a bit of data:
6.- Google gains nothing by denying the existence of the 'sandbox': no matter what it's named, we already know that it happens, so there is no reason to lie about it.
6 => Google is most probably not lying.
D + 6 = Most probably, 'sandbox' is a collateral effect of Google's anti-spam algorithms. This is reinforced by statement 5.
We cannot reach a 100% trustworthy conclusion unless we know whether Google is lying or not, which is why the last statement already includes the 'most probably' qualifier.
I completely believe it, and my own experiences in SEO tend to confirm it: my 'sandboxed' sites got out of the sandbox just by slowing down the SEO work, and I've completely avoided the sandbox by starting SEO a bit slowly and increasing efforts gradually.
But I guess this 'formula' for avoiding the sandbox will be much more interesting than the complete logical analysis of the hated sandbox.
Hoping this is useful,
Herenvardö, Happy Hippie Heviata a.k.a. H4
BINGO .. on the mark!
But the reality is .. instead of thinking about slowing down the SEO development curves...make sure you are increasing the customer centric initiatives (focus on your customer base) .. and you will see a better chance to avoid the "sandbox"...
The more focused you are on your customer initiatives (be it content for adsense revenue or content for products/services) .. the better chance you have of being in the SERPs early on in a new site's life cycle and then establishing yourself over the long haul (unless you are a quick-flipper)
One thing that I've never seen mentioned in these discussions is the fact that nearly everyone who ever talks about the sandbox is also in the business of constantly adding new web sites. Wouldn't the act of constantly creating new web sites possibly be a big signal that these web sites are spam sites?
The only argument that I can imagine against this would be that Google would not know when a new web site is one of a series of new web sites that a person/company is creating. My answer to that would be that I believe nearly all of us use the G toolbar, and I would think that it would be very easy for Google to take the surfing patterns indicated by the use of the toolbar to tie a series of web sites together, and thus determine a pattern of when too many web sites are added too rapidly. Think about it: it is often mentioned that after a sandboxed site is left alone for a period of time, it comes out of the sandbox. This also means that the sandboxed web site is not being visited as often by the web surfer (the owner) with that particular G toolbar installed.
This could not be the only part of the algo as many of us have had new sites that were sandboxed while creating other new sites that have escaped the sandbox. But this could be an element of it.
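If you wanted to sketch how toolbar data could tie a batch of new sites to one habitual visitor, it might look something like this. Pure speculation - the data shape, the age cutoff, and the threshold are all invented:

```python
from collections import defaultdict

# Speculative sketch: group brand-new sites by the toolbar user who
# visits them; a user tied to many very young sites gets flagged.
# Field names and thresholds are invented for illustration.
def flag_suspicious_owners(visits, launch_month, max_new_sites=3):
    """visits: list of (toolbar_user, site, site_age_months) tuples."""
    new_sites_by_user = defaultdict(set)
    for user, site, age_months in visits:
        if age_months <= launch_month:
            new_sites_by_user[user].add(site)
    return {user: sites for user, sites in new_sites_by_user.items()
            if len(sites) > max_new_sites}

# One user habitually visiting five month-old sites stands out;
# a user on a four-year-old site does not.
visits = [("u1", f"site{i}.com", 1) for i in range(5)] + [("u2", "old.com", 48)]
print(sorted(flag_suspicious_owners(visits, launch_month=2)))  # ['u1']
```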
Just a thought....
Two are now out and the other celebrates its first birthday tomorrow, and it's still in the box. It has a domain name like keyword1-keyword2.com and it is heavily optimised for these terms. I have tried backing off with the SEO recently to see if it makes any difference, but as yet to no avail. It gets no Google traffic, but I have noticed that it has started to get found for the occasional image search, so I am hoping this is a precursor to release.
Note that none of these sites were launched with any more than two or three links. In my case I just cannot see the problem being link related.
[edited by: energylevel at 10:41 pm (utc) on Nov. 20, 2005]
According to ****pedia:
US military slang: "term often used in an attempt to speak somewhat jocularly or euphemistically about the desert region in which daily life for US military members can be very uncomfortable."
In a metaphorical sense: "a place safe for play or experimentation."
Computer security: "a container in which untrusted programs can be safely run."
I thought it was maybe US golfing slang for "bunker" but more likely it refers to when for no apparent reason (except to Google) a new website fails to appear in their index for a prolonged period.
1) The Sand Trap only affects NEW sites.
2) MOST new sites are forced to spend time in the Sand Trap before entering the SERPs.
3) All sites eventually escape the Sand Trap and enter the SERPs.
4) Some sites escape the Sand Trap and enter the SERPs quicker than others.