If your site is less than a year old you are likely sandboxed.
I can't believe most sites under a year old are in some sort of penalty box; Google would be useless if they were. So, I want to know:
1. Are all sites sandboxed, or do certain traits (like affiliate links, low content) trigger it?
2. How long does it last?
3. How variable is the duration?
4. How do you know your site is being sandboxed?
5. Does the effect taper off or is it a binary thing?
6. What gets you out of the sandbox? Is it merely time or do good links or whatever speed it up?
Thanks.
From my own small sample of sites, I can tell you that both my .biz domains created in the last six months are doing well in Google SERPs, but all my .co.uk, .com, and .net sites are nowhere.
I seem to remember hearing this somewhere else on WebmasterWorld recently too...
I'm sure someone can knock down this theory pretty soon if I am wrong.
The so-called 'sandbox' exists and does discriminate against a lot of new sites (maybe as a side effect of something intended to catch spam sites), particularly in commercial sectors... I've seen non-SEO'd sites get caught many times now. It all depends on how you are measuring what has or has NOT been sandboxed; that is what causes a lot of the confusion, along with the inaccurate 'sandbox' tag it has been given. (A rough sketch of one way to measure it follows after the lists below.)
Sandboxed sites do
1) get crawled by Googlebot and indexed (on a regular basis).
2) get listed in search results (but very poorly, particularly for competitive words/phrases).
3) eventually get listed in a similar, more expected fashion, on a par with older established sites (approx. 6-12 months).
Sandboxed sites do NOT
1) get totally banned from the Google index
IMHO, of course, based on observations of many sites!
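For what it's worth, here is a minimal sketch of the kind of measurement I mean. It assumes you already have some way of looking up where a domain ranks for a given query; the serp_position helper below is a hypothetical stub, so wire it up to whatever rank-checking method you actually use, and the unique phrase / competitive phrases in the example are made up.

    from typing import List, Optional

    NOT_FOUND_CUTOFF = 100  # assumption: "nowhere" means not in the first 100 results

    def serp_position(query: str, domain: str) -> Optional[int]:
        # Hypothetical helper: return the position (1 = top) of the first result
        # from `domain` for `query`, or None if the domain is not found at all.
        raise NotImplementedError("wire this up to your own rank-checking method")

    def diagnose(domain: str, unique_phrase: str, competitive_phrases: List[str]) -> str:
        # 1) Indexed at all? Check a phrase that only this site should match.
        if serp_position(unique_phrase, domain) is None:
            return "not indexed (or banned) - not the 'sandbox' effect"

        # 2) Indexed, but how does it do on the phrases it actually targets?
        positions = [serp_position(q, domain) for q in competitive_phrases]
        if all(p is None or p > NOT_FOUND_CUTOFF for p in positions):
            return "indexed but nowhere for competitive phrases - looks 'sandboxed'"

        return "indexed and ranking for at least one competitive phrase - not 'sandboxed'"

    # Example call (domain and phrases are made up):
    # print(diagnose("example-widgets.co.uk",
    #                '"welcome to example widgets ltd"',
    #                ["cheap widgets", "buy widgets online"]))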
What if the answer was simpler than all of that?
Could it be that, with the billions and billions of pages they have indexed (far more than other search engines), they simply don't have the computing power to process all of those pages with such a complex algorithm?
In this scenario, new sites would be indexed and then have their content parsed by a "preliminary" or broad-based algo. Over time, these sites would be integrated into the main algorithm(s) across all of the data centers.
Even with a vast network of PCs, I can't imagine how much time it would take to process 8+ billion pages through an algorithm with hundreds (if not thousands) of variables.
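Whether that is plausible really comes down to what you assume about per-page cost and hardware. Here is a back-of-the-envelope calculator; every number in it (index size, seconds per page, machine count) is an assumption for illustration, not anything Google has published.

    # Back-of-the-envelope: how long would one full scoring pass over the index take?
    # ALL numbers below are assumptions for illustration, not Google's real figures.

    def full_pass_days(pages, seconds_per_page, machines):
        """Wall-clock days to score `pages` at `seconds_per_page` each across `machines`."""
        total_cpu_seconds = pages * seconds_per_page
        return total_cpu_seconds / machines / 86_400  # 86,400 seconds per day

    # Cheap per-page cost, lots of machines: a full pass finishes in about an hour.
    print(full_pass_days(pages=8e9, seconds_per_page=0.05, machines=100_000))  # ~0.05 days

    # Expensive per-page cost (hundreds of variables, link-graph lookups), fewer machines:
    # the same pass stretches into weeks.
    print(full_pass_days(pages=8e9, seconds_per_page=2.0, machines=10_000))    # ~18.5 days

The answer swings from hours to weeks depending on the assumptions, which is why a "they can't possibly have the computing power" argument is hard to settle either way.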