In reply to a question from Brett Tabke, Matt said that there wasn't a sandbox, but the algorithm might affect some sites, under some circumstances, in a way that a webmaster would perceive as being sandboxed.
So, for some sites, in effect there IS a sandbox.
They have nothing to lose by saying it exists, so why do they keep saying it doesn't?
I think they would have something to lose by admitting it, which is why they don't. If Google admits publicly that new sites (however great the content) don't rank well for up to 6 months or even a year, other engines, especially MSN, can market their SERPs as the most up-to-date. Search is 50% buzz, and Google could tarnish its reputation.
The existence of the sandbox makes perfect sense and is a major deterrent to spam, but however much they try, Google shouldn't be allowed to have it both ways... boast SERPs with less outright spam vis-a-vis other engines (for which the sandbox is a major aid) while also pretending that the sandbox doesn't exist and that their SERPs are as up-to-date as the others' (which they are not). MSN ranks new sites far earlier than the other engines do (but also pays a price for doing so).
As my sites are all commercial in nature, I wonder if there's anyone out there who has built a non-commercial site (no ads, affiliate links or e-commerce) that has been boxed?
I have. The site in question is an informative, Wikipedia-style site with no commerce at all and no ads, and its TOS states that it will never run ads. There are two language versions, on corresponding domains bought at the same time. The language versions are near mirrors, hand-translated, though a few items exist in only one version. The same white-hat SEO was applied to both.
Both versions have always had the same toolbar PR: PR2 after 3 months, PR3 after 6 months, and now PR4. Each site has about 250 pages indexed, all with snippets and no URL-only listings.
One language version failed to rank for the first three months, then suddenly jumped to #4, and has now held #2 for half a year. The other version is still outside the first 1,000 results for its main phrase, though it ranks for a few specific terms and is in the first 100 for allintitle:mainkeyphrase. It's 11 months old now, so I think there's a chance it will be out of the sandbox in a few weeks, or after another 4 months, or longer. :)
Is the English language version the site that is boxed?
I ask because it is very interesting that between two sites that differ only in language, only one is boxed.
How can Google defend boxing such a site, especially since the other made it through?
Best wishes for a quick release.
The site is not commercial: it doesn't sell goods and carries no advertisements. But the targeted keyphrase is commercial, and quite competitive. Other sites that rank for this keyphrase sell the items my site describes, and AdWords ads appear on the results page.
The version that still doesn't rank is the English one, on a .net domain, while the one that started ranking after 3 months is the Polish one, on a .net.pl domain. The versions are almost identical apart from the language and a few language-specific items, and there is a little linking from the Polish version to the English one, such as links to the main, international version of the site and to original-language versions of described items.
One possible explanation is that country-specific domains have a shorter sandbox period than international domains.
But there are other possibilities. Both my sites have PR4, and the domains are only 11 months old. I understand that PR itself doesn't mean much, but it still says something about how many links a site has; even if a PR2 site can outrank a PR4 one, a PR0 obviously can't match a PR8 (unless the difference is just a delay in the toolbar update).
The sites I 'compete' with in Polish usually have lower PR, though the #1 result is a PR3 spammy site using cross-linked subdomains. Among the others are some nationwide online sellers. The Google results count is 3,510,000.
My English-language 'competition' has much higher PR, often PR5 and PR6, including Amazon subcategories and old, respected sites on the subject. The Google results count is 45,200,000.
So there are reasons enough for my English version to fail to rank. Still, I'm surprised it's outside the first 1,000 results.
I'm not growing impatient; I just keep adding content steadily and devising new features for my users, to make the site as good as I can, and sooner or later it will rank. This is a long-term project, so I can wait. Most sites I work with are at least 4 years old, so I can accept that this one has to age a bit before it starts ranking.
If Google admits publicly that new sites (however great the content) don't rank well for up to 6 months or even a year, other engines, especially MSN, can market their SERPs as the most up-to-date.
I am not sure that I can concur with this. MSN could do this right now, and I am sure they could come up with enough evidence to prove it if challenged. The evidence suggests that the search engines are not going to get into "ours is better than theirs" marketing. Why? Well, the fact is that the general public would not have a clue what they were talking about, and they don't give a **** anyway.
If you make a site that is absolutely non-commercial, with no affiliate links and no AdSense, there is a 99% chance there is no sandbox.
Not in my experience. I created two 100 percent non-commercial hobby sites at the same time last year, and both were sandboxed for about nine months. What did they have in common? They were both heavily (white hat) optimised ;)
For my travel site, the site name (a non-competitive two-word term) sat on the 2nd page of the results for the first 11 months, then jumped to No. 1, and other keywords soon followed.
I don't believe this algo for new domains is based on the subject matter of your website, but I do think something within the algo triggers it; what that is, who knows.
Why? Because they subscribed to a paid linking campaign, and their dynamic links were featured in hundreds if not thousands of worthless, pseudo-content pages.
This site doesn't exist on Google. It is sandboxed.
On the down side, I have two sites that have remained buried deep in the pit since April, despite a slow, natural increase in content, pages and inbound links.
The sandbox is a tradeoff. Google gains something at the expense of something else.
I believe that they have made a philosophical decision that clean SERPs are more important than fresh results.
Are they right? Are they wrong?
They may be right.
Spam results are truly irritating for the surfer. They discredit the search engine, and throw doubt on the legitimate results.
It's too early to order that new car I had my eye on, but I certainly have serious doubts now about the sandbox theory.
Congratulations Promis, but could you report back in three or four weeks' time if your site is still ranking? Many of us have witnessed a temporary ranking effect immediately after sites are indexed, but the sites usually bomb after enjoying a week or two of good results.
I think sites that make their debut on the web aggressively are more likely to have the sandbox experience.
Not in my experience, I have launched several sites both "aggressively" and non-aggressively during the last year and a half and they have all been sandboxed.
It is unnatural, both for search engines and for humans, for a new site to have a bunch of other sites linking to it from the very first day of its web presence.
Well, perhaps from the very first day, but anyone planning a website properly should surely have some sort of plan in place to get links from other sites. Isn't that what the Internet is based on? I would say this is very natural behaviour.
It is completely natural for a new site to have a bunch of sites linking to it the day it launches... unless you define "bunch" as hundreds or thousands. The most natural thing is to have a bunch (say ten) of sites linking to a new site when it launches. Sometimes I wonder how some people manage to define "natural". In this case, apparently the owner of a "natural" site has no friends!
Steveb, I agree that some links from friends would not be unnatural. I did specify I have two incoming links that helped it get indexed. Unnatural would be sitewide links from multiple sites, the same class C, etc. I did not aim for more than two links, just to be on the safe side.
Yes, I will report how it progresses because, as I said, I am also still very sceptical about it. It seems to me, though, that Google is giving new sites the chance to prove themselves. By the time it has spidered all the pages and all the incoming links for a new site, things may be different.
What surprises me is that I get even a few top placements over long-established sites that have ranked at the top for the last couple of years. However, I noticed that in those cases visitors need fewer clicks to get the information they requested from my site than from the long-established sites.