Forum Moderators: open
I have been posting in favour of the Sandbox's existence and I have 2 sites firmly stuck in the sand!
However...
2 weeks ago I registered a brand new domain and started to build a new site. I knew it would be at least 6 months before anything happened but...
This morning it entered the index for the first time - straight onto page one for a one-word search (a town, granted only 194,000 matches) - but nonetheless the last 2 sites still cannot achieve similar results after 6 months.
Also, the preliminary early pages are ranking very well.
The site has only one incoming link, no adsense, banners or anything, vanilla html etc.
Built as per my last 2 sites so clearly something has changed!
Regards and hope to all
Rod
I'd be interested to hear if anyone has set up a new site on a brand new domain, in an industry related to what some refer to as 'money terms', and bypassed the filter for really competitive terms.
I have yet to be convinced that anyone has bypassed the filter on a 'money terms' related site for competitive terms. Maybe a few have got reasonable rank for second or third tier phrases in less competitive areas and think they have dodged the sandbox filter.
I would like to ask anyone who claims to know exactly how to get around the filter for new sites/domains in competitive, lucrative areas: are you using black hat or white hat techniques? I guess they wouldn't admit to anything on a public forum though!
Basically, the new site structure held the same rankings in the SERPS when I used 301s for the previous links to the old site structure. When I amended the incoming links to reflect the new site structure, bam!, no rankings.
So new pages were fine...new links were not...
I am convinced this is why it got through. I don't think there is a specific magic bullet, various things trip the filter.
I think launching a site conservatively and allowing it to initially grow more naturally is the answer.
What if G assigns a certain weight or value to different on-page optimization factors:
10 pts for keyword in <h1> tag
8 pts for keyword in <h2> tag
4 pts for keyword in description tag
3 pts for keyword in filename
2 pts for each instance of a bolded keyword
...and so on. And if you EXCEED a certain point threshold, you are "sandboxed." G can tell what words a page targets pretty easily (redundancy).
Just a shot in the dark.
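For what it's worth, the point-threshold speculation above could be sketched in a few lines of code. To be clear, every weight, factor name, and the cutoff here are invented for illustration only - nothing in this sketch reflects anything Google has published:

```python
# Hypothetical sketch of the "exceed a point threshold -> sandboxed" idea.
# All weights and the threshold are made up, per the speculation above.

WEIGHTS = {
    "keyword_in_h1": 10,          # keyword in <h1> tag
    "keyword_in_h2": 8,           # keyword in <h2> tag
    "keyword_in_description": 4,  # keyword in description tag
    "keyword_in_filename": 3,     # keyword in filename
    "keyword_bolded": 2,          # per instance of a bolded keyword
}

SANDBOX_THRESHOLD = 20  # invented cutoff for illustration


def optimization_score(page_factors):
    """Sum the weighted on-page factors for a page.

    page_factors maps a factor name to how many times it occurs.
    """
    return sum(WEIGHTS.get(name, 0) * count
               for name, count in page_factors.items())


def would_be_sandboxed(page_factors):
    """Under this guess, a page over the threshold gets filtered."""
    return optimization_score(page_factors) > SANDBOX_THRESHOLD


# Example: keyword in the <h1>, in the description tag, and bolded
# four times scores 10 + 4 + 2*4 = 22, which trips the made-up cutoff.
page = {"keyword_in_h1": 1, "keyword_in_description": 1, "keyword_bolded": 4}
```

The guess is simply that the filter is additive, so a page can tip over the line through many small factors as easily as through one big one.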
After indexing by G it was #1 for those three keywords, and stayed that way until the next PR update, when it fell to page three. We spent the next two weeks watching it slowly climb back to #1, and then 2 weeks later it fell back to page 3 after a PR update. Two weeks later, which would make it June, it was #1 again and has maintained that position ever since.
Since my wife is no SEO expert, I doubt we can attribute the success of this website to her ability.
In fact we have no idea why this website remains #1 other than by June other serps had picked it up and by the time PR came around again (which we dreaded) it had enough links to stay #1.
The interesting thing is, that in June it was #1 out of 5,000,000 results. In October it is #1 out of 10,000,000 results, and when we checked yesterday it is #1 out of 17,000,000 results.
The three keywords are in her url, title tag, meta, and then scattered separately on her page.
Was this site sandboxed? We don't think so; the varied results came from the PR update. But this was 6 months ago and I know things have changed in the serps. The website is 20 pages and she has 5 outgoing links. Incoming links? Very few.
When searchers type in the three keywords together, the website will get pulled up #1 and it is exactly what the surfer is looking for.
[google.com...]
you will end up in the sandbox. (if you take the time to read them, about 80% of "seo" is in there)
Thus the question becomes why? If you follow exactly what Google tells you to do, your site will not surface for its intended purpose. That’s what I am struggling with. If people have found a trick to get out, (whatever the nature of that technique) it doesn’t really change the fact this thing exists, and is contrary to the very instructions Google gives you.
They don't say you will get great rankings, just ranked.
>I'm not sure I get this ..... if the site is not optimised for Google then how would it compete with established optimised sites that don't face the prospect of the sandbox?
If phantombookman is now out of the sandbox, he must be ranking and getting some traffic. To displace established sites always takes a bit longer... no change there. The question is, are the top sites there because of the same seo tactics that worked in the past, or are they there for new reasons? A site may have tons of h1 and anchor text, but this may not be the reason it's number 1.
Wha...wha...wha...what? Still?
AFAIK, Brett & Co get lots of credit for letting these general sandboxing threads go on, and on, and on, and on.
A handful of members have been posting since May/June (albeit reluctantly, given the sometimes rude and even nasty responses they get) that this is not a "sandbox" but rather a combination of badass algos and filters. It has also been repeated over and over again that this so-called sandbox is not absolute. It does not, and never has, applied to all new sites. It does apply to a far greater percentage of new sites than ever, which all by itself should be very informative. IMO, a variation of it applies to older sites too...or maybe not. Hmmmm...what if it's the same rules? ;-)
Also, there are various ways of getting around it.
Rather than coming here and making posts like, "Please tell me how to crack it" or "I don't believe..." or "...money terms..." or "If people know, someone would have posted the solution...", more posters should get out there and do their own research.
There are sites out there that have cracked the code (including some that probably never heard of SEO), and they come in different shapes and sizes, for money and non-money terms. And they run the gamut from black to white hat. There are multiple ways to show well with any algo. Some innocent, some hard core. Lately, hard core seems to be getting all the attention. :-(
I have no idea if WW would allow it, but new threads on specific aspects of the current algo might be more productive. For example: "How is ageing measured: Site, page, internal/external links, none of the above, or some combination of the above?"
As far as money terms go, forget that. It is perhaps closer to think in terms of volume than cash. They can't easily connect cash with searches without violating rules of separation that they've sworn they won't violate. If they violated those rules and were found out, they would lose all credibility, and the press really would have something to talk about.
MHes, your interview with the Sunday Times...very funny. :-)
That's just it though: if you're first page for allinanchor, allintext, and allintitle, and you are getting a fresh tag applied every 2 or 3 days, and yet your main keyword is not in the top 1,000, you're not getting ranked.
Yes you are, you are being ranked as outside the first thousand.
Sorry to be picky over your use of language, but you are ranked because you are deemed to be 1001 or lower in the list. All those above you rank higher. The fact that you can't see below 1000 is a separate issue.
DerekH
What exists is a barrier for new, quality sites that searchers would objectively want to see when they type in a query - for example, typing in a famous person's name and not getting a decently optimized four-month-old official site in the top 100 for that name. That is simply rotten search engineering, and there is one whole stream of issues relating to that. If adding 10,000 blog links would kick that official site free of the sandbox, that is just a tactical optimization issue that belongs to a very different stream of issues.
The barrier exists. Google's results and the searching public are poorer for it. Tactics to beat it really have nothing to do with that.
Good point.
So OK. The sandbox has *not* been abandoned, because it *never existed.* It's not a thing apart from the previous forms that G took.
The question, "Has the Sandbox been Abandoned?" is like asking, "Has Florida been abandoned?" Well, yes, and no. The current algo is an extension of Florida and all that came after.
Also, since people have been calling it the "sandbox" it has changed a number of times. Most notably in late September...and again just recently. So what is this? Sandbox 3? 4?
This algo is just that. An algo. Anyone who wants a sandbox should go to the playground. ;-)
PS, I agree, it's a dumb algo, but that also is a topic for another thread.
What I think can cause the problem is some of the things I have seen posted recently, i.e.
I very quickly got 20,000 pages indexed
I managed to get 1,000 incoming links etc etc
I venture to suggest that Google sees these sorts of things as the territory of the professional, highly optimising webmaster, and that they regard them as most likely to spam the serps.
Whilst there are always exceptions to the above how many people here could build a site, add 20k worthwhile original pages and naturally attract 1000+ links in a few weeks?
Google are using a large hammer to crack a small nut and catching genuine sites in the process, but they have a track record for doing this!
No, most sandboxed sites don't get huge numbers of links. They don't add lots of pages at once. They don't do anything even mildly seo-y. That is because most sandboxed sites are simply almost all new sites. You don't "do something" to get into the sandbox (aside from put up a site). You can only "do something" to avoid the sandbox.
I'd repeat what I said before that maybe a combination of factors is more likely
Yeah right, it must be worth ranking in the top 100 if it's decently optimised.... 'official'? So that's it, I'll go and stick the word official on my sites. Jeeze, this is easy.....
>You don't "do something" to get into the sandbox (aside from put up a site). You can only "do something" to avoid the sandbox.
You're talking drivel. You either rank well or you don't. If you've got a new site, it just got harder, that's all. This myth that new sites are worth ranking highly is nonsense.
Why talk such nonsense? No one says that. In fact, saying something so ridiculous proves that you aren't understanding this topic.
"If you've got a new site, it just got harder, that's all."
Um, did you just flip sides? I personally couldn't care less about how hard it is. That is a tangent. Google is not ranking domains based on quality of content and accuracy of reply to a searchers query. The tactical issues are just trivia.
This myth that new sites are worth ranking highly is nonsense.
WHAT A LOAD.
So, you're telling me if a highly respected researcher published a paper online that revolutionized "hydrogen fuel cells", and thousands of highly respected educational sites linked to it of their own accord, it would not be worth ranking?
The fact that some have had sites that have been around longer does not necessarily make them better per se, only older. This myth that old sites should receive a boost simply because they are old sites is nonsense. To truly be RELEVANT, a search engine MUST be able to evaluate every document about every subject area, and rank them by that relevance. Plenty of info online is vastly outdated, but merely retains its high ranking (in google, at least) due to Google's (seeming) inability to come up with an algo that filters out the crap while keeping and ranking ALL legit sites.
I owe much of my living to the big G.
However I have sat back and watched this long enough.
I was an early G supporter, as I was an Alta supporter earlier on.
Spread the word for G . . . I did it, just like I promote FireFox today.
However it is deja vu, just change the names and the dates.
[websitepublisher.net...]
[pcworld.com...]
I don't care what anyone says, G has taken its eye off the ball this time.
Will anyone care? Not sure. But when G becomes a big yawn, and loses its coolness, what does it really have left?
G was new and chic, but people's perception of what is chic changes when everyone thinks something is chic.
Which is why age is a valid algo ingredient, and was in the old Google, where it took a month or so to get a decent handle on true value. But then Google went down the fool's path of "fresh is good", which made no sense on any level. And then Google went down this "new is bad" path, which is almost as foolish.
Sites having to prove themselves is valid, but waiting six or nine months is absurd.