Forum Moderators: open

Message Too Old, No Replies

Has the Sandbox been Abandoned?


phantombookman

8:54 am on Nov 23, 2004 (gmt 0)

10+ Year Member



Sorry to start a new thread but felt it may warrant it.

I have been posting in favour of the Sandbox's existence and I have 2 sites firmly stuck in the sand!

However...
2 weeks ago I registered a brand new domain and started to build a new site. I knew it would be at least 6 months before anything happened but..

This morning it entered the index for the first time - straight onto page one for a one-word search (a town, granted only 194,000 matches) but nonetheless the last 2 sites still cannot achieve similar results after 6 months.

Also, preliminary early pages are ranking very well.
The site has only one incoming link, no AdSense, banners or anything, vanilla HTML etc.

Built as per my last 2 sites so clearly something has changed!
Regards and hope to all
Rod

borisbaloney

10:05 am on Dec 15, 2004 (gmt 0)

10+ Year Member



There is overwhelming evidence that since early this year Google has not, by default, been featuring new sites in their search results.

Incorrect. They have generally not been featuring new sites in the first few pages of competitive SERPs. They are indexed, and that's an important distinction.

BeeDeeDubbleU

10:45 am on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Incorrect. The word "featuring" was carefully chosen.

BeeDeeDubbleU

10:49 am on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Perhaps I should have added that I don't consider position 278, 367, etc. to be featured results ;)

energylevel

11:07 am on Dec 15, 2004 (gmt 0)

10+ Year Member



I think we're getting a little closer to what I believe to be the case. The filter/ranking-lag/sandbox or whatever you care to call it does exist, and it is triggered by most new sites under certain circumstances. It is NOT total exclusion in my experience, so for those who keep saying Google has no room left ..... that throws their theory out of the window, because you're in the search results, just a long way down from where you should be. I've been seeing Google index all pages on small sites, say 20-50 pages, within a few days and list them all, but without any rank worth mentioning for good related phrases.

I'd be interested to hear if anyone has set up a new site on a brand new domain in an industry related to what some refer to as 'money terms' and bypassed the filter for really competitive terms.

I have yet to be convinced that anyone has bypassed the filter on a 'money terms' related site for competitive terms. Maybe a few have got reasonable rank for second or third tier phrases in less competitive areas and think they have dodged the sandbox filter.

I would like to ask anyone who claims to know exactly how to get around the filter for new sites/domains in competitive, lucrative areas: are you using black hat or white hat techniques? I guess they wouldn't admit to anything on a public forum though!

AnonyMouse

11:57 am on Dec 15, 2004 (gmt 0)

10+ Year Member



Certainly concur with the links theory, as you may have read in the thread I started here: [webmasterworld.com...]

Basically, the new site structure held the same rankings in the SERPs when I used 301s for previous links to the old site structure. When I amended the incoming links to reflect the new site structure, bam!, no rankings.

So new pages were fine...new links were not...
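The 301 approach described above can be sketched as a small redirect table: old URLs answer with a permanent redirect to their new locations, so existing inbound links keep resolving. The paths below are invented for illustration, and in practice this would usually be a set of `Redirect 301` rules in an Apache .htaccess file rather than application code.

```python
# Minimal sketch of 301-ing an old site structure to a new one.
# The old/new paths are hypothetical examples.

REDIRECT_MAP = {
    "/old-widgets.html": "/widgets/",
    "/old-about.html": "/about/",
}

def redirect_for(path):
    """Return (status, location) for a requested path: 301 if we know
    the page's new home, 404 otherwise."""
    target = REDIRECT_MAP.get(path)
    return (301, target) if target else (404, None)
```

A request for /old-widgets.html would get a 301 pointing at /widgets/, which is what lets old inbound links keep passing visitors (and, the theory goes, link value) through to the new structure.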

phantombookman

2:44 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



I would take issue with the theory that poor SEO is what lands you in the sandbox.
The only site I have had bypass the sandbox was in fact the least optimised site I have launched!

I am convinced this is why it got through. I don't think there is a specific magic bullet, various things trip the filter.
I think launching a site conservatively and allowing it to initially grow more naturally is the answer.

siteseo

3:49 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



I've never been a believer in the "over-optimization" penalty, but maybe we should be asking ourselves what an SEO-savvy web designer typically does that someone with NO SEO knowledge would NOT typically do. Knowing G, it's probably not ONE thing that trips the "sandbox filter," but rather an accumulation of factors.

What if G assigns a certain weight or value to different on-page optimization factors:
10 pts for keyword in <h1> tag
8 pts for keyword in <h2> tag
4 pts for keyword in description tag
3 pts for keyword in filename
2 pts for each instance of a bolded keyword

...and so on. And if you EXCEED a certain point threshold, you are "sandboxed." G can tell what words a page targets pretty easily (redundancy).

Just a shot in the dark.
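For what it's worth, the point-threshold idea above can be sketched in a few lines. Everything here is hypothetical: the weights are taken from the post and the threshold is invented, so this illustrates the "accumulated on-page factors trip a filter" hypothesis, not anything Google has confirmed.

```python
# Hypothetical over-optimization score, using the weights suggested above.
# The threshold is made up purely for illustration.

ON_PAGE_WEIGHTS = {
    "keyword_in_h1": 10,
    "keyword_in_h2": 8,
    "keyword_in_description": 4,
    "keyword_in_filename": 3,
    "bolded_keyword": 2,  # scored per instance
}

SANDBOX_THRESHOLD = 20  # invented cutoff

def optimization_score(signals):
    """Sum weighted on-page signals; `signals` maps signal name to count."""
    return sum(ON_PAGE_WEIGHTS[name] * count for name, count in signals.items())

def trips_filter(signals, threshold=SANDBOX_THRESHOLD):
    return optimization_score(signals) > threshold

page = {"keyword_in_h1": 1, "keyword_in_h2": 1,
        "keyword_in_description": 1, "bolded_keyword": 3}
# 10 + 8 + 4 + 6 = 28, over the invented threshold of 20
```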

Vlad

3:57 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



My wife launched a website in early April (and may have just gotten under the wire, as someone suggested). The three-keyword phrase she used is highly competitive, yet not a "money" term.

After indexing by G it was #1 for those three keywords, and stayed that way till the next PR update, when it fell to page three. We spent the next two weeks watching it slowly climb back to #1, and then two weeks later it fell back to page 3 after a PR update. Two weeks after that, which would make it June, it was #1 again and has maintained that position ever since.

Since my wife is no SEO expert, I doubt whether we can attribute the success of this website to her ability.
In fact we have no idea why this website remains #1, other than that by June other SERPs had picked it up, and by the time PR came around again (which we dreaded) it had enough links to stay #1.

The interesting thing is, that in June it was #1 out of 5,000,000 results. In October it is #1 out of 10,000,000 results, and when we checked yesterday it is #1 out of 17,000,000 results.

The three keywords are in her url, title tag, meta, and then scattered separately on her page.

Was this site sandboxed? We don't think so, the varied results came from the PR update. But this was 6 months ago and I know things have changed in the serps. The website is 20 pages and she has 5 outgoing links. Incoming links? Very few.

When searchers type in the three keywords together, the website will get pulled up #1 and it is exactly what the surfer is looking for.

MHes

4:06 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>The only site I have had bypass the sandbox was in fact the least optimised site I have launched!

So surely that is the best optimisation?

energylevel

4:19 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



I'm not sure I get this ..... if the site is not optimised for Google, then how would it compete with established optimised sites that don't face the prospect of the sandbox?

Vlad

4:30 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



Fresh content that nobody else has? That is so unique, it can't be scraped?
What we are spending our time racking our brains on, is how to monetize this site. : )

HayMeadows

4:46 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



I'm not sure I get this ..... if the site is not optimised for Google, then how would it compete with established optimised sites that don't face the prospect of the sandbox?

Great question. The question of the day! This one really makes you think.

randle

4:58 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you strictly adhere to these guidelines in creating your site;

[google.com...]

you will end up in the sandbox. (If you take the time to read them, about 80% of “seo” is in there.)

Thus the question becomes: why? If you follow exactly what Google tells you to do, your site will not surface for its intended purpose. That’s what I am struggling with. If people have found a trick to get out (whatever the nature of that technique), it doesn’t really change the fact that this thing exists and is contrary to the very instructions Google gives you.

MHes

5:11 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google says "Following these guidelines will help Google find, index, and rank your site,"

They don't say you will get great rankings, just ranked.

>I'm not sure I get this ..... if the site is not optimised for Google then how would it compete with estabished optimised sites that don't face the prospect of the sandbox?

If phantombookman is now out of sandbox, he must be ranking and getting some traffic. To displace established sites always takes a bit longer... no change there. The question is, are the top sites there because of the same seo tactics that worked in the past, or are they there for new reasons? A site may have tons of h1 and anchor text, but this may not be the reason its number 1.

caveman

5:49 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I have yet to be convinced that anyone has bypassed the filter on a 'money terms' related site for competitive terms.

Wha...wha...wha...what? Still?

AFAIK, Brett & Co get lots of credit for letting these general sandboxing threads go on, and on, and on, and on.

A handful of members have been posting since May/June (albeit reluctantly given the sometimes rude and even nasty responses they get), that this is not a "sandbox" but rather a combination of badass algos and filters. It has also been repeated over and over again that this so called sandbox is not absolute. It does not, and never has, applied to all new sites. It does apply to a far greater percentage of new sites than ever, which all by itself should be very informative. IMO, a variation of it applies to older sites too...or maybe not. Hmmmm...what if it's the same rules? ;-)

Also, there are various ways of getting around it.

Rather than coming here and making posts like, "Please tell me how to crack it" or "I don't believe..." or "...money terms..." or "If people know, someone would have posted the solution...", more posters should get out there and do their own research.

There are sites out there that have cracked the code (including some that probably never heard of SEO), and they come in different shapes and sizes, for money and non-money terms. And they run the gamut from black to white hat. There are multiple ways to show well with any algo. Some innocent, some hard core. Lately, hard core seems to be getting all the attention. :-(

I have no idea if WW would allow it, but new threads on specific aspects of the current algo might be more productive. For example: "How is ageing measured: Site, page, internal/external links, none of the above, or some combination of the above?"

As far as money terms go, forget that. It is probably closer to the truth to think in terms of volume than cash. They can't easily connect cash with searches without violating rules of separation that they've sworn they won't violate. If they violated those rules and were found out, they would lose all credibility, and the press really would have something to talk about.

MHes, your interview with the Sunday Times...very funny. :-)

MHes

6:01 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Face it Mhes, If myself and steveb are on the same page here, its got to be serious.;)

... and always entertaining :)

randle

6:07 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



“They don't say you will get great rankings, just ranked.”

That’s just it though: if you're on the first page for allinanchor, allintext, allintitle, and you are getting a fresh tag applied every two or three days, and yet your main keyword is not in the top 1,000, you're not getting ranked.

DerekH

6:09 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That’s just it though: if you're on the first page for allinanchor, allintext, allintitle, and you are getting a fresh tag applied every two or three days, and yet your main keyword is not in the top 1,000, you're not getting ranked.

Yes you are, you are being ranked as outside the first thousand.
Sorry to be picky over your use of language, but you are ranked because you are deemed to be 1001 or lower in the list. All those above you rank higher. The fact that you can't see below 1000 is a separate issue.
DerekH

steveb

6:12 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Beating the sandbox is the subject for other threads. Optimizing around something is different than issues related to the existence of that something. Beating the sandbox is irrelevant to its existence.

What exists is a barrier for new, quality sites that searchers would objectively want to see when they type in a query, with the example of typing in a famous person's name and not getting a decently optimized four-month-old official site in the top 100 for that name. That is simply rotten search engineering, and there is one whole stream of issues relating to that. If adding 10,000 blog links would kick that official site free of the sandbox, that is just a tactical optimization issue that flows among a very different stream of issues.

The barrier exists. Google's results and the searching public are poorer for it. Tactics to beat it really have nothing to do with that.

caveman

6:31 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Beating the sandbox is the subject for other threads. Optimizing around something is different than issues related to the existence of that something.

Good point.

So OK. The sandbox has *not* been abandoned, because it *never existed.* It's not a thing apart from the previous forms that G took.

The question, "Has the Sandbox been Abandoned?" is like asking, "Has Florida been abandoned?" Well, yes, and no. The current algo is an extension of Florida and all that came after.

Also, since people have been calling it the "sandbox" it has changed a number of times. Most notably in late September...and again just recently. So what is this? Sandbox 3? 4?

This algo is just that. An algo. Anyone who wants a sandbox should go to the playground. ;-)

PS, I agree, it's a dumb algo, but that also is a topic for another thread.

phantombookman

6:49 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



To clarify my point:
I think standard on page optimisation is fine, H1 etc, good internal linking, anchor text etc.
This is something anyone building a site can/may do and still looks 'natural'

What I think can cause the problem is some of the things I have seen posted recently, i.e.

I very quickly got 20,000 pages indexed
I managed to get 1,000 incoming links etc etc

I venture to suggest that Google sees these sorts of things as the territory of the professional, highly optimising webmaster, and regards them as most likely to spam the SERPs.

Whilst there are always exceptions to the above, how many people here could build a site, add 20k worthwhile original pages, and naturally attract 1,000+ links in a few weeks?

Google are using a large hammer to crack a small nut and catching genuine sites, but they have a track record of doing this!
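The hypothesis above — that unnaturally fast growth in pages or inbound links looks to Google like the work of a professional optimiser — can be sketched as a simple rate check. The thresholds below are invented for illustration; nothing here reflects Google's actual filters.

```python
# Hypothetical rate-based heuristic illustrating the idea that very fast
# early growth looks "unnatural". All thresholds are invented.

MAX_NEW_PAGES_PER_WEEK = 500   # invented threshold
MAX_NEW_LINKS_PER_WEEK = 100   # invented threshold

def looks_unnatural(pages_added, links_gained, weeks_live):
    """Flag a site whose average weekly growth exceeds either threshold."""
    weeks = max(weeks_live, 1)  # avoid dividing by zero for brand-new sites
    return (pages_added / weeks > MAX_NEW_PAGES_PER_WEEK
            or links_gained / weeks > MAX_NEW_LINKS_PER_WEEK)
```

Under these made-up numbers, a site with 20,000 pages and 1,000 links after two weeks trips the check, while a site growing a few pages and links per week does not — which is the "large hammer" complaint: a crude rate cutoff inevitably catches some genuine sites too.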

caveman

6:51 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Indeed.

steveb

7:28 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Google are using a large hammer to crack a small nut and catching genuine sites but they have a track record for doing this!"

No, most sandboxed sites don't get huge numbers of links. They don't add lots of pages at once. They don't do anything even mildly seo-y. That is because most sandboxed sites are simply almost all new sites. You don't "do something" to get into the sandbox (aside from put up a site). You can only "do something" to avoid the sandbox.

skunker

7:33 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



So, what's worse? Sandbox or Florida?

energylevel

7:52 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



I have to disagree wholeheartedly on the acquiring-links issue ... I've taken it very easy indeed on link building (example site: 5 to 10 inbounds for the first 4/5 months, then an additional 40 or so good-quality inbounds in month 6), then woke up one day in month 7 to find the site suddenly ranking very well for quite a few money terms. I used H1, H2 and H3 tags sparingly, tried not to be cute with title tags, and did a whole bunch of other things in an effort to NOT do what an SEO would normally do!

I'd repeat what I said before: maybe a combination of factors is more likely.

MHes

9:58 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>with the example of typing in a famous person's name and not getting a decently optimized four month old official site in the top 100 for that name

Yeah right, it must be worth ranking in the top 100 if it's decently optimised.... 'official'? So that's it, I'll go and stick the word 'official' on my sites. Jeeze, this is easy.....

>You don't "do something" to get into the sandbox (aside from put up a site). You can only "do something" to avoid the sandbox.

You're talking drivel. You either rank well or you don't. If you've got a new site, it just got harder, that's all. This myth that new sites are worth ranking highly is nonsense.

steveb

10:26 pm on Dec 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"This myth that new sites are worth ranking highly is nonsense."

Why talk such nonsense? No one says that. In fact, saying something so ridiculous proves that you aren't understanding this topic.

"If you've got a new site, it just got harder, that's all."

Um, did you just flip sides? I personally couldn't care less about how hard it is. That is a tangent. Google is not ranking domains based on quality of content and accuracy of reply to a searcher's query. The tactical issues are just trivia.

WebFusion

12:49 am on Dec 16, 2004 (gmt 0)

10+ Year Member



This myth that new sites are worth ranking highly is nonsense.

WHAT A LOAD.

So, you're telling me that if a highly respected researcher published a paper online that revolutionized "hydrogen fuel cells", and thousands of highly respected educational sites linked to it of their own accord, it would not be worth ranking?

The fact that some have had sites that have been around longer does not necessarily make them better per se, only older. This myth that old sites should receive a boost simply because they are old is nonsense. To truly be RELEVANT, a search engine MUST be able to evaluate every document about every subject area, and rank them by that relevance. Plenty of info online is vastly outdated, but merely retains its high ranking (in Google, at least) due to their (seeming) inability to come up with an algo that filters out the crap while keeping and ranking ALL legit sites.

minnapple

1:29 am on Dec 16, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am not trying to be a G basher.

I owe much of my living to the big G.

However I have sat back and watched this long enough.

I was an early G supporter, as I was an Alta supporter earlier on.

Spread the word for G . . . I did it, just like I promote Firefox today.

However it is deja vu, just change the names and the dates.
[websitepublisher.net...]
[pcworld.com...]

I don't care what anyone says, G has taken its eye off the ball this time.

Will anyone care? Not sure. But when G becomes a big yawn and loses its coolness, what does it really have left?

G was new and chic, but people's perception of what chic is changes when everyone thinks something is chic.

steveb

1:32 am on Dec 16, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"a search engine MUST be able to evaluate every document about every subject area"

Which is why age is a valid algo ingredient, and was in the old Google, where it took a month or so to get a decent handle on true value. But then Google went down the foolish path of "fresh is good", which made no sense on any level. And then Google went down this "new is bad" path, which is almost as foolish.

Sites having to prove themselves is valid, but waiting six or nine months is absurd.

This 338-message thread spans 12 pages.