Forum Moderators: open


Sandboxed Sites - Back Together?

Do they come out together or one by one?

         

McMohan

10:09 am on Nov 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the new sites that I work with are still in the sandbox. I was just curious to know: do all the sandboxed sites come out of the sandbox during one fine major update, or one by one over the rolling updates?

That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)

Thanks

Mc

internetheaven

2:37 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Are sites that are currently "sandboxed" able to change their on page content and get out?

They're able to do quite a few things to "get out" (or in other words, stop being ranked so low).

conroy

2:39 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



Any hints? Is it only on-page?

BeeDeeDubbleU

3:39 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think it is unrealistic to assume that Google is "flawed" and that if this "effect" was a mistake that they wouldn't simply switch the algorithm back.

I don't think you quite got my point. Sometimes mistakes or defects are not easily rectified. Perhaps they did not "switch" anything. Perhaps it's just plain and simply broken?

If indeed you do have the answer then well done! My only comment would be to wonder why you are wasting time back here gloating. If I knew for sure that I was the only one who had the answer, I doubt that I would be shouting it from the rooftops. I think I would be far too busy to be wasting time with that.

So what are your motives? Unless I am missing something you don't seem to be here to offer any help.

If they were delivering results as bad as what you are saying then surely no-one would be using them anymore?

Joe Public does not know the results are bad ;)

[edit]
I forgot to add that you did not answer my question ...

"If Google had developed some new algo formula to weed out spam why would they apply it to new sites only? Clearly they would apply it to all newly found pages. It would not make sense to allow existing spammers to carry on regardless while penalising all new and legitimate sites. Not when they could prevent it."

mark1615

4:51 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



internetheaven - I really want what you say to be true but some of our experience undermines what you say:

1) We use the same techniques on sites new and old. On old sites we can get new pages with significant competition into the top 10 rather quickly, under a month more often than not. On new sites doing the same, but often with even more links, we are not in the top 1,000. This has led us to wonder about the value of new links.

2) (Being devil's advocate here - no disrespect meant) You note that the believers in the sandbox theory just don't like G's results. The flip side of this is that you don't believe in it because you do like the results.

mark1615

5:01 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



Sorry - I got cut off there.

Questions:

How replicable is this process/technique?
Are you optimizing for highly competitive kws (2MM+)?
If it is that easy, why are you talking to us and not counting your money? ;)
Is the technique on-page or off-page?

Again, no disrespect at all meant so don't take anything badly - just very interested in this subject.

petehall

5:50 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



fjpapaleo:
I have a pr6 site with over 60,000 pages "indexed" since May. Plenty of back-links. Lots of content, anchor text and all "white hat". Trust me, there's a sandbox. Or more accurately, a supplemental index.

I think you'll find the supplemental index contains either pages that no longer exist (I see no point to this in all honesty) or pages with very low internal PageRank being passed to them.

If your site is a PR6 and you've tried to split that PR quickly over 60,000 pages, I think you may fit into the latter category.

You'll also find it incredibly difficult, if not impossible, to resolve this issue.

Google simply won't let go of our 3,000+ supplemental pages in their index - and I have now reduced the site to about 300 pages, which makes this annoying.

I am sure we incur some sort of penalty from having so many supplemental pages, as that particular site has never recovered despite the huge amount of time I've spent on it (it is also a PR6).

lizardx

6:12 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



"there is something built into the pages that are holding them back"

This isn't true from what I can see, although if you're looking to point people in the wrong direction that's a good thing to say. Are you sure you aren't doing your clients a big favor by budgeting the purchase of pre-existing domain names into the projects, thus magically evading the sandbox?

bignet

6:17 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



If they were delivering results as bad as what you are saying then surely no-one would be using them anymore?

Maybe bad, but not as bad. And would everyone switch overnight?

Small Website Guy

6:26 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



In August I created a site that has the title "kw1 kw2 noun". Naturally, I created links to it from all my other sites, then I sought links from other sites. According to Google I have 24 sites linking to me (found by searching for the term "kw1 kw2 noun", which is unique to my site).

The site now has PageRank 5.

What's pathetic is that not only does my site not come up if you search for kw1 kw2, but if you search for "kw1 kw2 noun" my site is listed at number seven. There are six sites listed above it, all of which have a link to my site.

I have zero visibility in ALL search engines.

Theories:

(1) Sandbox
(2) Over-aggressive links from the sites I control (a link on every page) tripped a filter which causes the site to be penalized.

Course of action:

I have no idea what to do. This is a real shame because I was sure that if the site got free search engine traffic, it could be the first site I've created that would actually make decent money.

I've stopped bothering with the site. Why spend a lot of time programming a site that no one is ever going to see? I guess this is the strategy that They have: make webmasters give up.

gomer

7:47 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



What's pathetic is that not only does my site not come up if you search for kw1 kw2, if you search for "kw1 kw2 noun" my site is listed at number seven. There are six sites listed above that, all of which have a link to my site.
...
I have no idea what to do.

What you are describing is very similar to what happened to a site of mine. We acquired on-topic directory listings. When we searched for our site name, the directory listings ranked higher than our site.

When the site came out of the sandbox, it ranked well not only for the site name but for all the keywords it was optimized for.

My suggestion is to be patient; your site will rank, you just have to give it time and stay on course.

I read this here at WW and it stuck with me:
Sometimes the hardest thing you can do as an SEO is nothing.

gomer

7:59 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



rj87uk, yes, I agree with your observation in post 71.

If I understood your post correctly, you are in effect saying that for less competitive terms you can rank, but not so for more competitive terms. This is a common effect of what many believe to be the sandbox.

Even sandboxed sites can get traffic from less competitive search terms where the total results returned by Google are low. As the search terms become more competitive, the total results from Google increase and sandboxed sites can't compete as well.

Here is some speculation on this. As the total results for a search term increase, Google does not need to go into the sandboxed sites to fetch results. For less competitive terms, as the total number of results returned decreases, Google needs to dip into the sandboxed sites to fetch results. (I know this seems similar to what people have speculated about the supplemental index.)
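gomer's two-tier speculation can be sketched in a few lines. To be clear, this illustrates only the theory in the post above, not anything Google is known to do; the index names, the threshold, and the domains are all invented for the example:

```python
# Hypothetical model of the speculation: trusted results come first,
# and sandboxed sites are consulted only when the trusted index
# can't fill out the result set on its own.

RESULT_THRESHOLD = 2  # invented cutoff for "enough results already"

TRUSTED_INDEX = {
    "blue widgets": ["oldsite-a.com", "oldsite-b.com", "oldsite-c.com"],
    "eastern blue tasty widgets": ["oldsite-a.com"],
}
SANDBOXED_INDEX = {
    "blue widgets": ["newsite.com"],
    "eastern blue tasty widgets": ["newsite.com"],
}

def fetch_results(query):
    """Return trusted results, dipping into sandboxed sites only
    when the trusted index falls short of the threshold."""
    results = list(TRUSTED_INDEX.get(query, []))
    if len(results) < RESULT_THRESHOLD:
        results.extend(SANDBOXED_INDEX.get(query, []))
    return results
```

On this model, the competitive term ("blue widgets") never reaches the sandboxed site, while the obscure four-word variation does, which matches the pattern mark1615 describes below.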

mark1615

8:38 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



I think gomer is on to something: there is more to the so-called sandbox than a simple in-it-or-out-of-it calculation. We have a site that is about 6 mos old. We have spent a great deal of time on it. It is nowhere for any important keyword. But it can be found for some off-the-wall four and five word variations on our main 2 word keyword. Think instead of widget widget you have eastern blue tasty widget widget.

Bear in mind, and I suspect this is true for others, that as you get to the three, four, and five word terms, there is probably no anchor text with those terms in it. I tend to think the "sandbox" may have something to do with the links, but I really don't know what. I would love to have internetheaven's input here.

tomasz

8:46 pm on Nov 22, 2004 (gmt 0)

10+ Year Member




Are you sure you aren't doing your clients a big favor and budgetting in the purchase of pre-existing domain names into the projects, thus magically evading the sandbox?

Well, I purchased through Afternic a domain name which has been online since 1996 and transferred my old site to it (301). The old site ranked very well.
After 3 months my PR and my back links had transferred, but the new site is nowhere to be found. I sent an email to Google and they assured me my site is not under a penalty, giving me the standard blah, blah.
What's more, my old 'Supplemental' page ranks better than my new site, so I do not know if this only applies to "new" sites.
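For anyone unfamiliar with the "(301)" move described here, transferring a site between domains is normally done with permanent redirects at the address being moved away from. A minimal Apache .htaccess sketch; both domain names are placeholders, not tomasz's actual sites:

```apache
# .htaccess on the domain being moved away from: permanently (301)
# redirect every URL to the same path on the destination domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```

The R=301 flag is what tells search engines the move is permanent, which is why PR and backlinks eventually follow the redirect as tomasz observed.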

BeeDeeDubbleU

10:09 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Here is some speculation on this. As the total results for a search term increases, Google does not need to go into the sandboxed sites to fetch results.

Remember that Google indexes pages, not sites. How come new pages on established sites are ranking normally? I have yet to hear a sensible reply to this question. If this was an algo change, Google would surely apply it to all new pages, not just those on new sites.

MHes

10:19 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



BeeDeeDubbleU
I think the sandbox applies to links, not pages. Internal links may not be subject to the sandbox, but external ones are.

A recent thread questioned the reality of Hilltop. I think this thread not only proves its existence, but also how effective it has been.

This 472 message thread spans 32 pages.