
Sandboxed Sites - Back Together?

Do they come out together or one by one?


McMohan

10:09 am on Nov 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the new sites that I work with are still in the sandbox. Was just curious to know: do all the sandboxed sites come out of the sandbox during one major update, or one by one over the rolling updates?

That is to say, should one be checking regularly to see if the sites are out of the sandbox, or only when there is a known major Google update? :)

Thanks

Mc

gomer

10:36 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



BeeDeeDubbleU, I understand what you are saying and perhaps I should have said it as, "Google does not need to go into the sandboxed pages to fetch results". Either way, I don't think it makes much difference. That is really just speculation on my part spawned from ideas gathered here at WW.

While I tend to think this is a capacity issue, there are arguments to be made that the sandbox relates to spam fighting or algorithmic changes.

"How come new pages on established sites are ranking normally?"

I don't have direct experience with this, as I had not put up new pages on existing sites until now. (I have just done so and eagerly await the results.) However, from what I have read here, I get the impression that some new pages on existing sites are still sandboxed. Am I right in thinking that? Has anyone here had new pages on existing sites exhibit sandbox behaviour?

internetheaven

10:43 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From this thread it seems as though the actual concept of the "sandbox" is unclear. So many are saying they're not sure what the sandbox actually is or what it applies to, yet they are adamant that it is the reason they can't get higher rankings.

I think describing heaven would be easier ... a mystical place that is whatever you want it to be ... ;)

Sometimes the hardest thing you can do as an SEO is nothing.

... and you thought I was trying to mislead you ...

mark1615

10:51 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



Our experience is that new pages on existing sites do not exhibit the so-called sandbox behavior. Here are two things we have observed:

New pages on old sites get much better SERPs than new pages on new sites for highly competitive keywords. This is of course even more true for less competitive keywords.

New pages on new sites can rank relatively quickly for uncompetitive terms - 3, 4, or 5 words in length.

The point here is that the variable appears to be the age of the site on which the page is located. But what other factors are there? What could we be overlooking?

gomer

11:06 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



I am not trying to mislead anyone. There is a lot of wisdom in that statement, and apparently it is wasted on some.

I have had a site come out of the sandbox. I did nothing to get it out of the sandbox. For me, it is that simple.

BeeDeeDubbleU

11:15 pm on Nov 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



New pages on old sites get ranked quickly. This is a fact AFAIAC.

I still see no reason for Google to apply an algo change only to new SITES as opposed to new PAGES. It just does not compute for me.

Vec_One

11:16 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



How come new pages on established sites are ranking normally? I have still to hear a sensible reply to this question.

My favorite theory is that most of the PR (80%-90%?) from a new inbound link is not credited to the target page for a certain unknown period of time (1-3 months?). PR is, however, deducted from a page as soon as a new outbound link is created. This would offset the effects of reciprocal linking and other link trickery.

Consequently, large, mature sites with good PR could afford to add new pages without noticeable harm. If they add too many pages too fast, though (like I did), they will have the PR sucked out of them.

*If* my theory is correct, there must be something in the algorithm that allows new pages to be added to non-sandboxed sites without a significant PR penalty.
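A minimal sketch of what that withholding might look like, grafted onto a toy power-iteration PageRank in Python. The damping factor is the standard 0.85, but the withheld fraction, age cutoff, pages, and link ages are all illustrative guesses, not anything Google has confirmed:

# Toy model of the theory above: a link younger than AGE_CUTOFF days
# delivers only (1 - WITHHELD) of its normal weight to its target,
# while the source's PR is still divided by its FULL outdegree, i.e.
# PR is deducted immediately but credited late. The withheld share
# simply evaporates, which is what would blunt reciprocal-link schemes.

DAMPING = 0.85
WITHHELD = 0.85     # the guessed 80%-90% withheld from new links
AGE_CUTOFF = 90     # the guessed 1-3 month probation, in days

links = [           # (source, target, link age in days) - invented
    ("A", "B", 400),  # old link: full credit
    ("C", "B", 20),   # new link: mostly withheld
    ("B", "A", 400),
    ("B", "C", 400),
]
pages = {"A", "B", "C"}

def link_weight(age_days):
    return 1.0 if age_days >= AGE_CUTOFF else 1.0 - WITHHELD

def pagerank(pages, links, iterations=50):
    pr = {p: 1.0 / len(pages) for p in pages}
    out_degree = {p: 0 for p in pages}
    for src, _, _ in links:
        out_degree[src] += 1
    for _ in range(iterations):
        nxt = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        for src, dst, age in links:
            # the target receives only the age-discounted share
            nxt[dst] += DAMPING * pr[src] * link_weight(age) / out_degree[src]
        pr = nxt
    return pr

print(pagerank(pages, links))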

Then again, maybe it's just broke.

gomer

11:16 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



"I think sandbox applies to links, not pages. Internal links may not be subject to sandbox, but external ones are."

That is a really neat thought, MHes, thanks. Some questions about that: do you think all new external links are sandboxed, or just some? What is the purpose or reason for sandboxing external links? Do you think this could relate to topic-sensitive PageRank calculations (just a thought)?
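For reference, topic-sensitive PageRank (Haveliwala, 2002) biases the random-jump step toward a seed set of pages known to be on a given topic, yielding one rank vector per topic. A bare-bones Python sketch, with an invented graph and seed set:

# Topic-sensitive PageRank: the random surfer's "jump" lands only on
# pages in an on-topic seed set instead of on any page uniformly, so
# each topic gets its own rank vector. Graph and seeds are invented.

DAMPING = 0.85

graph = {             # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
topic_seeds = {"A"}   # pages judged on-topic for this vector

def topic_pagerank(graph, seeds, iterations=50):
    pages = list(graph)
    jump = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    pr = dict(jump)
    for _ in range(iterations):
        nxt = {p: (1.0 - DAMPING) * jump[p] for p in pages}
        for src, outs in graph.items():
            for dst in outs:
                nxt[dst] += DAMPING * pr[src] / len(outs)
        pr = nxt
    return pr

print(topic_pagerank(graph, topic_seeds))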

"A recent thread questioned the reality of hilltop. I think this thread not only proves its existence, but also how effective it has been."

Can you explain what led you to this? I can't see the link/jump you are making here.

mark1615

11:39 pm on Nov 22, 2004 (gmt 0)

10+ Year Member



I don't see the relationship to Hilltop either - I had actually come to the conclusion that it is less of a factor than is often assumed, mostly because we see so many sites ranking so well that have hundreds if not thousands of totally unrelated links.

prairie

5:47 am on Nov 23, 2004 (gmt 0)

10+ Year Member



For me, the most interesting and least talked about possible factor is link age and relevance.

People say the Google SERPs are stale, and I agree, but as far as I can tell they do shift around more often than they used to (albeit much less dramatically).

If Google's of the opinion that their current index is a good benchmark, then perhaps they see no reason for it to be easily upset.

If they're interested in accurate and useful SERPs, then high PageRank and links from link pages aren't much use.

So many people have been hoarding PageRank (directing it at their home pages) and relegating their outlinks to link pages. That isn't particularly democratic, or useful. It's also far removed from the original idea of hypertext.

We only have a few sites. The ones that rank well have old links with spot-on anchor text from Yahoo and DMOZ. Six-month-old spot-on anchor text from Yahoo doesn't work as well (we've given up on DMOZ).

neuron

8:26 am on Nov 23, 2004 (gmt 0)

10+ Year Member



gomer - I would say you are dead on target in post 86.

I have 3 main sites. They all rank high for the same key terms in Yahoo, MSN, and the new MSN beta. Two of them are +200 or worse in Google, while the 3rd is tops. The two that are sandboxed were launched in late February and May. They should both rank higher than the one that is tops in Google, because I've pounded them with an incredible number of links from unique domains.

One thing that hasn't been mentioned here, and I don't want to cloud the issues already raised, but it is puzzling nevertheless: I can get some pages on my sandboxed sites to rank high when they are first created. They rank for 4 to 6 weeks and then drop into oblivion with the rest of the site.

BDW - if the sandbox is related to capacity issues (and I am camped out with gomer on this one), then the methods used in matrix calculations may explain it. Google indexes pages, but the correct way to do the calculations across an index is to create one matrix encompassing all pages (and of course all sites). If capacity forces them to divide that matrix, it would likely make sense to divide it along domain boundaries. Why? Because the links between pages on a single domain are considered more relevant than links crossing from one domain to another. The way to divide domains between one index and the other would then be to put the domains that get the greatest number of hits into the main index. Since a site thrown into the sandbox does not get the hits, it stays in the sandbox.
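Whether Google actually split its index this way is pure speculation, but the partitioning idea itself is documented: "BlockRank" (Kamvar et al., 2003) exploits exactly this block structure of the web, solving PageRank within each domain first. A toy Python sketch of that local step, with invented URLs:

# Partitioning the link matrix along domain boundaries: keep only the
# intra-domain links for each domain (one block per domain) and run
# PageRank locally inside each block, as in BlockRank's first phase.
# Cross-domain links would be handled later at the domain level.

from collections import defaultdict
from urllib.parse import urlparse

DAMPING = 0.85

links = [  # (source URL, target URL) - all invented
    ("http://a.com/1", "http://a.com/2"),
    ("http://a.com/2", "http://a.com/1"),
    ("http://a.com/1", "http://b.com/1"),   # cross-domain: set aside
    ("http://b.com/1", "http://b.com/2"),
    ("http://b.com/2", "http://b.com/1"),
]

def domain(url):
    return urlparse(url).netloc

blocks = defaultdict(list)  # domain -> its intra-domain links
for src, dst in links:
    if domain(src) == domain(dst):
        blocks[domain(src)].append((src, dst))

def local_pagerank(edges, iterations=50):
    pages = {p for edge in edges for p in edge}
    pr = {p: 1.0 / len(pages) for p in pages}
    out = defaultdict(int)
    for src, _ in edges:
        out[src] += 1
    for _ in range(iterations):
        nxt = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        for src, dst in edges:
            nxt[dst] += DAMPING * pr[src] / out[src]
        pr = nxt
    return pr

for dom, edges in blocks.items():
    print(dom, local_pagerank(edges))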

One issue briefly raised here is that Google is doing this for money. If that were true, then surely they would have switched their results by now: taken the sites that have continued to rank well and make money, and thrown them into the sandbox, because those sites have been earning unopposed money for months now, and surely many a sandboxed site is by now depleted of resources from paying for AdWords to get traffic.

Something else not mentioned here: the number of pages indexed by Google grew steadily over the past few years, and that number was proudly displayed on their homepage. It suddenly froze last March at 99.6% of the capacity of a 32-bit index and remained there until just two weeks ago, when it slightly less than doubled. Again, it looks as if they threw one more bit on it to identify sites in the other index, and again hit a capacity problem. Why else would it stop just under twice that capacity instead of barely exceeding it?

My bet is that there are 4 indexes now. There is the main index, from which all the high-traffic terms are served. There is a supplemental index the same size as the main index; when it got full, Google simply added it to the main, but sites in it still do not rank when sufficient results are pulled from the main index. Then there is a 3rd index, with the same capacity as the main and the supplemental, which is taking the overflow from the first two. Finally, there is a new index being built from the others, based on 64-bit hardware and 64-bit Linux running a 5-byte (40-bit) inverted index.
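The arithmetic behind the capacity claim, for anyone who wants to check it (the 99.6% figure comes from the post above and is not an official number):

# Document-ID address space under the speculation above.
doc_ids_32bit = 2 ** 32        # 4,294,967,296 addressable documents
doc_ids_33bit = 2 ** 33        # one extra bit: about 8.6 billion
doc_ids_40bit = 2 ** 40        # the suggested 5-byte (40-bit) IDs

frozen_at = 0.996 * doc_ids_32bit   # "99.6% of a 32-bit index"
print(f"32-bit ceiling: {doc_ids_32bit:,}")
print(f"frozen at:     ~{frozen_at:,.0f}")
print(f"33-bit ceiling: {doc_ids_33bit:,}")
print(f"40-bit ceiling: {doc_ids_40bit:,}")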

Surely if Google wanted to inflate their stock price, they would release this new index prior to the upcoming stock sale that was announced yesterday. By doing so, the sites that have remained high in the SERPs because newer sites were not ranking, and that have been collecting uncontested income as a result, would then need to take out AdWords to maintain their accustomed level of traffic. And they'll be better able to afford it than the sites that have been sandboxed, but which will suddenly rank well once the whole kit and caboodle gets put together and a 'normal' matrix can be established across which all sites can be fairly ranked.

And as for the claims of some: well, we all want attention, and I can just imagine the stickies being sent to the wizard who can turn snails into horses. As for those who claim that a site ranking at +200 or +300 is proof it's out of the sandbox: get real, that's not ranking. My sandboxed sites rank #1 for a lot of esoteric terms that no one has had the gall to compete on before.

This 472 message thread spans 48 pages.