Forum Moderators: open
That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
Am I correct in assuming that the sandbox theory is binary - "in" or "out", according to one's preference? And that pages (or sites - I'm not sure which) are either in this box-thingy or not?
Well, when I step back, I find that the whole concept of Google's PageRank, its SERPS, its toolbar and everything else is based on "better than" or "not as good as" - real quantifiable measures (whether you agree with the measure or not).
I personally find it rather sad that we're wasting time discussing the sandbox like it was some sort of portcullis - you're in or you're out of the Google Castle.
Do others really believe that the massive matrix calculations that define PR are then going to be adjusted by a coin-toss? In or out? Heads you win, tails you're on page 100?
I've a Masters degree in Mathematics, but I'm finding that I'm turning into a philosopher in this debate, trying to understand what I see, rather than making up "in or out" theories that are quite, quite childish.
We would, I think, be better served by trying to make sense of the contrasting and contrary things we're seeing here, instead of heaping coals onto some vast fire.
The fact that we ARE seeing contrasting and contrary things is what we should be grasping - not that we don't think we ARE seeing them.
----
OK - I've had my rant and I feel better now.
Sorry about that - us Brits tend to keep our emotions bottled up far longer than is good for us...
Why not skip over this and read the next post instead....
DerekH
Pages are what is ranked, not sites.
Old sites cannot be subject to the same part of the algo that he thinks new sites are because the data wasn't kept prior to the algo change.
Links are tracked and aged.
One question on this though, a new page on an old site would still seem to be subject to the algo - no? Yet many people have experience that suggests this is not true.
Likewise the other commonly observed attribute that sites/pages in the so-called SB can still rank well for obscure 3+ word combinations. What does DerekH think about this with his background in higher mathematics?
And one other thought: the so-called anti-spam tactics employed by G are fighting a "problem" G largely invented. The basic premise of the G algo, we are led to believe, is that links are votes. Well, then webmasters go out and get links. And anchor text in links is important, but truly natural anchor text is very often totally unrelated to the keyword and is thus devalued (we think) by G. So in response to webmasters' actions to get links, G (again, we think) takes action to combat aggressive linking, which causes this problem because, of course, new pages and new sites have new links. This has resulted in G becoming unarguably stale.
"Likewise the other commonly observed attribute that sites/pages in the so-called SB can still rank well for obscure 3+ word combinations. What does DerekH think about this with his background in higher mathematics?"
Well, I'm not sure that this is anything more than the way the reverse (inverted) index that looks things up is updated and made current.
After all, in addition to the algorithms that decide results, there is the data that is fed to those algorithms. With some pages on one of my sites indexed yesterday, and some not visited since last February, the spread in the currency of the data is massive. Who can say what effect the age of the last visit to one of your competitor's pages has on the weight that page is ascribed?
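To illustrate the point about data currency (this is a hypothetical sketch, not a claim about Google's actual data structures): a tiny inverted index where each document also records when its data was last refreshed. Any ranking computed from such an index necessarily mixes fresh and months-old signals, and a page can "win" an obscure multi-word search simply by being the only match.

```python
from datetime import date

# Hypothetical sketch of an inverted index with per-document crawl dates.
# All site names, terms, and dates are invented for illustration.
index = {
    "garden": {"siteA/page1", "siteB/page9"},
    "gnome":  {"siteA/page1"},
}
last_crawled = {
    "siteA/page1": date(2004, 11, 20),  # indexed yesterday
    "siteB/page9": date(2004, 2, 14),   # not visited since February
}

def matches(query_terms):
    """Docs containing every query term, with the age of their data."""
    docs = set.intersection(*(index.get(t, set()) for t in query_terms))
    return {d: last_crawled[d] for d in docs}

# Only siteA/page1 matches the obscure two-word combination, so it
# "ranks well" for it regardless of any global quality score.
print(matches(["garden", "gnome"]))
```

The design point is simply that the lookup and the scoring operate on stored data of wildly varying age, so odd results need not imply any "in or out" penalty.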
For a long long time I've seen my sites rank really well for one keyword and not for another, and yet for the pair to beat sites that beat me on both searches.
I don't regard that as anything more than "something" in my site doing well for an obscure combination of keywords, any more than I regard the fact that a site doing well for an obscure search means anything more important than the fact that other sites don't.
My god what a sentence that was!
What I meant is that it's easy to do well in an obscure search. That's what obscure means.
And what I didn't say was that I don't actually have a view one way or the other about the sandbox. Some of my pages have done well, some have been wiped out; but the last thing I think is that it's something quite so black and white.
Anyway - you shouldn't ask me to justify my rant <grin> - it was just something I needed to get off my chest...
DerekH
The sandbox does not exist.
Google updates roughly every three months.
I'm talking about a deep update.
If you have done enough SEO in time for the update, you move;
if not, you stay.
My partner and I have hundreds of sites between us.
We have seen this happen many times.
If your site has not moved in 6 months, then you haven't done enough SEO or you are doing it wrong.
If you have time to complain on this board, chances are you haven't done enough.
THERE IS NO SANDBOX! Google introduced new algorithms in February, and these algorithms are tough, tough anti-spam algorithms. They are based on lots of factors, like:
How quickly the links were amassed
Quality of links
How quickly pages were added
Etc etc
When you search for a restaurant by its name and city and Google does not return the restaurant's website, even though it is indexed, just because the site's links aren't aged to perfection, it doesn't matter what it's called. It's a reason to leave Google.
People aren't leaving Google in droves because every other aspect of the search engine is far superior to the competition. But each day the "tough anti-spam algorithms" continue, this aspect becomes more noticeable.
"Google updates roughly every three months." bak70, if you think we've had a deep update in the last nine months, you may be in for a shocker soon. At least I hope so.
"I'm talking deep update."
If you add a new page to an old site (+2 years old), and get hundreds of inbound links (from external sites) to it (for a competitive search term), this new page would not rank well at all for many months because the incoming links have not aged yet, right?
This can only be proved or disproved by real-life results that people have had doing this recently. Anybody experience this problem with a new page on their old site?
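The hypothesis in that question - that inbound links contribute little until they have "aged" - can be sketched as a simple ramp-up curve. This is speculation about the theory being discussed, not Google's known behaviour; the 90-day constant is an invented assumption.

```python
import math
from datetime import date

# Speculative "link aging" sketch: a link's contribution ramps up from 0
# toward 1.0 as it survives. ramp_days=90 is an arbitrary assumption.
def link_weight(link_created, today, ramp_days=90):
    """Weight in [0, 1) that grows with the age of the link in days."""
    age = (today - link_created).days
    return 1.0 - math.exp(-age / ramp_days)

today = date(2004, 11, 21)
new_link = link_weight(date(2004, 11, 1), today)   # ~0.2: barely counts yet
old_link = link_weight(date(2003, 11, 1), today)   # ~0.99: near full weight
```

If something like this were in play, hundreds of brand-new inbound links would sum to a small effective total for months, matching the experiment the poster proposes, while a page on an old subdomain inherits already-aged links.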
Explain to me how the following can be explained by anything else than the SB theory (I really am open to suggestions):
If I move a page from our domain with the new (possibly penalized) name to an older subdomain, it shoots up to the #3 position and stays there. The PR of the linking pages is the same (PR6 index page => PR5 subpage => page in question). All outgoing links on the page were kept the same.
The only difference I can see is the newer versus older domain.
Oh, and DerekH, I'll see your M.S. and raise you a Ph.D. ;)
I also started a brand-new site, and it should be on either the first or second page after the next update.
(It's already on the first page of the MSN beta.)
I feel the reason some people do better in MSN is that it is updating more frequently than Google right now.
I haven't seen a change in the way Google updates in about two years. Pretty much every three months.
Again, some people might not notice these updates because popular terms are dominated by people who just do SEO better. This results in the SERPs looking the same.
I have all kinds of sites so I see these smaller changes when they happen.
As far as restaurants not showing for their specific search:
It will take a few updates, but if the site has some incoming links, it will show up.
There was a very popular diet pill that didn't show up in the top ten for its name for almost a year.
I took a look at it and noticed the site had little SEO work done on it.
So it took a while.
[webmasterworld.com...]