Forum Moderators: open
That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
The only theory that makes any sense to me is the "Age of Links" theory. At some point they built a trusted database of links, and if you have newer links, your site is trusted less.
And it isn't ALL new pages that fail to rank either, so assuming this is all planned behavior by Google seems unlikely.
My problem is that I get annoyed for my clients, who incidentally have NOT paid me for optimisation, and I have no obligation to them with regard to SEO or their sites' ranking. But when I build interesting websites for others I believe that they deserve a fair crack of the whip. Currently Google is not giving them this.
"GOOGLE, ORGANIZING THE WORLD'S INFORMATION. ALL THE WAY TO 2003!"
But when I build interesting websites for others I believe that they deserve a fair crack of the whip. Currently Google is not giving them this.
This points out a "first mover" advantage these older sites now have. Why did your client wait so long?
I view this as a test of wills. I am not going to hide or give up just because my site is still not ranking well. I've looked at the competition and improved on their offerings. Google will eventually recognize this.
I am not going to hide or give up just because my site is still not ranking well. I've looked at the competition and improved on their offerings. Google will eventually recognize this.
Good thinking. IMHO, there's a tendency here to focus on the short term instead of the long view. Even if Google does have a lead time of six months for new sites (or new commercial sites, as the case may be), that's fairly inconsequential in the overall scheme of things. And in any case, it's likely that the "sandbox," if it does exist, is a temporary phenomenon rather than a permanent fixture of Google Search.
Ahh, the poor surfer. Every day we hear the cries of the average surfer: "Oh no, Google is stale. No sites with Whois information of March 2004 or newer are in the index. How stale." Oh wait, that's not a web surfer's lament, but rather a webmaster's.
Not to be melodramatic, but a person who doesn't know they have cancer doesn't lament about not getting cancer drugs, either. Doesn't mean they don't need or deserve them.
But that's over the top. How about a less extreme example (and perhaps closer to home)...
Say a web designer makes a website for a client. The web designer, being in the business, knows all about the importance of keyword placement, semantic markup, separation of structure and style, anchor text, etc...all the things that are general best practices for designing a page. But they don't use them. Instead, they throw together a nested-table, spaghettified, un-optimized piece of tag-soup trash and sell it to the client.
Now, on one hand we could say that the client deserves their trash site, because they should have known enough to look into how the designer makes the page. But let's assume for a moment that they DID look into it, and it turns out that all the designer's literature, all their portfolio pieces, all their references, say that they design within modern standards. It says all this because up until recently, the designer in question DID use all the best practices. They really did deliver on their promises. It's just that recently, they've started making inefficient tag soup.
And, of course, the client doesn't know. They get their page put online. They can load it up in IE and look at it. They think they got the optimized, standards compliant page they were promised.
The question is: doesn't the client in this case deserve a well-made page? And hasn't a wrong been done to them, their expectations not met, even though they never knew to complain about it? They were told they would get a well-made page. They are paying for a well-made page. They should get a well-made page.
In the same sense, a wrong is being done to searchers, who patronize the Google search engine because they believe/are led to believe/want to believe that it will provide them with accurate and comprehensive results for their search. If you disagree that this is what Google claims to offer, okay. We can talk about that. But if we agree that this is at least Google's implied promise, certainly searchers are being let down, whether they know it or not, whether they ever complain about it or not.
cEM
To echo what another member posted, if there is a sandbox, it exists only because SEOs and their clients made it necessary.
Personally, I think a sandbox is the wrong approach; I'd much rather see users have a choice between searches that are weighted toward information or commerce. Having one massive, undifferentiated index may have worked in the early days of Google, but it's simply too unwieldy to be practical now that the number of pages on the Web is greater than the earth's population.
Google could just as easily argue that the sandbox is designed to improve the accuracy and value of search results to the user by temporarily filtering out sites that have the highest statistical probability of being fluff.
I remember reading that study that showed sites developed after May have a higher statistical probability of being fluff. Oh wait, that's right, I didn't read it because it doesn't exist. In my neck of the woods it's the old sites that suck; they've been on top so long that now everything on their pages is an ad or something copied from a public domain government site.
There is maybe a 2% chance of this. The sandbox exists because Google has had a massive failure in its handling of data. The idea that the sandbox has anything to do with SEO is laughable.
All last year Google trumpeted how fresh it was... fresh tags, constant updates, etc. This was a poor idea from the very start, as "fresh" never meant "good". However, to think that Google's response to this phenomenon is to refuse to rank new sites while still ranking most new pages is to believe that Google has the brain of a two-year-old. They are in the business of being a SEARCH ENGINE. It is their job to be able to discern the new official site of a famous person from the reams of scraper sites that go up every day. That is what they do. To think that they are intentionally choosing to hold back official sites while letting 5% of the scraper bilge through makes no sense on any level.
SEO may be responsible for Google's bizarre, dramatic downgrading of authority in its algorithm, but other than that, SEO has little to do with where we are. Google's data collapsed around February. It was noted here at the time. They were in the process of making the best SERPs they have ever had, and then "poof", it was all gone... and we are back to all anchor text (almost) all the time again.
We all know that there were times when Google had delayed updates, and we all remember how everyone freaked out when an update took more than a month. Then GoogleGuy told us about quarterly updates. We also know that the Google index has reached 8 billion pages (though that count includes both the main index and the supplemental index). We also know that Google is using a supplemental index (which seems to be a throwback from Inktomi, btw).
Is it possible for them to calculate PageRank through both the main index and the supplemental index?
Keep in mind that to calculate PageRank they must do several iterations through the entire index. I think we are getting what I will define as PageRank Lite. They just can't keep up, and are now working through problems (that they will likely solve), but cannot tell the public because it will affect their stock price.
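For anyone wondering why "several iterations through the entire index" is such a big deal, here is a minimal sketch of the classic power-iteration PageRank calculation along the lines of the original Brin and Page paper. The toy link graph, damping factor, and convergence threshold are my own illustrative assumptions, not anything Google has published about its current setup.

# Minimal power-iteration PageRank sketch (illustrative only; graph,
# damping factor, and tolerance are assumed values, not Google's).

def pagerank(links, damping=0.85, tol=1.0e-6, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}               # start from a uniform score

    for _ in range(max_iter):                         # each pass walks the whole graph
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:               # pass rank along each outbound link
                    new_rank[target] += share
            else:                                     # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        # stop once the scores settle; a real index needs many full passes
        if sum(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            rank = new_rank
            break
        rank = new_rank
    return rank

if __name__ == "__main__":
    demo = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # tiny made-up link graph
    print(pagerank(demo))

The point is simply that every iteration touches every page and every link, so recalculating the real thing over a multi-billion-page index (main plus supplemental) is not something they can do casually, which is why a cut-down "PageRank Lite" is at least plausible.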
And to add to what steveb is saying, Google has remained 100% silent about the sandbox effect, and has not even acknowledged its existence...
Makes sense. I'll bet they wish they'd never acknowledged the existence of PageRank. :-)