Forum Moderators: open
That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
As I said previously, I remember a thread here at WW (about May or June) where there was somewhat of a large consensus that sites had come out of the sandbox. I know one of my sites came out of the sandbox then. (I have many sites launched after that that are still in the sandbox.)
I have tried to find that thread but have not been able to. I am not sure of the date of the sites coming out of the sandbox, but I believe it was in May or June of 2004 and before June 7, 2004.
Can anyone remember that period or find that thread?
Also, does this mean that once you're out of the sandbox, you can create a new page and rank for a semi-competitive or competitive phrase within a week or so? (Let's say everything is optimized and you have a decent amount of backlinks.)
Does this mean that there's a specific date for new sites to come out of the sandbox? If so, when do you think that date will be for sites that are still not out yet?
I don't think there is any specific date and if there is one, only people at Google know about it.
I am of the camp that this is a capacity issue at Google rather than an anti-spam filter, although I am not 100% convinced of that myself. Google would have lost its head if, in trying to fight spam, it kept new sites such as election sites and new movie sites out of the index. And if it is a capacity issue, it is something they should have let investors know about in their filings to go public.
If the sandbox is related to fighting spam, it is a terrible measure and just a matter of time before someone eats Google's lunch. Either way, this creates a very poor user experience. They may not notice this now, but they are starting to and the problem will only get worse.
Also, does this mean that once you're out of the sandbox, you can create a new page and rank for a semi-competitive or competitive phrase within a week or so? (Let's say everything is optimized and you have a decent amount of backlinks.)
That is a good question which seems to yield different answers here at WW. I personally have not released new pages on sites already in the index and therefore cannot say. From what I have read, though, I am tending to think that even new pages on non-sandboxed sites are struggling more often than not to get into the index in earnest.
Nice work by the way in stirring up the pot with the email to WSJ. Whatever the sandbox may be, it is something that should be addressed by the media.
The media must have been notified about this many times during the last few months. How many people in here have tried to get this some publicity yet not a whisper has been seen anywhere. The sinister (perhaps the wrong word but I can't think of a better one) thing is that it is at least worthy of some coverage and it has had none.
Let's face it Google is big news nowadays so why no coverage on what is essentially a really major problem with their search technology? It can't be that the press are better informed than those in this community so why are they blanking this?
To back this up:
I have a new site that is still sandboxed: it cannot rank high for its number one term (my city), yet it can rank high for areas within my city, above other websites.
Furthermore, if I add my main keyword back into the search, I can't be found at all.
Does anyone else see this? - or am I way off :)
Internetheaven, I can only congratulate you on being the only one in the world to have beaten this thing.
I'm not, and I've known it hasn't existed since the first time the myth was created, as have a lot of other webmasters. It's just that I'm the noisy one! Do you really think that webmasters who aren't affected want the "sandbox" theory to end? If everyone knew the algorithm changes that have caused what people are blaming on a "penalty", then the thousands of us who do know would have competition ...
There is obviously something built into G's algo to hold them back.
It's the reverse: there is something built into the pages that is holding them back. Google didn't create the algorithm to penalise new pages, it changed the algorithm to weed out the mountains of junk. Unfortunately, most SEOs build their pages the same way as spammers.
Does anyone have a
1. Site placed on a brand new (never registered before) domain
2. That was launched after May
3. Doing well on google.com
4. for a phrase that is competitive on google.com
Yes, and to answer the other question regarding geographical location, I can honestly say that this is the case for both the UK and the US, as I have servers in both countries.
I was not a huge fan of the sandbox theory as many sites I looked at that claimed to be sandboxed, just ranked poorly (and they deserved to).
I would attribute this to 99% of claims on these boards personally! ;)
Do you really think that webmasters who aren't affected want the "sandbox" theory to end?
Well you certainly seem to be doing your best to debunk the theory.
If I had your knowledge I would not be idling away my time on forums. I would be SEO'ing like a man possessed ;) In the present Google climate you could be worth millions!
It's the reverse: there is something built into the pages that is holding them back. Google didn't create the algorithm to penalise new pages, it changed the algorithm to weed out the mountains of junk. Unfortunately, most SEOs build their pages the same way as spammers.
1. Google has not weeded out the "mountains of junk". Their results are spammier (and staler) than they have ever been. As long as they allow AdSense on sites that have not been manually reviewed, this will just get worse. Google has come full circle. They are now ultra commercial; that's just a fact of life.
2. I have placed new pages (using trusted methods) on established websites recently that got good results very quickly, i.e. two or three weeks.
3. If Google had developed some new algo formula to weed out spam why would they apply it to new sites only? Clearly they would apply it to all newly found pages. It would not make sense to allow existing spammers to carry on regardless while penalising all new and legitimate sites. Not when they could prevent it.
Let's be realistic about this. This is not an antispam measure. Something as blatantly flawed as this cannot possibly be deliberate.
So, to get back on McMohan's topic, IMHO you don't come out at all and currently there is little prospect of this happening.
Let's be realistic about this. This is not an antispam measure. Something as blatantly flawed as this cannot possibly be deliberate.
I think it is unrealistic to assume that Google is "flawed" and that if this "effect" were a mistake they wouldn't simply switch the algorithm back. If they were delivering results as bad as you say, then surely no one would be using them anymore? If the results are that bad and this has been going on for almost a year, then why are people in these forums still obsessed with Google? What you mean is, they aren't showing the results YOU want. They are still miles ahead of Yahoo and MSN, which is why you are all obsessed with getting good rankings on them.
By the way, everything in this thread so far has been on topic, what makes you think it has deviated?
I think it is unrealistic to assume that Google is "flawed" and that if this "effect" was a mistake that they wouldn't simply switch the algorithm back.
I don't think you quite got my point. Sometimes mistakes or defects are not easily rectified. Perhaps they did not "switch" anything. Perhaps it's just plain and simply broken?
If indeed you do have the answer then well done! My only comment would be to wonder why you are wasting time back here gloating? If I knew for sure that I was the only one who had the answer I doubt that I would be shouting it from the roof tops. I think I would be far too busy to be wasting time with that.
So what are your motives? Unless I am missing something you don't seem to be here to offer any help.
If they were delivering results as bad as what you are saying then surely no-one would be using them anymore?
Joe Public does not know the results are bad ;)
[edit]
I forgot to add that you did not answer my question ...
"If Google had developed some new algo formula to weed out spam why would they apply it to new sites only? Clearly they would apply it to all newly found pages. It would not make sense to allow existing spammers to carry on regardless while penalising all new and legitimate sites. Not when they could prevent it."
1) We use the same techniques on sites new and old. On old sites we can get new pages with significant competition into the top 10 rather quickly - under a month more often than not. On new sites doing the same - often with even more links - we are not in the top 1,000. This has led us to wonder about the value of new links.
2) (Being devil's advocate here - no disrespect meant.) You note that the believers in the sandbox theory just don't like G's results. The flipside of this is that you don't believe in it because you do like the results.
Questions:
How replicable is this process/technique?
Are you optimizing for highly competitive kws (2MM+)?
If it is that easy, why are you talking to us and not counting your money? ;)
Is the technique on page or off?
Again, no disrespect at all meant so don't take anything badly - just very interested in this subject.
fjpapaleo:
I have a pr6 site with over 60,000 pages "indexed" since May. Plenty of back-links. Lots of content, anchor text and all "white hat". Trust me, there's a sandbox. Or more accurately, a supplemental index.
I think you'll find the supplemental index contains either pages that no longer exist (I see no point to this in all honesty) or pages with very low internal PageRank being passed to them.
If your site is a PR6 and you've tried to split that PR quickly over 60,000 pages, I think you may fit into the latter category.
You'll also find it incredibly difficult, if not impossible to resolve this issue.
Google simply won't let go of our 3,000+ supplemental pages in their index - and I have now reduced the site to about 300 pages which makes this annoying.
I am sure we incur some sort of penalty from having so many supplemental pages, as that particular site has never recovered despite the huge amount of time I've spent on it (it is also a PR6).
This isn't true from what I can see, although if you're looking to point people in the wrong direction, that's a good thing to say. Are you sure you aren't doing your clients a big favor and budgeting the purchase of pre-existing domain names into the projects, thus magically evading the sandbox?
The site now has PageRank 5.
What's pathetic is that not only does my site not come up if you search for kw1 kw2, if you search for "kw1 kw2 noun" my site is listed at number seven. There are six sites listed above that, all of which have a link to my site.
I have zero visibility in ALL search engines.
Theories:
(1) Sandbox
(2) Over-aggressive links from the sites I control (a link on every page) tripped a filter which causes the site to be penalized.
Course of action:
I have no idea what to do. This is a real shame because I was sure that if the site got free search engine traffic, it could be the first site I've created that would actually make decent money.
I've stopped bothering with the site - why spend a lot of time programming a site that no one is ever going to see? I guess this is the strategy They have: make webmasters give up.
What's pathetic is that not only does my site not come up if you search for kw1 kw2, if you search for "kw1 kw2 noun" my site is listed at number seven. There are six sites listed above that, all of which have a link to my site.
...
I have no idea what to do.
What you are describing is very similar to what happened to a site of mine. We acquired on-topic directory listings. When we searched for our site name, the directory listings ranked higher than our site.
When the site came out of the sandbox, it ranked well not only for the site name but all the keywords it was optimized for.
My suggestion is to be patient. Your site will rank; you just have to give it time and stay on course.
I read this here at WW and it stuck with me:
Sometimes the hardest thing you can do as an SEO is nothing.
If I understood your post correctly, you are in effect saying that for less competitive terms you can rank but not so for more competitive terms. This is a common effect of what many believe to be the sandbox.
Even sandboxed sites can get traffic from less competitive search terms where the total results returned by Google are low. As the search terms become more competitive, the total results from Google increase and sandboxed sites can't compete as well.
Here is some speculation on this. As the total results for a search term increases, Google does not need to go into the sandboxed sites to fetch results. For less competitive terms as the total number of results returned decreases, Google needs to dip into the sandboxed sites to fetch results. (I know this seems similar to what people have speculated about the supplemental index.)
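To make that speculation concrete, here is a minimal toy sketch of the mechanism the post describes. To be clear: this is purely an illustration of the poster's hypothesis, not Google's actual behavior, and every name in it (fetch_results, the threshold value, the two result lists) is hypothetical.

```python
def fetch_results(main_results, sandboxed_results, page_size=10):
    """Toy model of the speculation above: results from 'sandboxed'
    sites are only consulted when the main index cannot fill a
    results page on its own."""
    if len(main_results) >= page_size:
        # Competitive term: plenty of main-index results,
        # sandboxed sites never surface.
        return main_results[:page_size]
    # Obscure term: pad the thin main-index results with
    # sandboxed sites to fill the page.
    return (main_results + sandboxed_results)[:page_size]
```

Under this toy model, a sandboxed site would appear for low-competition long-tail terms (few total results) but never for competitive ones, which matches the effect described earlier in the thread.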
Bear in mind, and I suspect this is true for others, as you get to the three, four, and five word terms, there is probably no anchor text with those terms in them. I tend to think the "sandbox" may have something to do with the links, but I really don't know what. I would love to have internetheaven's input here.
Are you sure you aren't doing your clients a big favor and budgetting in the purchase of pre-existing domain names into the projects, thus magically evading the sandbox?
Here is some speculation on this. As the total results for a search term increases, Google does not need to go into the sandboxed sites to fetch results.
Remember that Google indexes pages, not sites. How come new pages on established sites are ranking normally? I have yet to hear a sensible reply to this question. If this were an algo change, Google would surely apply it to all new pages, not just those on new sites.