That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
I think it is unrealistic to assume that Google is "flawed" and that if this "effect" was a mistake that they wouldn't simply switch the algorithm back.
I don't think you quite got my point. Sometimes mistakes or defects are not easily rectified. Perhaps they did not "switch" anything. Perhaps it's just plain and simply broke?
If indeed you do have the answer then well done! My only comment would be to wonder why you are wasting time back here gloating. If I knew for sure that I was the only one who had the answer, I doubt I would be shouting it from the rooftops. I think I would be far too busy to be wasting time with that.
So what are your motives? Unless I am missing something you don't seem to be here to offer any help.
If they were delivering results as bad as you say, then surely no one would be using them anymore?
Joe Public does not know the results are bad ;)
[edit]
I forgot to add that you did not answer my question ...
"If Google had developed some new algo formula to weed out spam why would they apply it to new sites only? Clearly they would apply it to all newly found pages. It would not make sense to allow existing spammers to carry on regardless while penalising all new and legitimate sites. Not when they could prevent it."
1) We use the same techniques on sites new and old. On old sites we can get new pages with significant competition into the top 10 rather quickly - under a month more often than not. On new sites doing the same - often with even more links - we are not in the top 1,000. This has led us to wonder about the value of new links.
2) (Being devil's advocate here - no disrespect meant) You note that the believers in the sandbox theory just don't like G's results. The flipside of this is that you don't believe in it because you do like the results.
Questions:
How replicable is this process/technique?
Are you optimizing for highly competitive kws (2MM+)?
If it is that easy, why are you talking to us and not counting your money? ;)
Is the technique on page or off?
Again, no disrespect at all meant so don't take anything badly - just very interested in this subject.
fjpapaleo:
I have a pr6 site with over 60,000 pages "indexed" since May. Plenty of back-links. Lots of content, anchor text and all "white hat". Trust me, there's a sandbox. Or more accurately, a supplemental index.
I think you'll find the supplemental index contains either pages that no longer exist (I see no point to this in all honesty) or pages with very low internal PageRank being passed to them.
If your site is a PR6 and you've tried to split that PR quickly over 60,000 pages, I think you may fit into the latter category.
You'll also find it incredibly difficult, if not impossible to resolve this issue.
Google simply won't let go of our 3,000+ supplemental pages in their index - and I have now reduced the site to about 300 pages which makes this annoying.
I am sure we incur some sort of penalty from having so many supplemental pages, as that particular site has never recovered despite the huge amount of time I've spent on it (it is also a PR6).
This isn't true from what I can see, although if you're looking to point people in the wrong direction that's a good thing to say. Are you sure you aren't doing your clients a big favor and budgeting the purchase of pre-existing domain names into the projects, thus magically evading the sandbox?
The site now has PageRank 5.
What's pathetic is that not only does my site not come up if you search for kw1 kw2, if you search for "kw1 kw2 noun" my site is listed at number seven. There are six sites listed above that, all of which have a link to my site.
I have zero visibility in ALL search engines.
Theories:
(1) Sandbox
(2) Over-aggressive links from the sites I control (a link on every page) tripped a filter which causes the site to be penalized.
Course of action:
I have no idea what to do. This is a real shame because I was sure that if the site got free search engine traffic, it could be the first site I've created that would actually make decent money.
I've stopped bothering with the site - why spend a lot of time programming a site that no one is ever going to see? I guess this is the strategy that They have: make webmasters give up.
What's pathetic is that not only does my site not come up if you search for kw1 kw2, if you search for "kw1 kw2 noun" my site is listed at number seven. There are six sites listed above that, all of which have a link to my site.
...
I have no idea what to do.
What you are describing is very similar to what happened to a site of mine. We acquired on-topic directory listings. When we searched for our site name, the directory listings ranked higher than our site.
When the site came out of the sandbox, it ranked well not only for the site name but all the keywords it was optimized for.
My suggestion is to be patient. Your site will rank; you just have to give it time and stay on course.
I read this here at WW and it stuck with me:
Sometimes the hardest thing you can do as an SEO is nothing.
If I understood your post correctly, you are in effect saying that for less competitive terms you can rank but not so for more competitive terms. This is a common effect of what many believe to be the sandbox.
Even sandboxed sites can get traffic from less competitive search terms where the total results returned by Google are low. As the search terms become more competitive, the total results from Google increase and sandboxed sites can't compete as well.
Here is some speculation on this. As the total number of results for a search term increases, Google does not need to go into the sandboxed sites to fetch results. For less competitive terms, as the total number of results returned decreases, Google needs to dip into the sandboxed sites to fetch results. (I know this seems similar to what people have speculated about the supplemental index.)
Bear in mind, and I suspect this is true for others, as you get to the three, four, and five word terms, there is probably no anchor text with those terms in them. I tend to think the "sandbox" may have something to do with the links, but I really don't know what. I would love to have internetheaven's input here.
Are you sure you aren't doing your clients a big favor and budgeting the purchase of pre-existing domain names into the projects, thus magically evading the sandbox?
Here is some speculation on this. As the total number of results for a search term increases, Google does not need to go into the sandboxed sites to fetch results.
Remember that Google indexes pages, not sites. How come new pages on established sites are ranking normally? I have yet to hear a sensible reply to this question. If this were an algo change, Google would surely apply it to all new pages, not just those on new sites.