Forum Moderators: Robert Charlton & goodroi
Google supposedly likes natural link building. But natural link building is going to occur at different rates for different sites. I could debut a site tomorrow with the most incredible widget the world has ever known. Millions of sites would link to it because this widget is unlike any widget and provides a useful service to everyone. But if I understand this right, Google would sandbox this site because it got links too fast? So basically Google would be dumping a site because too many people liked it. And that's not even taking into account relevancy, which should override all parameters in my opinion.
Of course, that may happen in one case in a million, and in that case Google is throwing the baby out with the bath water. But I don't think there is anything they can do about it, unless somebody manually reviews the site and says "wow!", and that's impossible at scale.
As for your original question, I once asked the same thing here: can an old site go into the sandbox? The answer was "no". If an old site disappears from the SERPs, it's not the sandbox but a penalty for something. Not for too many inbound links, of course, since it's an old site, but for something else. The sandbox filter goes away eventually, while a penalty doesn't.
The big question is: how long can the sandbox filter last? My new site has been in the sandbox for 7 months already, and I have no idea if it's STILL in the sandbox or ALREADY penalized for something...
This is a good point, Natashka. I wonder how this works. If a site has been sandboxed/filtered because it is new and then released, is it possible that upon release another penalty could be applied immediately, so that we would never know it had been released?
Exactly! I expected to be sandboxed, but not for so long! Now I'm really starting to worry... I wish there were a way to find this out.
If my 9-year-old site is not in the sandbox, then I don't know what is wrong with it. I did a search for my website's name and didn't come up within the first 100 results.
An old site may have ranked well in the past because of hundreds of links from scraper sites acquired over the years, but Google has become very good at weeding these out recently. Auto-generated directories handing out links are dropping like flies, and the benefit they used to give is diminishing with them. Unless an old site continues to get quality links in, its ranking will erode.
I don't think you are in a sandbox. You probably need new inbound links that are on-theme and good quality. If you have dropped a hundred places in the rankings, then that could be the answer, or it could be an algo change that caused it. There is no point in making massive changes to your site; get more links in and you will become less vulnerable to algo changes. If you have dropped hundreds of places, then there may be a more serious issue. You may have upset Google somehow and need to check your links and your on-site optimisation, or maybe you have a server or hijacking issue. You also have to consider whether your site is really as good as you think it is and whether it really deserves a top ranking. It's easy to get self-obsessed, but in truth the site may be like hundreds of others, giving you a duplicate content problem.
There is another issue: Google wants to look fresh for the user. We have an old site that dropped to half its traffic last month and is now back. During that time other sites had a chance of good exposure. This was good for the user and fair to the other sites. If a site is any good, it will have enough bookmarks and inbound links to survive these troughs. If Google is just rotating the rankings because in reality there are hundreds of good, relevant sites for a search term, then this is a sound strategy for helping its users and appearing fresh. We webmasters may groan, but when I search each month for sailing equipment it's nice to see different sites to explore. It's really up to me to bookmark the sites I like and build up a collection, and I return to Google each month to find new ones. The important bit is that new ones will only appear once they have passed the sandbox, but when they do, they will be on a level playing field with older sites.
In short: old links are good, but only if new links are evident.
In my case, I've always had a steady trickle of new links, but with traffic down so far that's obviously going to dry up. (I've hardly ever requested links from anyone, so 99% of my links are unsolicited.)
Read caveman's Dropped Site Check List [webmasterworld.com] ... then read Combatting Spam with Trust Rank [dbpubs.stanford.edu]
Too many people are blurring the line and blaming the "sandbox" for any and all problems they may encounter with ranking.
Adding my own cautionary advice: be certain you are not "keyword stuffing," as I believe the keyword filter has (once again) been turned up a notch with this last update.
The tolerance for keyword density was lowered drastically starting with the Florida update more than a year ago. It is my opinion that Google has been turning that knob even further down ever so slightly on an ongoing basis ever since.
I will reiterate that this is my opinion, and it is one of the reasons why there is so much confusion out there regarding seemingly good sites suddenly losing rank. Each webmaster was using whatever keyword density they felt worked for them, and because everyone calculates KWD differently, it is one thing we as a group could never agree on as a main cause of our difficulties.
After the Florida update, my site tanked. I knew there was absolutely nothing on my site except KWD which might have caused this drop, so I quickly rewrote every single page using a much more normal and fluid syntax, while still being certain that the search engines could identify what my pages were about. It worked like a charm!
Many people are reporting (myself included) that scraper sites are finally disappearing from the search results in their sectors.
>Speculation< It is my belief that duplicate content has come under heavy fire from Google lately as a result of their effort to rid the SERPs of scraper sites. It is very possible that innocent sites with "some" duplicate content taken from other (older) sites have been caught in the backwash.
Read Caveman's post and then carefully read the Trust Rank paper. Take your KWD into consideration, and if you feel in your heart that your copy does not read naturally, rewrite it. Scour your site for duplicate content and, if you find any, rewrite it. Paraphrasing is fine, but don't duplicate someone else's copy.
Good luck!
Still wondering: is it normal for a new site to sit for almost 8 months in the sandbox? My topic is very competitive, though: dating. But when I check my position with the roberttaft tool (without the sandbox filter), I would already be on the first page of the SERPs! Damn... How long does the sandbox effect usually last for highly competitive topics? If it lasts more than 6 months, does that mean it's already a penalty and not the sandbox anymore? Any ideas?