That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
Why don't you make it easier by throwing out examples of, say, just 5 such normal sites performing well on competitive keywords?
I cannot give specific examples, and I am also not saying "I have lots of sites that have beaten the sandbox". I am expressing our hopefully well-thought-out and well-researched reasons why sites from February onwards are not, in general, performing in Google.
Pimpernel, how do you explain what I said earlier, that even new pages that don't fall within the keyword categories of an existing (well-listed) site are sandboxed?
I am not sure that I understand the question, but what I do know is that it is simple to get existing sites to perform in Google under new keywords, even new categories of keywords, and it is a whole different ball game with sites created since February. This entirely fits our theory and simply reflects the fact that PageRank flows down through a site, so new pages benefit immediately from the existing PageRank of the site.
mark1615 - See my comments above. Sure, each web page is judged on its merits, but the large majority of a web page's rating comes from internal links (i.e. it is linked to from the home page of the site). So in reality, in most cases, you are actually looking at web sites rather than web pages when assessing rating. It is for this reason that new pages on an existing site have no problem ranking. The problem for new sites is that they cannot get a good rating and therefore cannot pass that rating on to the individual web pages, which are the ones that perform.
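To make the "PageRank flows down through a site" point concrete, here is a toy power-iteration sketch in Python. The link graph, page names and damping factor are all made up, and this is the simplified textbook PageRank formula rather than whatever Google actually runs - but it shows a brand-new page linked from an established home page picking up rating immediately:

    # Simplified textbook PageRank over a tiny made-up link graph.
    # "web" stands in for the external backlinks the old site has
    # already earned; "new_page" is a page added to the site today.
    DAMPING = 0.85

    links = {
        "web":      ["home"],
        "home":     ["old_page", "new_page"],
        "old_page": ["home"],
        "new_page": ["home"],
    }

    pages = set(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):  # power iteration until the scores settle
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for src, targets in links.items():
            share = DAMPING * rank[src] / len(targets)
            for t in targets:
                new_rank[t] += share
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{page:8s} {score:.3f}")

Run it and new_page scores exactly the same as old_page from day one, because the home page's existing rating flows straight down to it. A brand-new site has no such internal source of rating to draw on.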
If I move a page from our domain with the new (possibly penalized) name to an older subdomain, it shoots up to #3 position and stays there. The PR of the linking pages is the same (PR6 index page => PR5 subpage => page in question). All outgoing links on the page were kept the same.
I think the answer to that is: don't believe PageRank is everything. The algorithms that have been suppressing new web sites since February are anti-spam algorithms, not ranking algorithms per se. The above is entirely consistent with our theory that a new site must do far, far better in our traditional measurement terms to beat an old site.
To me it just seems like links are taking longer to have effect. Really old links from DMOZ and Yahoo are gold.
Right on the money! The simple fact is that with a lotta lotta hard work you can beat the "sandbox" effect, although it is highly questionable whether it is worth it. And that is exactly what Google wants to happen - we all give up because it is no longer worth it, and Google can revert to making its own decision about what the most relevant sites are, without any interference from us nuisances.
As regards the sandbox effect, someone posted a message saying call it what you like, the effect is the same. Well, I think we are talking about a fundamentally different thing here. There is no sandbox, because there are lots of sites launched since February that are doing perfectly well. "Sandbox" suggests that every site is affected, which simply is not the case. That is why I don't believe in the sandbox.
And that is exactly what Google wants to happen - we all give up because it is no longer worth it, and Google can revert to making its own decision about what the most relevant sites are, without any interference from us nuisances.
So you subscribe to the belief that Google, supposedly the world's best search engine, is happy to feature no (or very few) new sites for upwards of nine months?
Here is the perfect example that everyone has been harping on about. Bridget Jones: The Edge of Reason has just been released. Go to Google and search for it. Find the official site and check when it was registered:
24 May 2004
This is exactly the type of example that everyone has been quoting, saying Google will not reflect newly released movies' websites. Well, it does, and the question everyone should be asking is: how the hell did it manage that when I can't get my sites to rank?
And the answer is, as I have said before: sites from February onwards have to comply with much, much tougher Google algorithms than sites from before February 2004.
Now look at the quality of sites linking to the official site and you will see why they have beaten the filters and you have not.
But look also at the unofficial site that is occupying the top slot - the domain was registered back in 2001 (had she even written the book then?). It is there either because the site was created and indexed before February, or else because of the quality of the links that it has, which are not bad.
But one thing is for sure - Google is not manually letting sites out of the sandbox and keeping everyone else in.
I guess what I am saying here is that I believe a lot of good brains are wasting their time discussing this sandbox, how unfair it is, and how Google is going to end up on the scrapheap, instead of concentrating on what it takes to get out of the sandbox.
Let's have a few positive postings, eh!
In non-competitive areas, the sandbox is not apparent - at least in my experience. My sandboxed sites receive more traffic from three- or four-word queries than from the more competitive two-word queries. But they do receive good traffic, so they are not at all hidden. Consequently, the user who does more specific searches is happy, too.
The sandbox only causes problems for sites that have to offer fresh and unique content (like breaking news) in competitive areas. But users will quickly learn that they can use Google's news search or that they need to be more specific. The more specific the queries get, the more easily Google can read the user's intent and deliver more accurate SERPs.
It does feature them; they just do not feature very often under competitive search terms.
In my book that means NOT featuring them ;)
We are talking percentages here. How many new sites have escaped the (very real) sandbox? 1%? 2%?
How many new sites are sandboxed? 98%? 99%? No matter how you look at it, the sandbox is very real to most of us. It may be an algo function, and you can call it what you like, but it is very real and it is stopping new sites from being featured in the SERPs.
Regarding your point about Bridget Jones, if I had a site containing the phrase "Bridget Jones the edge of reason", I would expect it to be found in spite of the sandbox. None of these words are competitive terms.
So what do you consider competitive? Presumably mortgage or gambling or travel terms, etc? In these areas, sites created post-February have a mountain to climb to overcome the entrenched (and, yes, favoured) positions of pre-February sites.
One of these sites is for a small specialist consultancy. It contains lots of useful information about the service that is provided and I really mean that. I even found it very interesting myself while I was doing the site.
The site has a few inbound links and PR4. When I do a Google search for a nine-word string of text (that's 9!) that is a page title on this site and that contains no really competitive terms, it cannot be found in the top 200 results.
The term is something like ...
Teaching widget as a widget widget in location country
When I do the same search in Yahoo it is number one. Nuff said?
This is not spam-filter behaviour, and no one on earth will convince me that a search engine that performs as badly as this does not have a major problem. It should surely be capable of determining that an exact nine-word match on a page title must be relevant.
Incidentally, when I do the same search enclosed in quotes it comes up number three, beaten by two sites with zero PR and nothing related to the term in their page titles.
I really do sympathise with your plight, but on the one hand you are saying that the Bridget Jones example is a bad one because the term is not competitive, and on the other you are showing client examples which are all very niche and non-competitive. My point remains - competitive or non-competitive, sites are getting listed and ranked, and the focus of this forum should be on how to "break that filter", not on moaning about Google being broken, a cr*p search engine, etc.
The fact that MSN lists your client's site at the top does not mean it is a good search engine, or "fixed". A case in point - search MSN for Bridget Jones and see if you can find the official movie site. You won't. Now, I say Google is better than MSN at delivering the right results.
Does that reasonably focus the practical question?
Personally, I have lots of pages that rank well in several industries. We sometimes sell advertising on these. If I could figure out how to get a list of sites that have been created since Feb 2004 (WHOIS database?), I could offer our services to them.
My heart goes out to anyone relying heavily on the web to get a newly established law firm or insurance company off the ground these days. I would think that these folks would jump at the chance to get into the #1 position.
Good idea, or too labor intensive?
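For what it's worth, the WHOIS lookup could be scripted. A rough sketch, assuming the standard command-line whois tool is installed; the creation-date field is labelled differently by different registries, so the patterns below are best-effort guesses:

    import re
    import subprocess

    def creation_date(domain):
        # Query WHOIS and grep for a creation-date line. Registries
        # label the field differently, so try a few common spellings;
        # returns the raw date string, or None if nothing matched.
        out = subprocess.run(["whois", domain], capture_output=True,
                             text=True, timeout=30).stdout
        for label in ("Creation Date", "created", "Registered on",
                      "Registration Date"):
            m = re.search(label + r"[:\s]+(\S+)", out, re.IGNORECASE)
            if m:
                return m.group(1)
        return None

    # e.g. keep only domains registered after the Feb 2004 cutoff
    print(creation_date("example.com"))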
What you have to do is about 10 times what you had to do before. Get lots of good-quality links, preferably on theme, make sure the site is indexable, optimise the pages reasonably (nothing over the top), and you will get the good rankings eventually. It may take several months, but it will happen.
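On the "make sure the site is indexable" point, the two most basic checks can be scripted with nothing but the Python standard library. A minimal sketch - the URLs are placeholders, and the noindex test is a crude substring check rather than a proper HTML parse:

    import urllib.request
    import urllib.robotparser

    SITE = "https://example.com"     # placeholder site
    PAGE = SITE + "/some-page.html"  # placeholder page

    # 1. Does robots.txt let Googlebot fetch the page?
    rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
    rp.read()
    print("robots.txt allows Googlebot:", rp.can_fetch("Googlebot", PAGE))

    # 2. Does the page carry a noindex robots directive?
    html = urllib.request.urlopen(PAGE).read().decode("utf-8", "replace")
    print("noindex present:", "noindex" in html.lower())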
The difference from before is that you could launch a site, link to it from a few existing sites within your network and, bang, within a week you were top of the SERPs. No more!
And this is what I mean when I say that Google has written very, very tough anti-spam algorithms that only affect sites created since February.
Now the simple fact is that, for many people, the amount of effort required to do the above makes the return debatable, and many will decide not to go down that path - and Google will be delighted!