Please tell me more. My PR is 4 on domain.co.uk, but index.asp is 0. How do I resolve this issue in simple steps 1 to 10, as I am a plank at times.
Sounds like you are on a Windows server, so a mod_rewrite redirect is likely not an option, as noted in message #334 on [webmasterworld.com...]
Maybe try the [webmasterworld.com...] suggestion, and also make sure all your links to the homepage point to www.site.com/ and not www.site.com/index.asp (or to site.com/ if you have opted for non-www).
Sorry... can't be more help.
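One more thought: since mod_rewrite is out on IIS, the usual fix is to issue the 301 from inside index.asp itself (set Response.Status = "301 Moved Permanently" and add a Location header). Here's the decision logic, sketched in Python purely to illustrate; the hostname is made up, and you'd really do this in your ASP page:

```python
# Sketch of homepage canonicalization: duplicate homepage URLs get a
# permanent (301) redirect to the site root so the homepage has one URL.
# Hostname is a placeholder; in classic ASP this logic lives in index.asp.
CANONICAL_HOST = "www.domain.co.uk"

def canonicalize(path):
    """Return (status, location) when the path is a duplicate homepage URL
    that should be 301-redirected, or None if it's already canonical."""
    if path.lower() in ("/index.asp", "/default.asp"):
        return (301, "http://" + CANONICAL_HOST + "/")
    return None

print(canonicalize("/index.asp"))  # -> (301, 'http://www.domain.co.uk/')
print(canonicalize("/"))           # -> None
```

That way both search engines and visitors only ever see one URL for the homepage, which consolidates the PR you're losing to index.asp.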
Going back on topic, I strongly believe TrustRank has played a huge role in this update.
Perhaps the TrustRank algo has come out of beta and this is what we'll see more often.
4 years old - PR 4 - 5 pages of new content added every month
no tweaks to meta tags or titles.
Result: rankings are solid and hardly move; the site has been there for the past 2 years at least.
2 years old - PR 3 - many pages added every month; often receives tweaks to titles and meta tags
Result: the web site is up and down, and has currently disappeared for some phrases.
I think there's now a major shift in the way you need to build sites to rank well.
From day 1, keep it clean: no black-hat SEO.
Keep titles short, clean and relevant.
Don't adjust meta tags; concentrate on building content.
Don't exchange links; just encourage people to link by providing code.
Wait 12 months before expecting anything special.
If you can't wait and make changes, you could lose some trust.
I believe that I have evidence that there is in fact a sandbox.
I had to give up a domain that ranked on page one for many years for relevant terms. I had all pages removed from the Google Index, then put the same exact pages up under a new domain. The new domain has been indexed and in Google, but ranks for none of the terms it did before. Either there is a sandbox or the concept of "Content is King" is dead.
>>that ranked on page one for many years for relevant terms
Metro, while it's possible that your new site is seeing a sandbox effect, it's also likely that the old site had much better linkage, particularly if it had good content and was at the top of the SERPs for a while.
I like your theory.
|The everflux crawl determines that an existing site has changed ownership, moved or for some other reason decides that its TrustRank is invalid, so it revokes its existing TrustRank. |
Is private domain registration spammy? If so, would changing it to public help?
"I believe that I have evidence that there is in fact a sandbox.
I had to give up a domain that ranked on page one for many years for relevant terms. I had all pages removed from the Google Index, then put the same exact pages up under a new domain. The new domain has been indexed and in Google, but ranks for none of the terms it did before. Either there is a sandbox or the concept of "Content is King" is dead."
I have the same exact problem.
Don't bother the poor guy; he has his cat Frank to take care of, as he and his bosses take care of many of you.
I think the bottom line is that anybody who knows what it takes to get a site ranked well on Google in competitive 'money' areas can tell you whether or not a site is sandboxed. All things being equal, a new site will most probably be held back from its 'true' rank for at least 3 months, while an established site is unlikely to get the same treatment. Then one day you will wake up in the morning to find that, as if by magic, Google all of a sudden likes your site and has decided to give it a major boost in the SERPs...
There are lots of reasons why your site may not rank well, but in this instance what we have come to call the sandbox is basically Google treating new sites in a much harsher way than existing sites. It happens, it exists, and it may be avoidable; but if there are golden rules to avoiding it, no one is prepared to let the cat out of the bag, to my knowledge. The other factor causing confusion is how Google determines which new sites to apply the sandbox filter to. In my experience, new sites in highly competitive 'money' sectors seem to be a hopeless case and are almost certainly going to get sandboxed!
A good synopsis Energylevel. I also think that over optimisation is one of the factors that determine both whether the filter is applied and its duration.
What we've tried to do here is launch a number of sites in the last year or so with different levels of on-page and off-page optimisation. All the sites were in highly competitive money areas, and we consistently had to wait at least 6 months to get what we now call their 'real' rank on Google. In one case the site seemed to come out of the sandbox only to be dumped back in a month or so later; then, after a further 3 months, the site came out of the sandbox again with some great results on Google and has been doing well ever since... I do think aggressive link acquisition for new sites could be a major player in keeping you in the sandbox longer, but beyond that I haven't got the answer to avoiding the sandbox!
The absolute extreme of non-optimisation we tried was a site with just a couple of inbound links (we waited a few weeks after launch) and none of the 'normal' on-page optimisation techniques... the site still got sandboxed...
The plot thickens! ... Maybe some do know the exact steps to avoiding the sandbox for a new site, even for the really commercial sites. I don't blame them for keeping it to themselves if they do...
Conclusion... it's grossly unfair. I see really good new sites from respectable, legitimate businesses doing nothing in the Google SERPs, whilst sites that have been around for a while continue to ignore Google's guidelines and get rewarded for clear blackhat techniques and huge inbound link acquisition that clearly cannot be natural, IMO. Some of the sites I see have been doing this for years without getting the boot!
How can Google claim to have the best results when they continue to keep large numbers of good-quality new sites from featuring anywhere of significance in their results? That could mean that in many areas Google is 6-12 months behind in seriously featuring many new quality sites.
One thing you perhaps haven't considered is whether Google used whois data to apply the filter to all your sites? Could they have attributed them all back to you / your company and applied the sandbox that way?
Just a thought...
I don't think that's the problem. I usually get my clients to register their own domain names and hire their own hosting space. This means that the sites I launch are on different IPs but they still get the treatment.
BTW I also agree that it is grossly unfair. It's a bit like everyone in the class getting caned because the culprit won't own up :(
Ho hum... back to the drawing board!
>>How can Google claim to have the best results when they continue to keep large numbers of new good quality sites from featuring anywhere of significance in their results?
If there is anything that will cause Google to back off the sandbox effect it is this exact question. If people find fresher, but not spammier, content at other search engines, Google will see some attrition.
Even "regular" users can perceive a difference in quality and, over time, will migrate toward the best results.
>How can Google claim to have the best results when they continue to keep large numbers of new good quality sites from featuring anywhere of significance in their results?
I think this reflects a major misunderstanding of what people perceive as quality, as well as what Google wants to produce.
It doesn't matter how many "good quality sites" don't appear in the top ten. Fact is, all but ten of them won't, regardless of the search engine.
All that matters is how many "poor quality sites" appear in the top ten. THAT's what makes results look spammy. And if Google could guarantee every search resulted in ten adequate sites in the top ten, by simply contriving to put all websites in a sandbox for five years -- reckon you could count the milliseconds it took them to make that decision on your fingers? Yes, and have some fingers left over!
Put it another way: there are very very few sites that couldn't disappear from the internet without visibly degrading Google searches. I very much doubt that more than 0.1% of webmasters have ever created one of those. And so ... losing any particular one of those sites is no harm at all.
What DOES harm surfers, is listing one spammy site.
It's not about asking whether the glass is half full or half empty. It's about asking whether the glassholder is collecting water or pushing out the air.
|if Google could guarantee every search resulted in ten adequate sites in the top ten, by simply contriving to put all websites in a sandbox for five years |
But of course they can't.
Doesn't this line of thinking presume that Google believed the spammy sites wouldn't be built if the builders thought they wouldn't show in G serps?
I don't think Google is likely to believe that, nor that the spammers are going to give up on G because of it. Especially since the spammers have other SEs to target while waiting to rank in the G serps.
Listening to Matt Cutts at Pubcon, I came to think that the ranking delays we are talking about were the unplanned, but perhaps beneficial (in Google's mind at least), result of two or more perhaps random algo factors working together.
For all we know it could be the interaction of any two or more of the many factors in the algo.
Personally I think of these delays as being the result of a Delayed Ranking Syndrome (DRS)
Hutcheson post # 1082
That idea is very interesting. I think you have hit on a very important concept. I have had similar thoughts but couldn't express it as well as you did.
Let me see if I understand your idea. It seems reasonable to suggest that what any SE is trying to do is provide the best results that they can. But it matters very little what the results are beyond the first page or two. It is obvious that searchers very seldom search much lower than that. They might look, but they don't click. My experience is that this is likely true. The search numbers below spot 20 are so small that it hardly matters what the results are. A case in point: for a particular KW, at the #3 spot for 2 months, average unique visitors 293/day. Fell to #21, average unique visitors 7/day. Moved up to #16, average 11/day.
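For what it's worth, the drop-off in those numbers is brutal when you work it out (a quick sketch; the only inputs are the figures quoted above):

```python
# Traffic at each ranking position, expressed as a share of the #3
# position's traffic, using the visitor counts from the post above.
visitors_per_day = {3: 293, 16: 11, 21: 7}
top = visitors_per_day[3]
for rank in sorted(visitors_per_day):
    v = visitors_per_day[rank]
    print("position #%d: %d/day (%.1f%% of the #3 traffic)" % (rank, v, 100.0 * v / top))
```

So falling from #3 to #21 kept only about 2.4% of the traffic, and even climbing back to #16 only recovered about 3.8%. Everything below the first page is effectively invisible.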
If it is the first 10 results that are in the minds of both the SEs and the users, this is a very serious matter for both WebMaster and Search Engines. There are hundreds of equally good sites available to be listed in the first 10 spots in the SERPS.
The Search Engines job:
1. For every search phrase, get 10 good sites (out of the hundreds of equally good sites) to show up as the top ten.
2. Don't put even one Spammy or irrelevant site in the first 1 or 2 pages.
The Webmasters job:
1. Try to find out how to be one of those top 10.
2. Try not to ruin your site for your customers while doing #1
If this analysis has any validity, it certainly changes the focus. It makes the SE's job easier, but it makes the webmaster's job much harder, if not nearly impossible.
I don't know how to edit a post. In my last post I meant Hutcheson post #227 (oops).
hutcheson is correct. One reason Google is still a better search engine than MSN in the minds of many searchers (me, anyway), is that Google's first SERP page usually has all good sites, while MSN's is maybe 5 good and 5 bad sites.
It's not a matter of bringing the best sites to the first page a lot of the time. It's a matter of bringing good enough sites to the first page and not bringing up junk. The sandbox keeps the junk out. It keeps out many good sites, too, but the existence of the sandbox can result in an overall better search experience for the users, who get annoyed by bad sites in the SERPs. It's also a reason Google is a better search engine than it was two years ago (as measured by how few bad results it produces).
Most topics on the web have been covered adequately 100 times over. You don't need the "best" site to be at the top, just adequate ones.
What would you rather see?
1 - A website that looks like it was built in 1995, black text on a white background, with outdated information? or
2 - A spam-filled search engine of advertisements for advertisements, that wants you to set your home page to their site before you leave?
Better safe than sorry. Google's just playing the odds. Hutcheson's right.
It seems reasonable to assume that Google's priorities are to keep the spam sites out of the top 10 and keep the top 10 useful.
It just doesn't seem that those priorities are achieved by merely delaying the entrance of a spam site by 6 to 12 months.
It is obviously less important to keep that spam site out of the SERPs for the first 6 months than it is to keep it out for the next 10 years.
That does seem like pretty bad logic for Google to just keep the spam out for 6 months or so when the spam ought to be prevented from ever getting into the index at all.
Maybe that faulty logic is explained by Matt Cutts' comment that this effect was not created by deliberate programming of the algo, but rather was just something that happened when some of the elements in the algo came together.
Perhaps better things can come from this now that it has been brought to Google's attention.
|The absolute extreme of non optimisation we did with a site was just a couple of inbound links (waited a few weeks after launch) with none of the 'normal on-page optimisation techniques .... the site still got sandboxed... |
If a site that is poorly linked and devoid of detectable seo engineering can make it into the box, maybe TrustRank (as previously theorized by Selkirk) is the x-factor.
Further, knowing that some sites manage to evade the box, it would follow that there has to be a way for a site to have some TrustRank at inception (DMOZ link? Expert or Hilltop link?)
Matt's indication that the sandbox effect is a combination of factors not previously seen adds weight to the idea that TrustRank--or the lack thereof--is the culprit.
As it affects so many sites, it amazes me that Google's own pre-introduction testing of TrustRank did not yield evidence of the Sandbox effect. Is it possible that they only tested it on aged sites?
Let's assume for the moment--I know that I'm pushing it here--that being outside of the Google site model voids any positive TrustRank factors.
Outside of site model + zero TrustRank = Sandbox
Outside of site model + positive TrustRank = Sandbox
Fit site model for new site + zero TrustRank = Sandbox
Fit site model for new site + positive TrustRank = no Sandbox
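Those four cases boil down to a single rule. Here's a toy encoding, purely speculative of course: "site model" and "TrustRank" are this thread's guesses, not confirmed ranking factors.

```python
# Toy encoding of the four cases above: under the hypothesis, a new site
# escapes the sandbox only when it both fits Google's site model AND
# carries positive TrustRank. Pure speculation, not a known algorithm.
def sandboxed(fits_site_model, positive_trustrank):
    """True if, under the hypothesis, the new site stays in the sandbox."""
    return not (fits_site_model and positive_trustrank)

for model in (False, True):
    for trust in (False, True):
        verdict = "Sandbox" if sandboxed(model, trust) else "no Sandbox"
        print("site model fit=%s, TrustRank positive=%s -> %s" % (model, trust, verdict))
```

In other words, positive TrustRank alone wouldn't save a site that falls outside the site model, which would match the cases listed above.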
If TrustRank were only tested on aged sites (far more likely to fit the site model) the Sandbox effect would have been difficult to detect.
There seem to be too many variables at play here to come up with anything definitive. I hope that better minds than mine are working on this. Short of that, I think I'm just going to have to learn to live with the box.
>>if Google could guarantee every search resulted in ten adequate sites in the top ten, by simply contriving to put all websites in a sandbox for five years
To which the reply:
>But of course they can't.
Yes, what I suggested would not work. And I'm sure Google knows it.
>Doesn't this line of thinking presume that Google believed the spammy sites wouldn't be built if the builders thought they wouldn't show in G serps?
I don't think so. I believe Google when they said it was an unintended consequence. (Which may be another way of saying that confirmed my suspicions.)
But I also believe them when they said it seemed like a good consequence. (Again, perhaps because it confirmed my suspicions.)
But -- now assuming that Google has noticed that what they DID do (for the sake of quality) creates something you're calling the "sandbox effect" on new sites: do you think they would really care that all of the top ten sites were more than a year old? Or would they consider it an additional feature, a free bonus -- an indication that the site was stable, not a fly-by-night?
I don't think it's worthwhile doing something because it'll discourage spammers -- they're chasing dreams of avarice anyway. But I am sure Google is careful not to do anything that could be used to OVER-inflate the rank of new sites. After all, that was the Achilles' heel of Altavista. And see what happened to them -- spammers killed them with spamming page suggestions, and Google ran over the roadkill.
After all this is said, there is still the frustration that we all feel when we try to get our sites found and don't even have a chance to compete for the few valuable spots in the SERPs. I for one feel like this sandbox effect is a bit of a cheap shot, not to mention the exceptional site that occasionally comes along that is not available to searchers. There has got to be a better way. If it truly was just an inadvertent result, I am hopeful that Google is working on an intentional solution.
|I for one feel like this sandbox effect is a bit of a cheap shot, not to mention the exceptional site that occasionally comes along that is not available to searchers. |
Many of us in here make the mistake of thinking that our own sites play an important part in the overall content of the Internet. The fact is that there are very few sites around that would really be missed if they disappeared overnight. There are so many others to take their place. When you start to think outside the box, the sandbox has no real effect on what Google's users see.
I have a site that has just entered its second year in the sandbox. It is based on a free service that I created and that you normally have to pay for elsewhere. Google sends me no traffic and this is painful for me because I put a lot of work into this site. But, do Google's results appear any worse for this? The real answer must be no. This is why they are unconcerned about the collateral damage caused by the sandbox effect.
The sandbox keeping many new but relevant sites out may not be noticeable for searches where there are hundreds of relevant websites to rank from. But as searches get more and more specific, the absence of relevant sites becomes noticeable.
As a rule of thumb: for any search, if a couple or more monster sites with only an oblique reference to the search term show up, while newer/smaller but relevant sites sit well down the SERPs, you feel the presence of the sandbox. With Jagger, I am seeing more of those monster sites. It's like you search for "Red Widgets" and www.onebigoldsite.tld/page1/news/date091198.jsp ranks with content that has "Ever been to the Red city? Red city is known for its summer Widgets... blah..blah" in its 17th line.
I have almost switched to searching with allintitle: on Google.
As a side note, I like the results of Clusty.com a lot better. I don't rank #1 for all my sites though ;)
|We know from experience that many sites come out of the box after a quarantine period when nothing changes apart from their age. |
And if a Google representative says AGAIN that there is no sandbox, don't you think you should start looking in the other direction?
That quote above could be an excellent starting point.
So, if you (just for the fun of it, of course) begin with the "No Sandbox" theorem, what could be the possible answer(s) for the effect observed?
- Something on the page started getting its points, due to age or some other value
- Something off page started getting its points, due to age or some other value
- New addition of some off page elements
- ...Anything else you can think of in that direction