That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
The problem is that legit sites with some SEO are being dropped down, while spam with very little content for the human visitor and massive SEO is rising to page 1 like never before.
You make it sound like only the cream of the crop are going to rise to the top, but you are wrong.
GoogleNews exposes the story to everyone on Earth, before the nephew is even finished stuffing his meta-tags.
(They send out email alerts, too ;) )
If you think I'm calling good sites spam, just because they are page 1, you are wrong (again). To imply that I call some sites/results spam because I'm bitter is insulting.
You seem to think quite highly of your own opinion, which might make it hard to see that not everything you say is correct.
Spam is spam, by definition a bunch of crap that nobody wants, and Google has a bigger problem with it than ever from what I can see.
I'm talking about self contained networks of useless garbage, not well established information sites with natural links from other on-topic sites, patiently built up over time.
Some webmasters will think they know everything despite evidence to the contrary, business as usual.
>Google does not see it this way so Google has a problem!
No, you have a problem. Your problem is that you think your site should be ranked highly and it isn't. Until you get enough people to agree with you and link to you, you will continue to have a problem.... this now takes time.
I'll mention again...the consensus here seems to be that google needs to explain the sandbox effect or fix the problem. Failure to do so will adversely affect their relationship with webmasters, who provide the content they index.
The problem I see is, since the time of the sandbox, I've noticed a lot more clutter. It seems that for some terms, one person (or one team) can fill 20 pages of SERPs with their computer generated junk pages on throwaway domains, while good sites are spread thin throughout.
The first page used to be full of very informative or authoritative sites exclusively, at least in the sectors I follow for hobbies etc. Now I don't see that as much.
Having to dig down to page 10 for relevant sites is something I'm getting used to, but I don't like. I know that good sites are down there though, so I do dig that far.
Why? Their share price looks fine, they had better than expected results recently, and if every website that contributed to this thread disappeared I doubt it would have much effect on Google..... oh, hold on, apparently everybody's website has disappeared.... and it didn't affect the share price :)
Joe public doesn't know about the sandbox and apparently doesn't care. Let's have a reality check: Google has its own agenda, and to assume they have a problem that they will fix is pure wishful thinking. They have a policy of secrecy, which is their privilege.
I can see no reason why they should rank new sites highly just because they are new. It makes sense to me that they err on the side of caution and only when they are good and ready start ranking them.
Presumably, with sandbox, they will have a problem in the future. Which is why I believe the sandbox is there. These sites are getting lucky now, but no discerning webmaster will link to them, so in the long game they will be overtaken by the 'cream' which will attract the quality links.
It's a brave move by Google, and they can't announce "oh yeah, we know we have poor results at the moment but we are playing a long game..."
>I can see no reason why they should rank new sites highly just because they are new.
Do you think there is even one surfer anywhere who is hoping that Google won't show him the answer to his query simply because it resides on a new domain? It's not that we want a bonus for having a new site. We just want our sites to be judged based on the quality and usefulness of the content, same as our old sites.
>They have a policy of secrecy, which is their privilege.
Yes, that is their privilege. But it is a policy that is rarely, if ever, rewarded.
>These sites are getting lucky now, but no discerning webmaster will link to them
Why wouldn't a webmaster link to them? After all, they are apparently the authority now.
I doubt there is a query that cannot be answered by an existing established site. If the search phrase is non competitive then a sandbox site will appear anyway. However, the issue is that Google would rather show sites it trusts, than new ones with few links in and/or natural 'recommendation'.
>Why wouldn't a webmaster link to them? After all, they are apparently the authority now.
Clever :) But in truth how many know about 'authority' and in the grand scheme of things most will not link to them out of principle.
I can see the temptation to believe that Google is fully in control; Google continues to enjoy some degree of what was called 'teflon' in the USA during the Reagan years. However, if you reject the assumption that Google is fully in control currently, it tends to create a scenario where almost all the observed events can be explained, not just the ones that make the company look good.
"has googleguy said anything"
Who cares what googleguy says? What he says will be one of the following:
1: true
2: false
3: partly true, with just enough truth to make it convincing.
since I can't know which it will be, I ignore everything he says.
[edited by: lizardx at 9:56 pm (utc) on Nov. 28, 2004]
That is completely backwards. Getting high quality links is the worst thing to do in terms of sandboxing. If you want to fight the sandbox, step one is to not get *any* links from the top 100.
(The low quality way to avoid the sandbox seems to be working less well this month, but high quality links still are the kiss of death.)