Forum Moderators: open
That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
We are all to blame for the problems we have now; it is not Google's fault but our own.
Speak for yourself. All of my websites are legitimate and provide useful information and services. NO Directories or scraper sites.
What you are also forgetting is that Google claims that "Google's mission is to organize the world's information and make it universally accessible and useful."
"Universally accessible?" I don't think so.
With the advent of every chancer now building sites, new sites probably are:
1) 75% content-scraped affiliates with duplicate content
2) 5% doorways to existing sites
3) 10% rehashed content with little value
4) 9% bizarre ramblings
5) 1% fresh and original
Yes, new knowledge is rare and any new knowledge can be served by qualified sites already listed.
These two comments are sweepingly moronic. More new sites are being built because more people are realizing that they can contribute something to the web, be it a content site, a business brochure, a sales letter...the key to remember is that this is what we want. This is the whole raison d'etre of the web: to disseminate free knowledge.
To claim that it is unproblematic for only old, established sites to contribute to that dissemination is to assume that only individuals who realized they could contribute PRIOR to May 2004 (or whenever) have anything useful to say.
Are you saying that because someone did not launch their web site four months earlier than they did that they have nothing to contribute to their niche? As if timing has anything to do with knowledge? Please.
Obviously, many (probably most) existing, indexed, SERP-populating pages are regularly updated. Spend any appreciable amount of time here on WebmasterWorld and you can see that most site-runners know the importance of new content for staying at the top. But just because www.establishedWidgetSite.com updates their widget information every day doesn't mean that www.brandNewWidgetSite.com doesn't have anything new or valuable to offer.
And it certainly doesn't mean that search users don't deserve to have access to that information. Whatever the cause of the sandbox effect - antispam (doubtful), mistake (doubtful), secondary-tertiary index (probably) - the bottom line is that searchers suffer. You go online to find as much relevant information on your search as you can. You use a search engine assuming they are making every effort to deliver those relevant results to you. Only, behind the scenes, it turns out there's a whole 'nother internet out there that isn't showing up in your browser. Disappointing, to say the least.
cEM
I take your point, but it is not for a spider to decide what is good or not; it is for other humans. Google made the mistake of ranking new sites too quickly and filled the SERPs with rubbish.
" As if timing has anything to do with knowledge?"
Exactly. Being new does not mean a site is any good. It is illogical to rank new sites above sites that have pedigree just because they are new. Quality would suffer if you showed new sites without the human process of 'voting'. These votes, in the form of links, need to be evaluated very carefully, and that takes time.
There are practicalities that need to be accounted for:
1) There are only 10 spots on the first page. If you had a policy of only showing new sites, it would last ten minutes before another 100 new sites came online. Logic dictates you show only the best as perceived by the quality of inbound links, as you cannot possibly show them all.
2) Spiders are dumb. Therefore you have to wait for people to vote and make sure those votes are real.
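Purely as an illustration of the "wait for votes and make sure they are real" idea described above (this is not Google's actual algorithm; the threshold and function names are invented for the sketch), one crude implementation is to let a link count as a vote only once it has survived for a minimum age:

```python
from datetime import date

# Assumed, purely illustrative threshold: a link must be this old before
# it counts as a "real" vote, so freshly manufactured links cannot boost
# a page immediately.
MIN_VOTE_AGE_DAYS = 90

def trusted_vote_count(link_first_seen_dates, today):
    """Count only links old enough to be treated as genuine votes."""
    return sum(
        1
        for first_seen in link_first_seen_dates
        if (today - first_seen).days >= MIN_VOTE_AGE_DAYS
    )

# Two old links and one link discovered less than 90 days ago:
links = [date(2004, 1, 15), date(2004, 10, 20), date(2003, 6, 1)]
print(trusted_vote_count(links, date(2004, 11, 1)))  # → 2
```

The point of the sketch is only that time itself becomes part of the evaluation: the fresh link exists, but carries no weight yet.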
"the bottom line is that searchers suffer."
And they will continue to suffer unless the human voting system is rigorously applied. Older sites that are already in will drop without votes; new sites won't get in without votes. That's the situation now; the effects take time to surface, but quality will result.
" it turns out there's a whole 'nother internet out there that isn't showing up in your browser."
They are there, just very deep. As a searcher, you want sites that others have found useful and voted for, and these will usually be older sites. At present there are sites ranking well based on corrupt votes, but they will drop as the new system takes over... it's called Hilltop.
It is illogical to rank them above other sites that have pedigree just because they are new.
I have had a few beers (well it is Saturday night!) but here goes.
It is not illogical to rank them above other sites if they are better than the other sites. To claim otherwise is just plain stupid. The Internet is based on democracy (I think?). Search engines should put the best sites at the top of the rankings, new or old. If they cannot do that, then they have failed. They may be making their shareholders happy, they may be making money, but they have still failed.
(Did I do OK with a few pints of Abbot ale and Guinness in me?)
"They are there, just very deep. As a searcher, you want sites that others have found useful and voted for, these will usually be older sites."
Perhaps, if you are searching for widgets. But if you search for a company, restaurant, candidate, charity, etc. by its name, even the Joe-est of Joe surfers will hope to actually find that company, restaurant, candidate, charity, etc. somewhere on the first page of results. When other engines do present these entities on the front page of results, it is (and should be) an embarrassment for Google.
I once tried to significantly reduce the amount of email spam our company was receiving. I spent way too much time creating server-side message rules, etc. I eventually came to the conclusion that the web is much too big to hold off manually. I'm sure Google figured that out a long time before I did.
Conclusion: There is no apparent rational reason to discriminate against newer sites.
Personally I figure there is a dampening effect on new links and the knob is turned up too high.
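The "dampening effect with the knob turned up too high" idea can be sketched very loosely (this is speculation about a mechanism, not anything Google has published; the function, the ramp shape, and the `tau` parameter are all invented for illustration). A new link's weight could ramp from zero toward full strength as it ages, with one tunable constant controlling how long the ramp takes:

```python
import math

def link_weight(age_days, tau=180.0):
    """Hypothetical dampening: a link's weight grows from 0 toward 1
    as it ages. 'tau' is the knob; a larger tau means a longer,
    sandbox-like delay before new links carry real weight."""
    return 1.0 - math.exp(-age_days / tau)

# A brand-new link is worth almost nothing; a two-year-old link is
# worth nearly full value.
for age in (0, 30, 180, 720):
    print(age, round(link_weight(age), 2))
```

Under this reading, "the knob is turned up too high" just means tau is set so large that legitimate new sites spend months at near-zero weight.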
BeeDeeDubbleU, you did very well. :)
I bet MSN comes out with a way bigger index than Google's wimpy fake 8-billion-page index when it goes live to the public. The beta is just a test with a small number of pages. The way they've been spidering makes me think they have a much bigger capacity than Google.
But a huge number of OLD sites that appear in the SERPs are NOT pedigree.
It is logical to remove those that aren't but this will never be achieved by an algo - only by human intervention :)
EW
As I said earlier, Google claims that its mission is "to organize the world's information and make it universally accessible and useful." It is not now doing that, but it has made no statement or admission about this, so it clearly feels that it is above the law in this respect.
I only have two or three real competitors, and the rest of the results are just single pages that mention the term once; some are even GeoCities sites that rank higher than me!
I rank well for one term, but not at all for the other; I've just never been ranked at all in Google for the second term. Seeing as there are only two sites that come close to my SEO levels and anchor text, I should be on the first page, and Yahoo, MSN, and every other search engine agree, apart from Google it would seem. My current search engine breakdown is like this:
Term 1
Google: 1st
Yahoo: 2nd
MSN: 2nd
Term 2
Google: Not ranking
Yahoo: 6th
MSN: 10th
I started my site in Dec '03 and I am still not visibly ranking for the keyword (at least not in the first 1,000 results in Google). Now this is really strange because I have the most anchor text (for the term), backlinks, and PR out of all the sites targeting that term. What makes it more frustrating is that some sites that don't even mention the word, have no authority rank, and have no PR are ranking higher than me!
At first I thought my site was a good example of the so-called 'sandbox effect', but I just can't believe that Google could sandbox my site, for this term, for as long as it has.