That is to say, should one be checking regularly to see whether their sites are out of the sandbox, or only when they know there is a major Google update? :)
Thanks
Mc
Tell that to advertisers who paid for phantom circulation in newspapers like NEWSDAY and the CHICAGO SUN-TIMES. :-)
True, there are abuses, and it's also well known that they inflate circulation by continuing to send out publications after cancellations, but at least there is some regulation and oversight... and these abuses aren't really the rule. Watch what happens to the careers of those responsible. With some 2/3 of revenue coming from ads, they've really hurt their rags' credibility for a long time.
AdWords? Competitors can click through expensive links, and they don't work anywhere near as well as natural listings. In our case, I don't think they work as well as a targeted ad in a business mag.
...sandbox? Off topic... sorry.
Our major site has been sandboxed since Sept. 23rd... has anyone made a comeback yet? What is the record for time spent in the box?
What I see is failure: folding to the pressures, adjusting problematic components to chase higher income. It's working for the moment, but that's a strategy that depends on the earlier success and the surfing patterns engendered by that success, plus the current lack of meaningful competition.
Oddly, there are easy-to-find precedents for this type of behavior. When MS has had serious competition, it has put out very good products; when the competition was destroyed, innovation slowed, and in some cases, like IE, stopped altogether.
If a search engine can't deal with the web, it won't last. The web is a fluid, ever-changing medium, not a collection of old, established sites.
Altavista failed for much the same reason. Making excuses for a company's failure is an odd approach to take to SEO work, but each to their own, I guess.
Anyway, you can rest assured that MSN does not share your beliefs that the web can be searched and served to users only by applying a massive block to new material and sites. Killing spam has to be done differently long term, and the company that figures that out while maintaining freshness will win.
My theory on why Google has not updated:
MSN will be releasing their search engine in the next couple of months.
MSN is constantly tweaking its algorithm daily; this is pretty obvious to anyone repeating searches on MSN beta. I think MSN is checking their results against Google's, trying to at least be on par with Google.
Google is the gold standard, and it would only make sense that in order to compete, the search results would have to be as good as Google's.
I think Google knows this and will wait for MSN to launch its new search results (on par with Google's current stale index). MSN will have all kinds of press about how great the results are and what a good alternative to Google it is.
And then, whammo: Google releases the database of updated SERPs it has been perfecting for nine months. Less spam, more relevant, more content, etc.
It then has independent companies compare Google's results with MSN's, releases the comparisons, and Google wins the upcoming PR war.
This is just my theory, though. It could happen differently lol
Google is the standard; they have a pretty large index relative to everyone else. When MSN and Yahoo can get their indexes near Google's size, they will face the same challenges. It's one thing to hold 2 billion pages; it's quite another to store 8 billion.
My site is probably sitting in a sandbox but it is in a pretty competitive area. Why should Google think my website is any better than those that have been around for years? Most of the "money" term questions have been answered thousands of times.
I have spent many hours writing content; sure, I hope it pays off. Until then, I just keep writing: 6 pages a day, 400-800 words a page, every day, after my regular job is over.
I have to admit that my typing accuracy and speed is greatly improved!
"Anyway, you can rest assured that MSN does not share your beliefs that the web can be searched and served to users only by applying a massive block to new material and sites. Killing spam has to be done differently long term, and the company that figures that out while maintaining freshness will win."
Correct! And spam will never be killed by relying on algorithms. Algorithms will always be beaten eventually. Killing spam will require manual intervention on a large scale, but the web would be far better for it. All it takes is for the search engines to announce that they are going after spam, roughly define it, ask people to report it, then nuke it.
Dead easy, and for the benefit of all concerned.
My rough breakdown of new sites:
1) 75% content-scraped affiliate sites with duplicate content
2) 5% doorways to existing sites
3) 10% rehashed content with little value
4) 9% bizarre ramblings
5) 1% fresh and original
Look at these threads... 'I have launched 10 new sites and they are nowhere...' I bet most of these posters are not building sites for new businesses but are generating sites on topics they know little about, trying to make a fast buck.
99% of new sites are probably worthless. It is better to organise and filter existing sites before you add 99% more rubbish.
If Google continues to weed out the existing rubbish, which takes time, and continues to wait for new sites to pass some strict tests, like decent inbound links, then this seems a very sensible strategy.
Joe Public doesn't notice whether a site is fresh or not. Look at the search terms they use. Search terms are so basic and simplistic that you don't need a new site to satisfy them; an old site that has passed all the tests is just as good.
When Joe Public uses a search engine, it is because they don't know any sites about the topic, so how can they care whether the site is old or new! Existing sites cover news stories, so the spectrum is covered. Obscure searches can pull up new sites, so that's that done. Game over.
Joe Public does not need new sites, because to them old sites ARE new.
If old sites were not valuable, then why do people 'bookmark' a site and return to it over and over again? You have to look at the profile of a search engine user: they are generally NEW to a topic, so all results are 'fresh' to them. People researching a topic in depth will use more sophisticated search terms, and newer or specialist sites will rank for those.
The point is, and will remain, that Google has developed a serious incompetency.
Archive.org already has the job of being the repository for old sites.
And the notion of 'game over': what are you saying? That there's no new knowledge to be had out there?
Do pre-Sandbox sites represent some kind of final, resting, informational nirvana only equalled by the Renaissance?
Perhaps when all text ever written is available online we'll be a little closer to the web being a "full" resource.
In any case, I'd guess that the sandbox (assuming that it exists) is merely a temporary band-aid that Google has applied while developing longer-term solutions to the problem of boilerplate affiliate pages, "made for AdSense" scraper sites, and other clutter that makes it harder for Google to fulfill its stated corporate mission of organizing the Web's information and making it universally accessible and useful.
I'm just going by what people say here. Nearly all are complaining about affiliate/scraper sites and spam dominating the current index. Therefore, unless some miracle has occurred, new sites contain affiliate/scraper spam in much the same proportions.
SEO caused the need for the sandbox. We are all to blame for the problems we have now; it is not Google's fault but our own. We have shared ideas and tactics on how to get our sites to the top, with the greed and arrogance to believe our sites are best.
The top positions should be gained through fair democracy (honest links) and not insider knowledge.
"..that there's no new knowledge to be had out there?"
Yes, new knowledge is rare and any new knowledge can be served by qualified sites already listed.