Forum Moderators: open

Message Too Old, No Replies

Sandboxed Sites - Back Together?

Do they come out together or one by one?


McMohan

10:09 am on Nov 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the new sites that I work with are still in the sandbox. I was just curious to know: do all the sandboxed sites come out of the sandbox during one major update, or one by one over the rolling updates?

That is to say, should one be checking regularly to see if one's sites are out of the sandbox, or only when there is known to be a major Google update? :)

Thanks

Mc

McMohan

6:24 am on Nov 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Any idea how many pages the MSN beta has indexed? It shows a larger number when I search for www. Isn't the capacity problem affecting MSN too?

gomer

6:45 am on Nov 26, 2004 (gmt 0)

10+ Year Member



Nice posts renee, Scarecrow - thanks.

Vec_One

9:19 am on Nov 26, 2004 (gmt 0)

10+ Year Member



Yes, excellent posts. Kudos to both of you, and others.

buvar

10:25 am on Nov 26, 2004 (gmt 0)

10+ Year Member



ditto

Hanu

11:22 am on Nov 26, 2004 (gmt 0)

10+ Year Member



ScareCrow, best WebmasterWorld post for a couple of months. Thumbs up.

BeeDeeDubbleU

11:52 am on Nov 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Everything is fragmented, and it's all in the direction of less predictability and less quality in the SERPs. While less predictability may in itself serve to make life difficult for spammers, by now it's gone way beyond anything that can be construed as purely a set of anti-spam measures.

Absolutely on the money!

I cannot argue about search engine technology or the mathematics of this problem, but I do have a master's degree in common sense (from the university of life). This qualifies me to state that this is definitely not an anti-spam measure. How anyone can still argue the case for this is beyond me. This is not anti-spam, it's anti new content. "New" in any other commercial context is attractive, and Google would never deliberately restrict all new sites from featuring. That would be committing commercial suicide.

Also, if this were an effective anti-spam measure, Google, as a commercial entity, would be bragging about it everywhere: "Google announces amazing new spam prevention technology", etc.

Can I refer back to the point I made yesterday in message 188? On a search for a NINE-word phrase, a search engine that cannot find a web page with that exact NINE-word phrase as its page title MUST be defective in some way.

Hanu

12:52 pm on Nov 26, 2004 (gmt 0)

10+ Year Member



BeeDeeDubbleU, mind stickying me that phrase?

BeeDeeDubbleU

1:30 pm on Nov 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Will do!

prairie

2:29 pm on Nov 26, 2004 (gmt 0)

10+ Year Member



It's really hard to believe that they couldn't fix what is happening if they wanted to. No one is better resourced than Google, and lesser engines don't have this same issue.

As such, deliberate or not and to whatever extent, it might be fair to assume that the current situation suits them. That being the case, this could be a very long-term thing.

And keep Yahoo search in mind, because it has always lagged behind in sorting its information out. The rate at which they spider and sort, while obviously acceptable to the public, is decrepit compared to Google's, even with Google's problems.

I've given up trying to fight this head on; these days I just use an older, established domain for important content.

espmartin

4:31 pm on Nov 26, 2004 (gmt 0)

10+ Year Member



Whatever the cause (Google, why do you do this?), I am also affected. I had a top 5 ranking for the competitive term "web design standards" when my site was on a sub-domain of a popular web host (aboho).

Since I bought my new domain, my site no longer ranks anywhere in the top 1000!

It has been about 5 months now :~/ Googlebot visits very regularly, crawling my sitemap and newly added content pages.

How long must I wait?

This 472-message thread spans 48 pages.