
Sandboxed Sites - Back Together?

Do they come out together or one by one?


McMohan

10:09 am on Nov 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the new sites that I work with are still in the sandbox. I was just curious to know: do all the sandboxed sites come out of the sandbox during one fine major update, or one by one, over the rolling updates?

That is to say, should one check regularly to see whether one's sites are out of the sandbox, or only when there is known to be a major Google update? :)

Thanks

Mc

McMohan

6:15 am on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If one has a new page on an old site and promotes it heavily with new links what do people think would be the effect

In all likelihood it will exhibit similar behaviour to a new site, minus a month or two. But remember, if that new page happens to be on a CNN- or Stanford-calibre site, it might rank within a couple of weeks, because the old links are doing it a BIG favour.

Mc

Powdork

6:28 am on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In all likelihood it will exhibit similar behaviour to a new site, minus a month or two.
New pages on mature domains can rank well within hours. It doesn't have to be CNN or even remotely close.

McMohan

7:48 am on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



New pages on mature domains can rank well within hours. It doesn't have to be CNN or even remotely close

Yes. But if the term happens to be a competitive one, even a mature domain will take time.

BeeDeeDubbleU

8:24 am on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hanu

Making guesses like that can be dangerous. For one, the growth of the web is not from new sites only. It is from new pages on old sites, too.

I think claiming that it is "dangerous" is a bit strong :) This is just a speculative figure, but remember that I used the term "as much as 15%", so I will stand by it.

Uh, wait a minute! You've managed to manoeuvre yourself into a corner there. Yes, GG has debunked wrong theories in the past. The fact that he hasn't yet debunked the sandbox tells me that ... bingo!

"Manoeuvred myself into a corner"? I am afraid that I don't get what you mean here? This problem just happens to be one of the most significant things to happen with Google since day 1 and they haven't been able to comment. GoogleGuy has obviously been silenced on this subject. If it were just the effects of an attempted clean up he would be all over this forum like a rash, "There is no sandbox", "Have a look at our guidelines.", etc, etc.

Bring on the media ;)

eyezshine

9:08 am on Nov 24, 2004 (gmt 0)

10+ Year Member



I just think Google is building a bigger, better database that can hold more than 4 billion pages. But for now, to compete with MSN, they have built a supplemental results index with 4 billion pages so they can say they have 8 billion pages.

Then they will go with that until they finish their new, bigger database system, with the new, faster Mozilla 5.0 crawler, etc.

I think it's all in the works right now. Just about 3-4 weeks ago google spidered all my sites pretty deep. Even my "banned/blocked/pr0'ed/hijacked etc" websites got spidered just like they used to back in the good ol' days before all this sandbox stuff happened.

But I didn't see any of those new pages in this new 8 billion page index. As a matter of fact, nothing changed for my sites rankings or traffic from google.

So I expect to see a new update soon when they switch over to the new improved system.

What do you all think?

BeeDeeDubbleU

9:36 am on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This has been going on since February. It doesn't take more than nine months to build a new index. Does it?

Hanu

9:58 am on Nov 24, 2004 (gmt 0)

10+ Year Member



BeeDeeDubbleU,

I think claiming that it is "dangerous" is a bit strong

Sorry for having used the word 'dangerous'. But let's not get distracted by rhetoric.

This is just a speculative figure but remember that I used the term "as much as 15%". So I will stand by this.

What do you say about my other points? You stand by your vague guess, but you don't respond to my rational claims against it. You said the G sandbox hides "as much as 15%" of the web. I responded that your figure can't even be approximately right because A) the web's growth can't be attributed only to the addition of new sites, and B) the sandbox only affects competitive areas. What's your opinion on that? And what's your opinion on the -sdasadad effect?

GoogleGuy has obviously been silenced on this subject.

Has he told you so? I'd stick to the facts here. And the facts are that he (or she) has been silent. Why the silence? We don't know. But I admit that the silence can be interpreted my way (the sandbox exists) or your way (the sandbox is a technical problem). We manoeuvred you out of that corner. ;)

Anyway, I wish GG would say something ...

BeeDeeDubbleU

10:36 am on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hanu ...

What do you say about my other points? You stand by your vague guess, but you don't respond to my rational claims against it.

As you say, we should not get distracted by rhetoric but this was my response ;)

Has he told you so? I'd stick to the facts here. And the facts are that he (or she) has been silent.

Well, amen to that :)

digital

11:33 am on Nov 24, 2004 (gmt 0)

10+ Year Member



I didn't read the whole thread, but let's do a little test. Open, say, www.google.fi or www.google.bg, or Google in a language which is not so common, and search for your position on your main keywords for sites which you think are in the sandbox. Where do you rank? My site is experiencing the following thing: on www.google.fi, for example (I checked this on .es, .bg, etc.), I rank #2-#4 for my main keywords, but on www.google.com, google.co.uk, google.jp... I always rank around #715. My site is around 3-4 months old, and I guess I am in the sandbox too.

Please do that test and post the results.

Thanks.

Namaste

12:37 pm on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My observation is that it is not so much websites per se that are "sandboxed", but keywords for that site which are "sandboxed".

One of my sites is about Widgets. Six months back we added a few new pages about Midgets :), and there is no sign of them in the SERPs; yet the page is indexed, PR 7, and features under a site: search.

However, pages added with Widget keywords came into the SERPs within two weeks.

So I tried an experiment: I added the word Widgets to the existing title of the page. Voilà, the page was soon ranking well not only for the keyword string with Widgets in it, but also for the keyword string with Midgets. Somehow adding Widgets, under which the site was previously classified, got the page around the "sandbox".

This suggests that Google has developed a machine-made directory of keywords, and that all sites are in one or more keyword categories.

It then follows that Google takes time to add a site to a keyword category. Once added, all the pages of the site come under the primary results for that keyword, and ranking is done using traditional factors. Pages on the site lacking the keyword category will not feature until the site is listed for the additional keyword category.
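The theory above is pure speculation, but it can be sketched as a toy model (the site name and category names here are made up for illustration, not anything Google has confirmed):

```python
# Toy model of the keyword-category theory: a page ranks only for
# categories its whole site has already been filed under.
# "example.com" and the categories are hypothetical.
site_categories = {"example.com": {"widgets"}}

def can_rank(site: str, query_category: str) -> bool:
    # Pages rank only for categories the site is already listed in,
    # regardless of the individual page's own keywords.
    return query_category in site_categories.get(site, set())

print(can_rank("example.com", "widgets"))  # True  -> ranks within weeks
print(can_rank("example.com", "midgets"))  # False -> "sandboxed" keyword
```

On this model, adding the established keyword to the page's title would let it ride in under the site's existing category, which matches the experiment described above.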

neuron

1:48 pm on Nov 24, 2004 (gmt 0)

10+ Year Member



This has been going on since February. It doesn't take more than nine months to build a new index. Does it?

Google's current main index (and any others they may have public as supplements) is 32-bit, in both OS and hardware. If they switched to a 40-bit (5-byte) index (which would have 256 times the capacity of the current ~4.2B size), then they would be best off running it on 64-bit hardware and software. This would be especially true if they intend to continue adding semantic indexing and ranking features, since a larger index demands disproportionately more calculation cycles. Moving to 64-bit would greatly reduce that processing time.

I saw an estimate somewhere about what it would cost them to go fully to 64-bit and it came to about $10M USD, including the proprietary rewrite of the software they run, but I doubt the purchase would have happened prior to the IPO (at least not for the hardware). Also, if I was working on such a thing, I think I would put my efforts into the new 64-bit index and not put any more time than necessary maintaining an old one that was just going to get thrown away as soon as the new one went online.
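The capacity figures in the post above are easy to check with a back-of-envelope calculation: the ~4.2B ceiling is just the 32-bit address-space limit.

```python
# An n-bit document ID can address 2**n documents.
ids_32bit = 2 ** 32   # 4,294,967,296 -> the ~4.2B ceiling mentioned above
ids_40bit = 2 ** 40   # 1,099,511,627,776

print(ids_32bit)               # 4294967296
print(ids_40bit // ids_32bit)  # 256 -> a 40-bit index is 256x larger
```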

BeeDeeDubbleU

2:48 pm on Nov 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Did you say $10M USD? Google probably have more than that in the coffee fund :)

Pimpernel

3:00 pm on Nov 24, 2004 (gmt 0)

10+ Year Member



I have not read the whole thread here, so forgive me if I am repeating something already mentioned, but here is our take on this sandbox issue:

THERE IS NO SANDBOX!

Google introduced new algorithms in February, and these are tough, tough anti-spam algorithms. They are based on lots of factors, like:

How quickly the links were amassed
Quality of links
How quickly pages were increased
Etc etc

The bottom line is that Google has dictated that, in future, sites will not rank well unless they behave like a normal site would behave and unless they are well regarded by the Internet population.

But here's the rub: Google does not have an archived history of the build-up of links and pages for sites that were already in its index, so it can only apply this algorithm from scratch to new sites, while giving already-established sites the score that they previously had (apart, of course, from the anti-spam algorithms it applied in February to existing sites: interlinking, etc.).

The result: existing sites carry on being rated well, and new sites have a mountain to climb to rate well. They are not sandboxed; they just have to beat Google's algorithm from base zero, whereas existing sites are beating it from base 5, or 6, or whatever.
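The "base zero vs. base 5" argument can be sketched as a toy scoring function. This is purely my own illustration of the reasoning above, with invented numbers and thresholds, not Google's actual algorithm:

```python
# Hypothetical sketch: established sites keep a prior score, new sites
# must earn trust from zero, and sudden link spikes earn nothing.
# All thresholds and weights are invented for illustration.
def trust_score(prior_score: float, months_of_history: int,
                monthly_link_growth: float) -> float:
    score = prior_score
    if monthly_link_growth <= 1.5:   # steady growth (<50%/month) rewarded
        score += 0.5 * months_of_history
    return score

print(trust_score(prior_score=0.0, months_of_history=2,
                  monthly_link_growth=3.0))   # 0.0 -> new site, link spike
print(trust_score(prior_score=5.0, months_of_history=24,
                  monthly_link_growth=1.2))   # 17.0 -> established site
```

Under a model like this there is no explicit "sandbox" flag: new sites simply start so far behind that, for competitive terms, they look penalised for months.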

hdpt00

3:07 pm on Nov 24, 2004 (gmt 0)



If this is the way they are going to work and make it impossible for new sites to rank, M$ will not have a problem becoming dominant. They had better rethink. A lot of what is good or cool comes first from webmasters; if they start telling everyone M$ is the stuff, it will only be a matter of time.

Pimpernel

3:35 pm on Nov 24, 2004 (gmt 0)

10+ Year Member



Who said they were going to make it impossible? Just very difficult, and seriously anti-spam. I think there is a lot of nonsense written about Google being broken and people leaving in their droves, etc. Every time there is a change in Google's algorithms, half the people (the ones who lost out) think Google is on the verge of bankruptcy, and the other half (the winners) think it is worth a trillion dollars. Get real, guys, and lose that bias. It is a very, very good search engine. Criticise it for specific problems with its results, not just because you are no longer there.

All IMHO of course! :)

This 472-message thread spans 32 pages.