Forum Moderators: open
That is to say, should one check regularly to see whether sites are out of the sandbox, or only when there is known to be a major Google update? :)
Thanks
Mc
If one has a new page on an old site and promotes it heavily with new links, what do people think the effect would be?
In all likelihood it will exhibit similar behaviour to a new site, minus a month or two. But remember, if that new page happens to be on a CNN or Stanford kind of site, it might rank within a couple of weeks, because the old links are doing it a BIG favour.
Mc
Making guesses like that can be dangerous. For one, the growth of the web is not from new sites only. It is from new pages on old sites, too.
I think claiming that it is "dangerous" is a bit strong :) This is just a speculative figure but remember that I used the term "as much as 15%". So I will stand by this.
Uh, wait a minute! You've managed to manoeuvre yourself into a corner there. Yes, GG has debunked wrong theories in the past. The fact that he hasn't yet debunked the sb tells me that ... bingo!
"Manoeuvred myself into a corner"? I'm afraid I don't get what you mean here. This problem just happens to be one of the most significant things to happen with Google since day one, and they haven't been able to comment. GoogleGuy has obviously been silenced on this subject. If it were just the effects of an attempted clean-up he would be all over this forum like a rash: "There is no sandbox", "Have a look at our guidelines.", etc., etc.
Bring on the media ;)
Then they will go with that until they finish their new, bigger database system with the new, faster Mozilla 5.0 crawler, etc.
I think it's all in the works right now. Just about 3-4 weeks ago google spidered all my sites pretty deep. Even my "banned/blocked/pr0'ed/hijacked etc" websites got spidered just like they used to back in the good ol' days before all this sandbox stuff happened.
But I didn't see any of those new pages in this new 8 billion page index. As a matter of fact, nothing changed for my sites rankings or traffic from google.
So I expect to see a new update soon when they switch over to the new improved system.
What do you all think?
I think claiming that it is "dangerous" is a bit strong
Sorry for having used the word 'dangerous'. But let's not get distracted by rhetoric.
This is just a speculative figure but remember that I used the term "as much as 15%". So I will stand by this.
What do you say about my other points? You stand by your vague guess, but you don't respond to my rational claims against it. You said the G sandbox hides "as much as 15%" of the web. I responded that your figure can't even be approximately right because A) the web's growth can't only be attributed to the addition of new sites and B) the sb only affects competitive areas. What's your opinion on that? And what's your opinion on the -sdasadad effect?
GoogleGuy has obviously been silenced on this subject.
Has he told you so? I'd stick to the facts here. And the facts are that he (or she) has been silent. Why the silence? We don't know. But I admit that the silence can be interpreted my way (the sb exists) or your way (the sb is a technical problem). We manoeuvred you out of that corner. ;)
Anyway, I wish GG would say something ...
What do you say about my other points? You stand by your vague guess but you don't respond to my rational claims against it.
As you say, we should not get distracted by rhetoric but this was my response ;)
Has he told you so? I'd stick to the facts here. And the facts are that he (or she) has been silent.
Well, amen to that :)
Please do that test and post the results.
Thanks.
One of my sites is about Widgets. Six months back we added a few new pages about Midgets :), and there is no sign of them in the SERPS; yet the page is indexed, PR 7, and features under a site: search.
However, pages added with widget keywords came into the SERPS within 2 weeks.
So, I tried an experiment. I added the word Widgets to the existing Title of the page. Voilà, the page was soon ranking well not only for the keyword string with Widgets in it, but also for the keyword string with Midgets. Somehow, adding Widgets, under which the site was previously classified, got the page around the "sandbox".
This shows that Google has developed a machine-made directory of keywords, and that every site sits in one or more keyword categories.
It then follows that Google takes time to add a site to a keyword category. Once added, all of the site's pages come under the primary results for that keyword, and ranking is done using traditional factors. Pages on the site lacking the keyword category will not feature until the site is listed for the additional keyword category.
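The theory above can be sketched as a toy model. To be clear, everything here (the site name, the data, the gating logic) is an illustrative guess at the behaviour described in the Widgets/Midgets experiment, not anything we know about Google's actual implementation:

```python
# Toy model of the hypothesized keyword-category gate: a page only
# surfaces in the SERPs once its *site* is listed under a matching
# keyword category. All names and logic here are illustrative only.

site_categories = {"example-widgets.com": {"widgets"}}  # hypothetical site

def page_in_serps(site, page_keywords):
    """Per the theory, a page enters the results once it shares at
    least one keyword with a category its site is already listed under."""
    return bool(page_keywords & site_categories.get(site, set()))

# The Midgets-only page stays invisible for six months...
print(page_in_serps("example-widgets.com", {"midgets"}))             # False
# ...but adding "Widgets" to the title gets it in -- and once in, the
# theory says it can rank for *both* keyword strings.
print(page_in_serps("example-widgets.com", {"widgets", "midgets"}))  # True
```

The point of the sketch is just that a single site-level category list would reproduce exactly what was observed: the page's fate depends on the site's categories, not on the page's own keywords.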
This has been going on since February. It doesn't take more than nine months to build a new index. Does it?
Google's current main index (and any others they may have public as supplements) is 32-bit, in both OS and hardware. If they switched to a 40-bit (5-byte) index (which would have 256 times the capacity of the current 4.2B size), they would be best off running it on 64-bit hardware and software. This would be especially true if they intend to continue adding semantic indexing and ranking features, since the size of the index causes exponentially greater calculation cycles; moving to 64-bit would greatly reduce that processing time.
I saw an estimate somewhere of what it would cost them to go fully 64-bit, and it came to about $10M USD, including the proprietary rewrite of the software they run, but I doubt the purchase happened prior to the IPO (at least not for the hardware). Also, if I were working on such a thing, I think I would put my efforts into the new 64-bit index and not spend any more time than necessary maintaining an old one that was just going to get thrown away as soon as the new one went online.
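The capacity figures in that post are easy to sanity-check: a 32-bit document ID space tops out at about 4.29 billion documents, and widening the ID to 40 bits multiplies that by exactly 2^8 = 256:

```python
# Sanity check of the index-capacity arithmetic quoted above:
# 32-bit document IDs versus a 40-bit (5-byte) ID space.
ids_32 = 2 ** 32   # 4,294,967,296 -- in line with the ~4.2B-page index
ids_40 = 2 ** 40   # 1,099,511,627,776 -- roughly 1.1 trillion documents
print(ids_32)            # 4294967296
print(ids_40 // ids_32)  # 256 -- the "256 times" capacity figure
```

So the "256 times the current 4.2B size" claim is just the arithmetic of adding one byte to the document ID.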
THERE IS NO SANDBOX!
Google introduced new algorithms in February, and these are tough, tough anti-spam algorithms. They are based on lots of factors, like:
How quickly the links were amassed
Quality of links
How quickly pages were added
Etc etc
The bottom line is that Google has dictated that, from now on, no site will rank well unless it behaves the way a normal site would and is well regarded by the Internet population.
But here's the rub: Google does not have an archived history of link and page growth for sites that were already in its index, so it can only apply this algorithm from scratch to new sites, and has to give already-established sites the scores they previously had (apart, of course, from the anti-spam algorithms it applied to existing sites in February - interlinking etc.).
The result: existing sites carry on being rated well, while new sites have a mountain to climb. They are not sandboxed; they just have to beat Google's algo from base zero, whereas existing sites are beating it from base 5, or 6, or whatever.
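That "base zero vs. base 5" argument can be put in toy-model form. The scoring function, the link-velocity threshold, and the numbers are all invented for illustration; the only point carried over from the post is that established sites keep a pre-existing baseline while new sites start from nothing under the hypothesized velocity checks:

```python
# Toy illustration of the "base zero vs. base 5" idea: established sites
# keep their old score, new sites must earn one under hypothetical
# link-velocity scrutiny. Thresholds and scoring are invented.

def trust_score(established_score, links_per_month):
    # Hypothetical penalty when links are amassed suspiciously fast:
    # full credit up to 50 links/month, scaled down above that.
    velocity_credit = 1.0 if links_per_month <= 50 else 50 / links_per_month
    return established_score + velocity_credit

old_site = trust_score(5.0, 40)   # starts from "base 5", modest link growth
new_site = trust_score(0.0, 400)  # starts from "base zero", links amassed fast
print(old_site, new_site)         # 6.0 0.125
print(old_site > new_site)        # True -- the mountain new sites must climb
```

Note that in this framing nothing is "sandboxed" at all, which is the poster's thesis: the new site is simply scored by the same rules from a far lower starting point.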
All IMHO of course! :)