Forum Moderators: open
Let's look at things from a wider perspective. Google introduced Freshbot quite a while back, and slowly but surely Freshbot has been getting better and "deeper" at gathering website content. GG himself has stated that he would not mind a "slow waltz" compared to the grueling update each month. I believe Google is not "just" playing around with spam filters (for which I applaud them). Judging by all the Freshbot updates over the past few days, with no trace of the deep crawler this late in the month, I wonder if Google hasn't already started the "slow waltz": consistent updates all month long instead of one huge crunch.

Case in point: a friend came to me asking why his site would not list in Google at all. I took a look and he had a grey PR. The site had been up for over six months, so I knew it wasn't suffering from "new-site-itis." I looked closer at his HTML code and found a few major mistakes, most notably a closing HTML tag in the middle of the page. I repaired the errors, and no less than three hours later his PR jumped not only to white but to a PR2. I have seen this type of scenario play out on another site, though there the PR didn't show until the next day. Both are recent events, happening after the April 7th update.

At around the same time, SJ's database suffered a rollback of some kind, preparing it for a content update. Some of you are wondering what I meant by "rollback," so I will try to explain. A friend of mine who is a web designer called and asked why several sites had dropped out of the Google index, noting that the PR for all of those sites was at 0. I checked the backlinks and Google had none. I searched all eight datacenters for listings for each of his domains that had dropped to PR0; every datacenter except SJ had his pages. So I did a search for an older domain that has been in the index for six months or more. SJ had it, but the strange thing was that the cached pages were four months old.
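For anyone who wants to check their own pages for the kind of mistake described above (a closing HTML tag in the middle of the page), here is a minimal Python sketch. The function name and the heuristic are my own, not any tool Google uses; it just flags a document where non-whitespace content appears after the first closing html tag:

```python
import re

def premature_html_close(html: str) -> bool:
    """Return True if a </html> tag appears before the end of the page.

    A stray mid-page </html> can make a naive parser (and perhaps a
    crawler) stop reading early, so everything after it may be ignored.
    """
    # Find every </html> tag, case-insensitively.
    matches = list(re.finditer(r"</html\s*>", html, re.IGNORECASE))
    if not matches:
        return False
    # If anything other than whitespace follows the first </html>,
    # the document was closed too early.
    tail = html[matches[0].end():]
    return bool(tail.strip())

broken = "<html><body><p>Hi</p></html><p>Lost content</p></body></html>"
clean = "<html><body><p>Hi</p></body></html>"
```

Running it on the two samples, `broken` is flagged and `clean` passes, which is the pattern I saw on my friend's page.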
I checked several other sites and noticed that the cached pages ranged from two to four months old. I believe the database was being prepared for a new bot on the block, retiring the deep crawler for good. Another reason I believe this: Google now has a ninth known datacenter with content similar to SJ's. It appears to be updating a little faster than SJ, but the results in its index are fairly similar. Datacenter 6 also appears to be going through a change. I guess the whole point of this post is to ask GoogleGuy point blank:
Is Google preparing its datacenters for a consistent, month-long update, is that scenario playing out right now, and is it the cause of all the PR0s and missing backlinks? Or did SJ crash from a bad filter, with a restore from backup done to repair the error?
I know you are not obligated to say anything, GoogleGuy, and I know your response will be cryptic at best, but a little truth slips out in each of your posts. I think I have read enough of them to figure it out :)
re: dominic GG thread:
BTW, Brett, is it possible to code it so that all of GoogleGuy's responses go in one thread (or automatically get copied there), so we can see what he has to say w/o chasing him? ;)
Regards,
Mark
MC, thanks for the welcome. I have been visiting the forums for over a year now, but until now I haven't felt a need to post. I usually live by the motto "Always a student; listen until it is time to raise your hand." I felt it was time to raise my hand and ask a question :)