When the FI update first broke, did www2 and www3 point to www-fi?
What's the point of www2 and www3 these days?
I am on the road and need some info. Can someone please tell me if Google's update from last week has completed and settled?
Are we now looking at the new index across all datacenters?
Was this a traditional update, or are we still encountering strange things with this current (June) update like we did last month?
Please advise. Thanks!
By the way, I don't know if anybody else has noticed, but what is happening, at least in my field and for my chosen keywords, is an EXACT REPLICA of Dominic. I mean it is following the very same moves Dominic made in the run-up to this update: the way the pages first show, the way it spreads over the datacenters and fluctuates, the way some websites lose their positions at a particular stage of the update, etc. Exactly as in Dominic.
I hope I am wrong, because if I am right my website should disappear from the index in just a few days, as happened with Dominic - fingers crossed - (I am currently number 1 to 3 for many great keywords).
Anybody else seeing this?
If I had their resources, I'd divide the task of ranking among the data centers and then stew it all together in stages. Hypothetically and arbitrarily:
#1 would rank based purely on anchor text
#2 purely on H1
#3 keyword density
#6 temporary results
#7 temporary results
#8 temporary results
then you take the #1 & #2 results, interpolate an average rank, and store it to #6
#3 & #4 to #7
then #6 & #7 to #8
then #5 & #8 to the index
Such a division of responsibility seems much more efficient computationally and would explain the differences, and similarities, across data centers. Then, to keep the pros guessing, each update I would switch which data centers were the bulk information-gathering points (6, 7, 8) and which held the initial, uncomplicated ranked information (1, 2, 3, 4, 5). I would also arbitrarily switch the DNS around, so that one could not take a particular data center, determine that its rankings are based purely on anchor text, and work out the exact recipe for that particular trait without some degree of difficulty.
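Just to make the idea concrete, here is a rough sketch of that staged averaging in Python. Everything in it is hypothetical per the scheme above: the signal names, the pairing of data centers, and the simple rank-averaging step are my own illustration, not anything Google has said it does.

    # Hypothetical illustration only: merge per-signal rankings the way the
    # scheme above describes (pair results up, average the ranks, repeat).

    def rank_positions(scores):
        """Turn a {page: score} dict into {page: rank}, where rank 1 = best."""
        ordered = sorted(scores, key=scores.get, reverse=True)
        return {page: pos + 1 for pos, page in enumerate(ordered)}

    def average_ranks(ranks_a, ranks_b):
        """Interpolate an average rank for every page seen by either source."""
        pages = set(ranks_a) | set(ranks_b)
        worst = len(pages) + 1  # a page missing from one list gets a worst-case rank
        return {p: (ranks_a.get(p, worst) + ranks_b.get(p, worst)) / 2 for p in pages}

    # Invented per-signal scores for three pages (higher = better).
    dc1 = rank_positions({"page-a": 0.9, "page-b": 0.4, "page-c": 0.7})  # anchor text
    dc2 = rank_positions({"page-a": 0.5, "page-b": 0.8, "page-c": 0.6})  # H1
    dc3 = rank_positions({"page-a": 0.3, "page-b": 0.9, "page-c": 0.2})  # keyword density
    dc4 = rank_positions({"page-a": 0.6, "page-b": 0.5, "page-c": 0.8})  # some fourth signal
    dc5 = rank_positions({"page-a": 0.7, "page-b": 0.2, "page-c": 0.9})  # some fifth signal

    dc6 = average_ranks(dc1, dc2)    # #1 & #2 -> #6
    dc7 = average_ranks(dc3, dc4)    # #3 & #4 -> #7
    dc8 = average_ranks(dc6, dc7)    # #6 & #7 -> #8
    index = average_ranks(dc5, dc8)  # #5 & #8 -> index

    print(sorted(index, key=index.get))  # final ordering, best first

The point of the staging is that each data center only ever has to sort by one cheap signal or average two lists it already holds, which is what would make the intermediate results look so different from center to center.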
Granted, I'd be doing it all on local machines before sending it out for public viewing. Unless they like to measure response at places like this to give them new ideas...
It could at least catch up: delete dead backlinks from two months ago, give PR to a site that has had backlinks for over two months, upgrade PR for new backlinks, and assign a PR to the millions of pages out there that show PR0 while the rest of the site is PR6 or 7.
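If Google wanted to run that kind of catch-up pass, the bookkeeping could be as simple as the sketch below. To be clear, all of it is made up to illustrate the wish list: the field names, the two-month threshold, and the idea of seeding a PR0 page from its site's average are my guesses at what such a pass might look like, not how Google actually works.

    from datetime import timedelta

    TWO_MONTHS = timedelta(days=60)

    def countable_backlinks(backlinks, now):
        """Drop links whose source page is dead; keep only links old enough to count."""
        live = [b for b in backlinks if not b["dead"]]
        return [b for b in live if now - b["first_seen"] >= TWO_MONTHS]

    def catch_up_pr(page, site_pages, now):
        """Give a PR0 page some credit once its backlinks have aged past two months."""
        counted = countable_backlinks(page["backlinks"], now)
        if page["pr"] == 0 and counted:
            # Seed from the site average instead of leaving the page at PR0.
            page["pr"] = round(sum(p["pr"] for p in site_pages) / len(site_pages))
        return page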