Forum Moderators: open
The update, or what was formerly known as the update, is over for this month. On behalf of the other 99,900 members of webmasterworld - please, let's move on!
Well, I would like to apologize for resurrecting this issue. However, I have lurked in the forums for the past few months and have never seen any evidence to support this statement. I'm sure many others are similarly confused.
Can the current fluctuations in the SERPs be explained as the result of an ongoing update, an exaggerated form of Everflux, or something else (which would require a new label)?
So, on behalf of the other confused members of webmasterworld, please take a moment to explain your reasoning on this issue. Thanks for your patience.
(Added 5 minutes later) I never intended to start another dinosaur of a thread. PLEASE no superfluous quips, no updates on the current SERPs, only attempts to answer the question posed. (No wonder Esmeralda threads get a bad reputation!)
[edited by: Umbra at 7:12 am (utc) on July 2, 2003]
The thing that sets PageRank apart from other link value crediting algos is the mathematical equation that provides a definitive value for each and every page in relation to all other pages in the universe of pages being analyzed.
If page A suddenly gets PageRank credit for a new link from page B, you can no longer accurately define the "PageRank" of either page without considering all other pages in the universe. This is why there has always been a monthly update to take the universe of pages at a given moment in time and establish value (PageRank) for their inter-relationship, one to another.
If you were to skip this update and arbitrarily add value to page A for the new link (without considering the impact on all others), you would soon end up with a dataset of pages with values that are not finitely accurate in relationship to one another.
When you realize that true PageRank can put all 3 billion pages in exact order, you can easily see why an estimated value may move a given page up or down the list. If the estimate is off by the slightest amount you can see how a page could also move up or down in the SERPs accordingly.
If what we are seeing is an attempt to "estimate" the value of a link in order to update that page's PageRank "on the fly", it would allow Google to do updates less often. The cumulative error in those "guesstimates" of PageRank would, however, compound the longer it goes without an update.
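To make the interdependence concrete, here is a minimal sketch in plain Python. The four-page graph, the 0.85 damping factor, and the fixed iteration count are illustrative assumptions for the example, not anything Google has published: it computes values for a tiny universe, then adds a single new link from B to A and recomputes.

```python
# Minimal sketch (illustrative only): a four-page "universe" shows why a single new
# link cannot be valued in isolation. The graph, the 0.85 damping factor, and the
# fixed iteration count are assumptions for the example, not Google's actual data.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}  # start everyone from a uniform guess
    for _ in range(iterations):
        new_pr = {}
        for p in pages:
            # Each page q that links to p passes on an equal share of its own PR.
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new_pr[p] = (1 - damping) / len(pages) + damping * incoming
        pr = new_pr
    return pr

before = {"A": ["C"], "B": ["C"], "C": ["D"], "D": ["A"]}
after  = {"A": ["C"], "B": ["C", "A"], "C": ["D"], "D": ["A"]}  # B adds one link to A

pr_before, pr_after = pagerank(before), pagerank(after)
for page in sorted(before):
    print(page, round(pr_before[page], 4), "->", round(pr_after[page], 4))
```

Run it and C and D shift along with A, even though their own links never changed - which is the whole point of having to recompute the full universe rather than crediting one page in isolation.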
Any thoughts?
Just what I've been thinking. However, the old system was pretty poor, using "many weeks old" data with calculations taking longer and longer.
I wonder how they'll calculate the PR values on the move? Leave the numbers crunching away whilst feeding in new data and distributing new values daily?
If what we are seeing is an attempt to "estimate" the value of a link in order to update that page's PageRank "on the fly", it would allow Google to do updates less often
Is less frequent and/or shallower bot spidering what has been observed lately?
To the best of my knowledge - judging by the posts here on ww - the opposite seems to be the case.
pages with values that are not finitely accurate in relationship to one another.
Interesting concept, sounds almost like probability calculations. Nice.
/claus
One website I watch is very new and went straight to number 1 for our chosen keyword (not a very competitive area). Then it disappeared, and looking at the datacenters, I found that it was still at number one everywhere except www-va (and main Google). So I assumed (?) that the main Google index for the last few days was coming from va - at least for that keyword.
Today, the site is back at number 1. But the datacenters remain the same, i.e. the site is top everywhere except va. So does this mean Google is switching datacenters? Is this what Everflux means?
As I said, don't know if this means much anyway.
I think they can apply an appropriate estimated value based on the other links leaving the page and assign similarity ratings based on the page content. In that scenario the value of the new link should be reasonably accurate.
The part that is in question is the fact that the receiving page now has a higher PageRank. So... Did the receiving page's PageRank actually increase? Did the value of all other links from the sending page go down? Is the increase or reduction in value of those links passed on to their receiving pages?
It leaves lots of questions to be answered.
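A rough back-of-the-envelope illustration (numbers invented for the example, using the standard PageRank formulation where a page divides its vote equally among its outbound links): if page B has PageRank 0.40 and four outbound links, each link passes on roughly 0.85 × 0.40 / 4 = 0.085 worth of credit. Add a fifth link and every share drops to 0.85 × 0.40 / 5 = 0.068, so the four pages B already linked to each lose a little, and those losses ripple on to the pages they link to in turn. Under the classic formula the answer to the second and third questions would be "yes", though we can only guess whether any on-the-fly estimates behave the same way.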
I can see that a constant shifting of sites in say, the top twenty based on new content/new backlinks would be appealing to someone who wanted freshness, but not From #1 to Eternity and Back (starring Deborah Kerr & Burt Lancaster!).
My vote is that the update process itself is in everflux and that we won't know what's going on for a while. Up the meds, webmasters!
Everflux, as we have come to know and define it, seems to be changing as well. The term has been defined and redefined here many times over, so I won't even try. We will all have to wait and see what "new" definitions many of our common terms now carry.
Best advice with a new site is to be patient.
Come to think of it, that's probably good advice for all sites.
Cheers!
I believe there is more than one new change happening. The "1st to eternity" is a difficult one to comprehend. I am inclined to think that a dataset was created at some point that had major flaws due to an algo affecting listings in a way that Google had not expected. Google may be going back and forth between that dataset and more accurate ones throughout the system to get the best of both datasets. I think this will go away when they are done. It doesn't make sense that when someone searches for "internal revenue service" the searcher can't find the site.
It does appear that the bots may have new and possibly overlapping responsibilities. I am no expert on bots but there seems to be something there for us to learn in the future.
From what I understand, PR is calculated iteratively and the values only become finalized after many iterations. So any given page has a PR calculated for it based on the PR of all the pages linking to it; this is then repeated for all pages in the network until some terminating condition is met.
I'd just throw the new data for a page into the db and let it be used by the algo for any affected page on the next iteration. OK, there will be inaccuracies, but there are probably ways of dealing with the most extreme cases.
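If it helps, here is a rough sketch of that idea in plain Python (the toy graph, the 0.85 damping factor, and the 1e-8 tolerance are my own assumptions, not anything from Google): when a new link lands in the db, restart the iteration from the previous scores rather than from scratch, and it settles in fewer passes.

```python
# Rough sketch of "throw the new data into the db and let the next iteration use it".
# The terminating condition here is: stop once no page moves by more than `tol`.
# Toy graph, damping factor, and tolerance are illustrative assumptions only.

def iterate_until_stable(links, start, damping=0.85, tol=1e-8):
    pages = list(links)
    pr = dict(start)
    passes = 0
    while True:
        passes += 1
        new_pr = {}
        for p in pages:
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new_pr[p] = (1 - damping) / len(pages) + damping * incoming
        if max(abs(new_pr[p] - pr[p]) for p in pages) < tol:
            return new_pr, passes
        pr = new_pr

graph = {"A": ["C"], "B": ["C"], "C": ["D"], "D": ["A"]}
uniform = {p: 1.0 / len(graph) for p in graph}
stable, _ = iterate_until_stable(graph, uniform)  # settled scores for the old graph

graph["B"] = ["C", "A"]                            # a new link B -> A arrives in the db
_, warm = iterate_until_stable(graph, stable)      # warm start from yesterday's scores
_, cold = iterate_until_stable(graph, uniform)     # cold start from scratch, for comparison
print("passes with warm start:", warm, "vs from scratch:", cold)
```

The warm start only helps because the old scores are already close to the new fixed point; the further the link graph drifts from the last full calculation, the more that advantage (and the accuracy in the meantime) erodes, which ties back to the compounding-error point earlier in the thread.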
I think that is exactly how they would do it, and for a while the estimates would work. I think the question is... How long?
It's not that. They are changing how they are doing things. It has been a long process and it isn't over.
I am with Clark, and GoogleGuy has said on a number of occasions to be patient. I am not happy about what is going on with Google right now, but that does not mean I am going to worry too much.
I have taken my network of sites through some major changes and overhauls. I can say this is no easy task and that you end up with a big mess for the first few months. In some cases I found problems 4 and 5 months later that called for another overhaul. Google is just going through some growing pains. I think things will settle down in the next two months. Well... I REALLY HOPE so, anyway.
Best thing we can do is hold on to our pants; the ride ain't over just yet.
However, this theory of a rolling update is hard to prove. If you had one... it could look just like freshbot, and who would be the wiser? I have two sites that are getting hit every 4 to 5 days. Dominic started it with every two weeks. Since Esmeralda the pace has quickened. Strange things are definitely afoot.
You guys think Google Watching will make it into the 2008 Olympics?