tedster
 Msg#: 4109138 posted 10:29 pm on Apr 2, 2010 (gmt 0) 
Internal PageRank is recalculated continually, and it has been this way for several years. Google introduced some kind of mathematical shortcut several years back, from what I understand, and that makes continual recalculation a feasible thing to do. I'm not 100% sure when that started (I'll do some research), but I believe it was in place at least by the time the Big Daddy infrastructure rolled out.

tedster
 Msg#: 4109138 posted 1:36 am on Apr 3, 2010 (gmt 0) 
Well, I've tried, but I can't find the original information; maybe I should try Bing ;) I'm pretty sure the information came from Matt Cutts, but the closest I can find right now is his 2006 reference to the earlier post: I believe that I've said before that PageRank is computed continuously; there are machines that take inputs to the PageRank algorithm at Google and compute the resulting PageRanks. So at any given time, a url in Google's system has up-to-date PageRank as a result of running the computation with the inputs to the algorithm. [mattcutts.com...] 
 As originally published, the PageRank formula was a computationally intensive procedure. Two things have changed: 1) The formula has been modified, but Google has not shared the specifics. 2) A mathematically equivalent form was found (that's common enough in mathematics), and it made continual calculation realistic.
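For reference, the PageRank formulation as originally published is a simple iterative computation. The sketch below is illustrative only — the function name, tolerance, and three-page graph are invented for the example, and Google's production formula is, as noted above, not public:

```python
# Classic power-iteration PageRank, per the original published formulation.
# Illustrative sketch only; all names and the tiny graph are invented here.

def pagerank(links, damping=0.85, tol=1e-8, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        # Every page starts each round with the "random jump" share.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page splits its rank evenly across its outlinks.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: redistribute its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        delta = sum(abs(new_rank[p] - rank[p]) for p in pages)
        rank = new_rank
        if delta < tol:
            break
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))  # C, linked from both A and B, collects the most rank
```

The expensive part is that every iteration touches every link in the graph, and the web-scale graph needs many iterations to converge — which is why the acceleration work discussed below matters.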

claus
 Msg#: 4109138 posted 7:57 am on Apr 3, 2010 (gmt 0) 
Here are some leads to the modification of the formula / mathematically equivalent form: 
 Computing PageRank using Power Extrapolation [academic.research.microsoft.com] - promised a 30% reduction in computation time 
 Adaptive Methods for the Computation of PageRank [academic.research.microsoft.com] - also promised a 30% reduction in computation time 
 Extrapolation methods for accelerating PageRank computations [academic.research.microsoft.com] - promised to speed up PageRank computation by 25–300% 
 Efficient PageRank approximation via graph aggregation [academic.research.microsoft.com] - promised to reduce computing time by 50% 
 I'm not saying that these *are* the new methods. Only Google knows that.
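To make the extrapolation idea concrete: the core trick in that family of papers is Aitken's Δ² method, which takes three successive power-iteration vectors and jumps ahead to an estimate of the limit. This is a generic numerical sketch of that step, not Google's method or any paper's exact algorithm:

```python
# Aitken delta-squared extrapolation, applied componentwise to three
# successive iterates of a converging sequence. Generic sketch only.

def aitken_step(x0, x1, x2):
    """Extrapolate toward the fixed point from three successive iterates."""
    out = []
    for a, b, c in zip(x0, x1, x2):
        denom = c - 2 * b + a
        # Fall back to the latest iterate where the denominator vanishes.
        out.append(c if abs(denom) < 1e-12 else a - (b - a) ** 2 / denom)
    return out

# A linearly converging sequence toward 1.0: 1.5, 1.25, 1.125, ...
print(aitken_step([1.5], [1.25], [1.125]))  # jumps straight to [1.0]
```

For a sequence with a single geometric error term the jump is exact, which is where the reported 25–300% speedups come from: a handful of cheap extrapolation steps can replace many full passes over the link graph.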

internetheaven
 Msg#: 4109138 posted 10:15 am on Apr 3, 2010 (gmt 0) 
maybe I should try Bing ;) 
 That, and the fact that claus quoted Microsoft Academic, was one of the funniest things I've read in a long time. I don't think I've laughed out loud at my screen in a while, and it hopefully indicates the shift in power we need. 
 However, in relation to my original post, my recent test seems to have gone wrong then. We had a site that just could not get into the top 10. Everything above us has thousands more backlinks. So I spent a couple of intensive weeks and got 30+ PR3-7 backlinks in the hope that they could outweigh the thousands of lesser-quality links to the sites above us. But it was almost six weeks from getting the first link to actually moving in the SERPs. We've gone up to #3, which is great, but if PageRank is calculated continuously, then why was there no ranking movement for six weeks? How is a continuous PageRank calculation (and by that I assume a continuous backlink calculation) factored into the algorithm? 
 P.S. I'm not fixated on PageRank as a ranking method and am aware it is only part of 200+ signals. This question is just one of 200+ that I'm trying to answer. ;)

BillyS
 Msg#: 4109138 posted 10:56 am on Apr 3, 2010 (gmt 0) 
But it was almost six weeks from getting the first link to actually moving in the SERPS. We've gone up to #3 which is great, but if Pagerank is calculated continuously, then why was there no ranking movement for six weeks? 
 The above implies the assumption that all links of similar quality are both factored into PR and weighted evenly. Just because PR is calculated constantly doesn't mean an immediate (noticeable) response. For example, the PR calculation could place more weight on older links than newer ones. Perhaps someone has read published information to the contrary, but if you're trying to stabilize SERPs, then PR (among other factors) should remain fairly stable – even if constantly recalculated.
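One toy way to picture that hypothesis — and this is purely speculative modeling of BillyS's suggestion, not any known Google formula; the ramp-up period is invented for the example:

```python
# Hypothetical illustration only: scale a link's contribution by how long
# the link has been known, so brand-new links count for less. The 90-day
# ramp is an invented number, not a documented value.

def link_weight(base_value, age_days, ramp_days=90):
    """Return the link's effective contribution given its age in days."""
    return base_value * min(1.0, age_days / ramp_days)

print(link_weight(4.0, 7))    # week-old link: only a fraction of full weight
print(link_weight(4.0, 365))  # year-old link: full weight
```

Under a scheme like this, PR could be recalculated continuously and still show no visible ranking movement for weeks after a burst of new links.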

Marvin Hlavac
 Msg#: 4109138 posted 11:17 am on Apr 3, 2010 (gmt 0) 
But it was almost six weeks from getting the first link to actually moving in the SERPS. 
 It could have taken several weeks for G* to reindex the old pages with your new links. P.S. It seems there was a PR data export tonight (Apr 2nd to 3rd).

internetheaven
 Msg#: 4109138 posted 1:25 pm on Apr 3, 2010 (gmt 0) 
It could have taken several weeks for G* to reindex the old pages with your new links. 
 The links were showing up in my WMT account after the first week; I normally see changes in my WMT backlink profile every week or so. Five weeks later the rankings improved. For example, the PR calculation could place more weight on older links than newer ones. 
 I never heard this before. Are you saying that 50 links from PR4 pages that are 1 year old will result in a LOWER resulting PageRank for the linked-to site than if it were to have 50 inbounds from identical PR4 pages that are 5 years old?

FranticFish
 Msg#: 4109138 posted 4:57 pm on Apr 3, 2010 (gmt 0) 
It's always seemed to me that time for links to kick in depends on Google's quality assessment. The best have almost instant effect, and the worst (directories for example) can take months to have a cumulative effect.

brotherhood of LAN
 Msg#: 4109138 posted 5:25 pm on Apr 3, 2010 (gmt 0) 
[googleblog.blogspot.com...] To keep up with this volume of information, our systems have come a long way since the first set of web data Google processed to answer queries. Back then, we did everything in batches: one workstation could compute the PageRank graph on 26 million pages in a couple of hours, and that set of pages would be used as Google's index for a fixed period of time. Today, Google downloads the web continuously, collecting updated page information and reprocessing the entire weblink graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections. So multiple times every day, we do the computational equivalent of fully exploring every intersection of every road in the United States. Except it'd be a map about 50,000 times as big as the U.S., with 50,000 times as many roads and intersections. 
 WebmasterWorld thread at the time [webmasterworld.com]

pavlovapete
 Msg#: 4109138 posted 5:02 am on Apr 6, 2010 (gmt 0) 
I spent a couple of intensive weeks and got 30+ PR3-7 backlinks 
 Plus your IBL profile presumably spiked in those intensive weeks and then dropped back to something approaching normal. Spikes might trigger some kind of damping?

