Points one, two, and three are relevant to internal linking and PR distribution, where I've seen changes for a while now. But before getting to what's changed, it's probably helpful to go backward first and take another look at the older PageRank paper.
This paper describes PageRank, a method for rating Web pages objectively and mechanically, effectively measuring the human interest and attention devoted to them. We compare PageRank to an idealized random Web surfer. We show how to efficiently compute PageRank for large numbers of pages. And, we show how to apply PageRank to search and to user navigation.
There are some points that could easily relate to changes we're seeing, like the basics of how a page's PageRank is divided among the links on it (sketched in code below). And an interesting thought:
Is the surfer's walk really as random as originally stated, given years of accumulated statistics, and eyetracking studies showing hotspots that put different value on links depending on where they sit on a page?
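To make the first point concrete, here's a minimal sketch of the classic formulation, where a page divides its PageRank evenly among its outgoing links. The four-page link graph is hypothetical, and d = 0.85 is the damping factor commonly cited from the paper:

```python
# Minimal PageRank sketch: each page divides its rank evenly
# among its outgoing links. The link graph is hypothetical.
links = {
    "home":    ["about", "blog", "contact"],
    "about":   ["home"],
    "blog":    ["home", "about"],
    "contact": ["home"],
}

d = 0.85                             # damping factor from the paper
pr = {page: 1.0 for page in links}   # starting rank for every page

for _ in range(50):                  # iterate until values settle
    new_pr = {}
    for page in links:
        # Sum the rank flowing in from every page that links here,
        # each contribution divided by that source's outlink count.
        inbound = sum(
            pr[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_pr[page] = (1 - d) + d * inbound
    pr = new_pr

for page, score in sorted(pr.items(), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```

And if the surfer isn't uniformly random, the even split becomes a weighted one: each link carries its own share of the page's rank, with weights that could come from link position or click data. The weights below are invented purely for illustration; nothing here is Google's actual model:

```python
# Weighted variant: instead of dividing rank evenly, each link
# carries a weight (e.g., from position or click data). The
# weights here are invented for illustration and sum to 1 per page.
weighted_links = {
    "home":    {"about": 0.6, "blog": 0.3, "contact": 0.1},
    "about":   {"home": 1.0},
    "blog":    {"home": 0.7, "about": 0.3},
    "contact": {"home": 1.0},
}

d = 0.85
pr = {page: 1.0 for page in weighted_links}

for _ in range(50):
    new_pr = {}
    for page in weighted_links:
        # Each inbound contribution is scaled by that link's weight
        # rather than by 1 / outlink count.
        inbound = sum(
            pr[src] * outs[page]
            for src, outs in weighted_links.items()
            if page in outs
        )
        new_pr[page] = (1 - d) + d * inbound
    pr = new_pr

print(pr)
```

The only change between the two is replacing 1 / outlink-count with a per-link weight, which is exactly the lever that eyetracking-style data would move.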
Added: this patent application (published February 2008) is worth a good look:
Systems and methods for analyzing boilerplate are described. In one described system, an indexer identifies a common element in a plurality of related articles. The indexer then classifies the common element as boilerplate. For example, the indexer may identify a copyright notice appearing in a plurality of related articles. The copyright notice in these articles is considered boilerplate.
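As a rough sketch of the idea as I read it: flag any element that repeats across most of a set of related articles. Splitting pages on blank lines and the 80% threshold are my assumptions for illustration, not the patent's actual method:

```python
from collections import Counter

# Hypothetical sketch of the patent's idea: an element appearing in
# many related articles (a copyright line, a nav block) gets
# classified as boilerplate. The articles below are made up.
articles = [
    "Widget review body text...\n\nCopyright 2008 Example Corp.",
    "Gadget review body text...\n\nCopyright 2008 Example Corp.",
    "Gizmo review body text...\n\nCopyright 2008 Example Corp.",
]

def elements(article):
    # Treat each blank-line-separated block as one page element.
    return [block.strip() for block in article.split("\n\n") if block.strip()]

counts = Counter()
for article in articles:
    counts.update(set(elements(article)))   # count once per article

threshold = 0.8 * len(articles)             # assumed cutoff
boilerplate = {el for el, n in counts.items() if n >= threshold}
print(boilerplate)   # -> {'Copyright 2008 Example Corp.'}
```

Run it and the shared copyright line is the only element flagged, while each article's unique body text passes through untouched.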