Now, how about those links that Googlebot isn't allowed to follow? If I have 20 links on a page, and 10 lead to a directory that is forbidden for bots (as specified in the robots.txt file), will that be 10 or 20 links on the page from Google's perspective?
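As a toy illustration (a simplified sketch, not Google's actual algorithm), here is how much PR each crawlable target would get under the two interpretations of that question:

```python
# Toy model: a page with PR 1.0, 20 outgoing links, 10 of which point
# into a robots.txt-disallowed directory. The numbers here are just for
# illustration; real PR involves damping, iteration, and much more.
page_pr = 1.0
total_links = 20
crawlable_links = 10  # the other 10 are forbidden to bots

# Interpretation A: every href counts in the denominator, so each
# crawlable target receives PR/20 and the blocked half's share is lost.
vote_if_all_count = page_pr / total_links

# Interpretation B: only followable links count, so each crawlable
# target receives PR/10 and nothing leaks away.
vote_if_followable_only = page_pr / crawlable_links

print(vote_if_all_count)        # 0.05
print(vote_if_followable_only)  # 0.1
```

Under interpretation A the blocked links dilute the vote by half, which is exactly the "dilution" worry in the question.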
The mechanics of the situation could very well be what rogerd hypothesizes, but it goes against Google's philosophy to let non-followable links dilute PR.
The publicly available docs talk about normalization of PR, which works something like conservation of mass in the universe: you cannot send (or "vote," in their lingo) PR without a target page and still expect the numbers to add up across all the pages in the index.
In addition, there are a lot of links out there now (in the a href= sense) that do not have another page at the end. All the "add item," "update information," and "upload image" type links should have nothing to do with PR.
In a sense, Google's definition of a link would almost have to include the ability to pass PR (i.e. a link is something that connects one page to a different page).
It does seem a little odd for 404s to count for dilution, but in terms of "conservation of mass" those URLs don't seem much different from pages with no links.
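To see why a 404 behaves like a leak under the "conservation of mass" view, here is a minimal two-page sketch (again, just a toy model with made-up numbers, not Google's implementation), where page A links to page B and to one dead URL:

```python
# Toy two-page PR iteration. A's vote is split over 2 hrefs, but only
# 1 of them reaches a real page; the half sent to the 404 simply
# evaporates, so total PR across real pages shrinks unless the index
# renormalizes it.
damping = 0.85
pr = {"A": 1.0, "B": 1.0}

for _ in range(20):
    vote_to_b = damping * pr["A"] / 2  # the other half goes to the 404
    pr = {
        "A": (1 - damping),             # A has no inbound links here
        "B": (1 - damping) + vote_to_b,
    }

print(sum(pr.values()) < 2.0)  # True: PR leaked out via the dead link
```

In that sense a 404 target really is no different from a page with no outgoing links: both are dead ends that break the accounting unless normalization puts the "mass" back.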