Forum Moderators: open
The reason I think this is that when you go to the Google home page and click through to the Google tour page, you land on this page [google.com...], which shows a PR8. That value is a guesstimate, since the page is not in Google's database yet (it has no cache).
There are 3 slashes in the URL, so for a PR10 page the guesstimated PR should be 7, but it isn't: it's 8.
Type these imaginary URLs into the address bar, look at the PR, and you'll see what I mean.
[dmoz.org...]
[google.com...]
Both are PR10 home pages, but they get different guesstimated PRs.
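The slash-counting guess being described can be sketched in a few lines. This is my reconstruction of the rule, not Google's actual code; the function name and the floor at zero are assumptions:

```python
def guess_toolbar_pr(root_pr, path):
    # Hypothetical reconstruction: subtract one PR point per slash in
    # the path from the root domain's PR, floored at zero.
    return max(root_pr - path.count("/"), 0)

# A path with 3 slashes under a PR10 domain would guesstimate to 7:
print(guess_toolbar_pr(10, "/features/tour/"))  # 7
```

Under this rule, the tour page above should show 7, which is why the observed 8 is interesting.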
Not sure of the significance of this, but I thought it was interesting all the same.
>A page with PR10 can actually be 10.something, even 10.999999999999. So, 10 is not the top, the (theoretic) top is 11.
This is total guessing, but I'll bet that Google is precisely PR11. Because the toolbar doesn't show beyond 10, we just can't see it. This would be the simplest and most straightforward way to do it: set the lowest value (the seed value all pages start out with) at PR0, give Google the highest PR value at 11, and use a log scale between the two extremes. I am assuming that the toolbar always rounds down, so all PR10 pages are really between PR10 and PR11, PR9 pages are between PR9 and PR9.99999999999, and so on.
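The scheme I'm guessing at could look something like this. It's a toy sketch: the seed of 1 and the top raw value are made-up numbers, chosen only so the log base works out to a round figure:

```python
import math

def toolbar_pr(raw_pr, seed=1.0, top=10**11):
    # Pick the log base so that the seed maps to 0 and the top raw
    # value maps to 11, then round down and cap the display at 10.
    base = (top / seed) ** (1 / 11)
    return min(int(math.log(raw_pr / seed, base)), 10)

print(toolbar_pr(1.0))        # the seed value shows as PR0
print(toolbar_pr(10**10.5))   # a "10.5" page still displays as PR10
```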
I put up a chart within the last week that illustrates the wild discrepancy between the top 1000 PR sites and their Directory PR values. It was well indexed for about 3 days, and has now mysteriously vanished from Google's cache. Good luck finding it.
Note that Macromedia and others are also "11" / 44 in directory terms.
[searchnerd.com...]
Google was the only 11.
[access.adobe.com...]
is an 11
same with
[real.com...]
I haven't checked all 100+ PR10s, but I am thinking these might be it. Any others?
google.com
macromedia.com
access.adobe.com
real.com
I don't understand why "The PR of a page is always above 0 and below 11" should be total guessing.
The PR of any page is always above 0, because the PR formula assigns a minimum PR value greater than 0.
I admit that the "always below 11" statement is a bit of a guess, though. ;) But it actually matches many experiments I did comparing "directory PRs" with "toolbar PRs".
About "I'll bet that Google is precisely PR11": I don't think Google needs to do anything like that to calculate toolbar PR values.
When they recalculate the PR of all pages each month, they can simply take the page with the maximum PR as the top value. So the "top page of the month" could theoretically be any web page, including the Google home page.
I agree with you that toolbar values are rounded the way you described. :)
Basically, the road to PR-elevenness seems to be to offer a software download everyone needs.
Some old PR10's here: [webmasterworld.com...]
In the XML files, the PR of a page sits between the <RK></RK> tags. This also applies to pages for which Google estimates PR (like dmoz.org/page). Interestingly, if a PR is estimated, there is another tag in the XML files: <XP></XP>, located right below the RK tag. The XP tag is not there when a URL is in Google's index. When you add new folders to dmoz.org, it looks like this:
<RK>9</RK>
<XP>1</XP>
<RK>8</RK>
<XP>2</XP>
<RK>7</RK>
<XP>3</XP>
It appears that the XP tag shows the difference between the domain root's PR and the estimated PR. It has been like this for all domains that I've checked. For google.com, the tags look like this:
<RK>10</RK>
<XP>1</XP>
<RK>9</RK>
<XP>2</XP>
<RK>8</RK>
<XP>3</XP>
I think that's another good indicator that PR 11 really exists.
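The argument in these two posts can be made explicit in a couple of lines: if XP really is the root's PR minus the estimated RK, then RK + XP should recover the root's PR, and for google.com it recovers 11, not 10. The values are copied from the posts; the relationship itself is just the conjecture stated above:

```python
# If XP = (root PR) - (estimated RK), then RK + XP should recover the
# root's PR. (RK, XP) pairs copied from the two examples in the thread.
observations = {
    "dmoz.org":   [(9, 1), (8, 2), (7, 3)],
    "google.com": [(10, 1), (9, 2), (8, 3)],
}
for domain, pairs in observations.items():
    inferred_roots = {rk + xp for rk, xp in pairs}
    print(domain, inferred_roots)
# dmoz.org {10}
# google.com {11}   <- consistent with the root really being PR11
```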
LowLevel
> The PR of any page is always above 0, due to the fact that PR formula assigns a minimum PR value >0.
That's true for raw PageRank, but on a log scale this can translate to negative numbers.
If, as I suspect, Googlebot behaves in a very similar way to PageRank's random surfer, then I get the strong impression that Google will list pages below 0 on the Toolbar PR scale (maybe somewhere between -2 and -3).
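A quick sketch of how raw PR above 0 can still land below 0 on a log scale. The base and the minimum raw value here are illustrative assumptions, not Google's actual numbers:

```python
import math

# Raw PageRank has a positive floor, e.g. (1 - d) = 0.15 with d = 0.85,
# but on a log scale anchored so a seed value of 1 maps to toolbar 0,
# anything below the seed comes out negative.
base = 10          # assumed log base per toolbar unit
seed = 1.0
raw_floor = 0.15   # assumed minimum the PR formula can assign
print(math.log(raw_floor / seed, base))  # about -0.82: below 0
```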
I doubt this happens. Every update, Google just changes the base they use for the log scale so that the seed value is 0 and some value close to the max is 10. However, the fact that Macromedia is also a PR11 suggests they just don't scale everything to the highest value in the index.
So does this mean that the PR of any page located in a subdirectory can't go above the Domain's PR?
No, that only holds when the toolbar "guesses" PageRank based on the PR of the root domain. Otherwise, its "real" PR score (once Google has assigned the page its PR) can be much higher or lower than the domain page's.
Anyone think it's possible that these growing PR11s are maybe linked to something else that's growing: the number of pages in the index?
Sure, it must be. The way I figure it, the sum of the PR of all pages in the index is always equal to number_of_pages * the start value.
So if every page is given a unit of 1 to start with, then over the 20 or so iterations the PR of the whole index is still "3,083,324,652" (* 1)? (Not allowing for sink correction and stuff like that.) Is that right?
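That conservation claim is easy to sanity-check on a toy graph. The graph and damping factor here are made up; with the usual per-page formula PR(p) = (1 - d) + d * sum(PR(q)/outlinks(q)) and no sink pages, the total stays at number_of_pages * 1 on every iteration:

```python
d = 0.85
links = {  # a made-up 3-page web with no sinks
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
pr = {p: 1.0 for p in links}  # every page starts with a unit of 1
for _ in range(20):           # the "20 or so iterations"
    new = {p: 1 - d for p in links}
    for p, outs in links.items():
        for q in outs:
            new[q] += d * pr[p] / len(outs)
    pr = new
print(sum(pr.values()))  # still 3.0 (up to float rounding)
```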
:)
Scott
BTW, my own theory about higher-PR internal pages is plain old good internal linking.
I have several internal pages on my site that are the same PR as my index page (and they don't have any external links pointing to them). :)