Forum Moderators: open
There are three things I have noticed with Google recently that suggest they are about to move (or at least trial a move) to continuous PR and link updates, similar to the way freshbot works.
1. People are commenting that freshbot is, on occasion, acting more like deepbot: the extra crawl depth and volume of hits in a short period is unusual behaviour for freshbot.
2. There was an incident at www-sj.google.com in which every site was restricted to a maximum of 1000 incoming links. If that is the case, the top 1000 links would give a good indication of a site's "true" PR under progressive updating, without requiring massive calculations.
3. Many complaints here and elsewhere that the lack of continuous updates means new content is slow to be indexed with its true PR.
Anyway, I could just be way off.
The problem is that I have no access to raw server logs, so I have no idea whether I'm being visited from 64. or 216. addresses.
All I get is a top ten user agents report that always says Googlebot 2.1 every day.
$ zgrep -c 216.239.46 BLAH
20030401 to 20030412: 0
20030413: 0
20030414: 0
20030415: 1090
20030416: 26
20030417: 9
20030418: 5
20030419: 10
20030420: 15
20030421: 28
20030422: 35
20030423: 5
20030424: 2
20030425: 0
20030426: 874
20030427: 0
20030428: 0
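For anyone else who does have raw logs, here is a rough sketch of how you could split hits by suspected crawler IP prefix. The prefixes (216.239. for freshbot, 64.68. for deepbot) are just what people here commonly report, not anything published by Google, and the sample log lines and the access.log filename are made up for illustration; point it at your own (possibly rotated) combined-format log instead.

```shell
#!/bin/sh
# Self-contained demo: write a few sample combined-log lines.
# In practice set LOG to your real access log and skip the heredoc.
LOG=access.log
cat > "$LOG" <<'EOF'
216.239.46.20 - - [15/Apr/2003:10:00:01 +0000] "GET / HTTP/1.0" 200 512 "-" "Googlebot/2.1"
64.68.82.10 - - [15/Apr/2003:10:05:44 +0000] "GET /page.html HTTP/1.0" 200 1024 "-" "Googlebot/2.1"
216.239.46.21 - - [15/Apr/2003:10:07:02 +0000] "GET /new.html HTTP/1.0" 200 2048 "-" "Googlebot/2.1"
EOF

# Count hits per suspected crawler range. The anchored, escaped
# patterns match only the leading IP field of each log line.
fresh=$(grep -c '^216\.239\.' "$LOG")
deep=$(grep -c '^64\.68\.' "$LOG")

echo "freshbot-range hits: $fresh"
echo "deepbot-range hits:  $deep"
```

Running that against a rotated `.gz` log would just mean swapping `grep` for `zgrep`, same as the count above.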
Interesting, but I hope any continuous PR update does not cut off URLs below PR 4; I have a large number of sub-PR-4 links that account for most of my SERPs. I have always enjoyed getting rankings without the reasons being easy to determine. The listing is there like a Jedi warrior: may the force be with you.
Anyway, I feel something is up.