Forum Moderators: martinibuster

Script for detecting PageRank


SlowMove

12:09 am on Mar 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If I'm linking to sites, I want to know if any of them suddenly drop in Page Rank. I haven't really looked at the Google API. Does anyone know if there's a way to use it, or another tool, to automate the process of checking links?

Nova Reticulis

5:56 pm on Mar 24, 2004 (gmt 0)

10+ Year Member



There's no such legitimate way, and Google specifically prohibits doing so.

There is a known way that goes along the lines of reverse engineering the toolbar, but I would recommend against it because all in all, Google -is- smarter than us all.

SlowMove

10:34 pm on Mar 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I can understand Google not wanting automated processes running amok, but I would think that they wouldn't mind Page Rank being checked, since it would only be done to review sites and possibly drop links for what may be considered "bad neighborhoods" by Google.

blaze

10:42 pm on Mar 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could probably do it if you download Ethereal and snoop the packets. It won't work for arbitrary URLs, but if you have specific URLs you want to check, I'm sure you could just copy and paste the request the toolbar makes.

Don't bother unless you're a programmer, though.
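Once a capture shows the request the toolbar sends, replaying it for a fixed list of URLs is mostly string assembly. A minimal sketch; the host name, parameter names, and checksum handling below are assumptions standing in for whatever your own Ethereal capture actually shows:

```python
from urllib.parse import urlencode

# Rebuild a sniffed toolbar request for one specific URL. Everything
# here (host, parameter names, "navclient-auto") is an assumption
# modeled on the shape a capture might show, not a documented API.
def build_captured_request(url, checksum):
    params = {
        "client": "navclient-auto",  # assumed toolbar client id
        "features": "Rank",          # ask only for the PageRank field
        "ch": checksum,              # checksum taken from the capture
        "q": "info:" + url,
    }
    return "http://toolbarqueries.google.com/search?" + urlencode(params)
```

Fetching that URL with any HTTP client should return the same short text reply the toolbar receives.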

SlowMove

11:02 pm on Mar 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>download ethereal and then snoop the packets

Let me see if I understand. Are you suggesting automating the process of loading pages into the browser and checking the data that crosses the wire? I'm sure that would take a lot of work, but it would probably be fast if all images and multimedia were turned off. Would Google consider that to be an automated process since it would only work if the toolbar was installed?

blaze

11:06 pm on Mar 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's a few lines in Java/Perl/VB/whatever.

You could probably also use wget and then grep for the PageRank.
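The grep step reduces to one regular expression once you know what the reply looks like. A sketch in Python rather than wget/grep; the `Rank_1:1:N` reply format is an assumption based on what sniffed toolbar traffic reportedly looks like:

```python
import re

# The toolbar reply is a short plain-text line such as "Rank_1:1:6",
# where the last field is the PageRank. This format is an assumption
# taken from sniffed traffic, not a documented interface.
def parse_rank(body):
    match = re.search(r"Rank_\d+:\d+:(\d+)", body)
    return int(match.group(1)) if match else None
```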

SlowMove

11:21 pm on Mar 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm guessing it would be all right with Google if it didn't run too fast. Just ride the brakes and request pages at random time intervals.
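That throttling idea is a few lines: sleep a random interval between requests. A sketch; the 30-90 second window is an arbitrary illustration, not a known safe rate:

```python
import random
import time

# Yield URLs one at a time, pausing a random interval between them so
# the checks trickle out instead of bursting. The 30-90s window is an
# arbitrary choice, not a documented limit.
def ride_the_brakes(urls, low=30.0, high=90.0, sleep=time.sleep):
    for i, url in enumerate(urls):
        if i:  # no pause before the first request
            sleep(random.uniform(low, high))
        yield url
```

Passing `sleep=` as a parameter makes the pacing easy to stub out when testing.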

Philosopher

11:30 pm on Mar 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've always been curious how optilink was able to get the google pr for pages. I emailed the author a while back but never got a response.

blaze

12:28 am on Mar 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The issue is the checksum that goes along with the request. The toolbar is probably hashing it with some snake oil encryption. No doubt if you were so inclined you could figure it out, but I have never had a particular need to do it.

When I want to determine the page rank for something, I just do an automated link: request and count the number of backlinks. Not precise, of course, but at least it's within the Google API TOS.
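The backlink proxy is just a count over whatever the API returns for a link: query. A sketch with the search client abstracted away; `search` here is a hypothetical callable standing in for a Google API wrapper, not a real library call:

```python
# Estimate relative importance by counting the results of a "link:"
# query. `search` is a hypothetical callable (query -> list of result
# URLs) standing in for whatever Google API client you use.
def backlink_count(url, search):
    return len(search("link:" + url))
```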

BroadProspect

1:54 pm on Mar 31, 2004 (gmt 0)

10+ Year Member



The real question is how many link: requests Google allows per day.
/BP

blaze

11:45 pm on Mar 31, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



1,000 per day on the Google API.
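At 1,000 calls a day, a batch job wants a simple budget guard so it stops before the quota does. A minimal sketch, assuming the free-key limit quoted above:

```python
# Keep a batch of link: checks inside the daily API quota. 1000 is
# the free Google API key's daily limit mentioned above.
class DailyQuota:
    def __init__(self, limit=1000):
        self.limit = limit
        self.used = 0

    def take(self):
        """Consume one call from the budget; False once it's spent."""
        if self.used >= self.limit:
            return False
        self.used += 1
        return True
```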