
Forum Moderators: phranque


Researchers Develop Techniques for Computing Google-Style Web Rankings

Speed-up may make "topic-sensitive" page rankings feasible

     

Clark

8:50 pm on May 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member


I found this on http://google.blogspace.com/archives/000920

As Aaron Swartz said:

"The paper had a number of clever ideas, including a web browser add on that would show a little bar next to each link with the PageRank of the linked page."

WebGuerrilla

9:30 pm on May 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Direct Link: [nsf.gov...]

Ambiorix

5:06 am on May 17, 2003 (gmt 0)

10+ Year Member



A paper from a .edu site about a new way of calculating PageRank.

Maybe this BlockRank is the main reason why things are weird right now?

The conclusion of the paper is:
"We have shown that the hyperlink graph of the web has a nested block structure, something that has not yet been thoroughly investigated in studies of the web. We exploit this structure to compute PageRank in a fast manner using an algorithm we call BlockRank. We show empirically that BlockRank speeds up PageRank computations by factors of 2 and higher, depending on the particular scenario.

There are a number of areas for future work: finding the "best" blocks for BlockRank by splitting up what would be slow-mixing blocks with internal nested block structure; using the block structure for hyperlink-based algorithms other than web search, such as in clustering or classification; and exploring more fully the topics of updates and personalized PageRank."

Full paper in PDF format:
[stanford.edu...]

jeremy goodrich

6:09 am on May 17, 2003 (gmt 0)

WebmasterWorld Senior Member jeremy_goodrich is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Most of what the paper does is speed up the PageRank computation, and it suggests ways the ranking vectors can be personalized.

Incredible read, and they put out a press release about it since some of the funding came from the National Science Foundation.

:) It's nice to see such cool research being published that will help, imho, search engines advance.

globay

2:33 pm on May 18, 2003 (gmt 0)

10+ Year Member



Here is an interesting article about BlockRank:

[www-sccm.stanford.edu...]


We now present the BlockRank algorithm that exploits the
empirical findings of the previous section to speed up the
computation of PageRank. This work is motivated by and
builds on aggregation/disaggregation techniques [5, 17]
and domain decomposition techniques [6] in numerical linear
algebra.

jeremy goodrich

3:25 am on May 19, 2003 (gmt 0)

WebmasterWorld Senior Member jeremy_goodrich is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Yes, ever since that NSF press release, this thing has been making the rounds. Calculations up to 5 times faster than the 'power method' used in the original research papers.

What do you think of it?
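For anyone who hasn't seen the 'power method' they're comparing against, it's just repeated multiplication of a rank vector by the link matrix until it settles. A toy sketch in Python (my own illustration on a made-up four-page graph, not Google's code):

```python
import numpy as np

def pagerank_power(adj, damping=0.85, tol=1e-8, max_iter=100):
    """Plain power iteration on the PageRank matrix of a small link graph."""
    nodes = sorted(adj)
    idx = {node: i for i, node in enumerate(nodes)}
    n = len(nodes)
    # Column-stochastic transition matrix; dangling pages spread rank uniformly.
    M = np.zeros((n, n))
    for src, outs in adj.items():
        if outs:
            for dst in outs:
                M[idx[dst], idx[src]] += 1.0 / len(outs)
        else:
            M[:, idx[src]] = 1.0 / n
    r = np.full(n, 1.0 / n)  # start from the uniform distribution
    for _ in range(max_iter):
        r_next = damping * M @ r + (1 - damping) / n
        if np.abs(r_next - r).sum() < tol:  # converged
            r = r_next
            break
        r = r_next
    return dict(zip(nodes, r))

# Hypothetical four-page web: "c" collects the most inbound links.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank_power(graph)
print(ranks)
```

The slow part at web scale is that every iteration touches the whole matrix; BlockRank's trick is to get a good starting vector cheaply so fewer global iterations are needed.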

vitaplease

5:48 am on May 19, 2003 (gmt 0)

WebmasterWorld Senior Member vitaplease is a WebmasterWorld Top Contributor of All Time 10+ Year Member



During the recent "dominic" update threads in this Google forum, this article was mentioned, possibly in connection with the fact that the www-sj.google.com data centre was pre-updated quite quickly.

Speeding up the calculations must be one of the main concerns within Google.

[edited by: msgraph at 3:07 pm (utc) on May 20, 2003]

hitchhiker

11:33 pm on May 18, 2003 (gmt 0)

10+ Year Member



If I'm not mistaken, Google has decided to first group-rank (block-rank) all internal pages, get a 'simpler' PR, and then run PageRank over the whole thing?

www.widget.com gets block-ranked with all its pages

www.widget2.com gets block-ranked with all its pages

and then the whole PageRank algo starts off with these values (saving time)
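That's roughly the three-stage recipe in the paper: local PageRank inside each host, PageRank over the graph of hosts, then a combined starting vector for the global computation. A toy sketch of the idea (my own illustration with a made-up five-page, two-host web; not the paper's actual implementation):

```python
import numpy as np

def power_iterate(M, start, damping=0.85, iters=50):
    """Damped power iteration from a given starting vector."""
    n = len(start)
    r = start.copy()
    for _ in range(iters):
        r = damping * M @ r + (1 - damping) / n
    return r / r.sum()

def transition(adj, nodes):
    """Column-stochastic link matrix over `nodes`; dangling pages spread uniformly."""
    idx = {node: i for i, node in enumerate(nodes)}
    n = len(nodes)
    M = np.zeros((n, n))
    for src, outs in adj.items():
        outs = [d for d in outs if d in idx]
        if outs:
            for dst in outs:
                M[idx[dst], idx[src]] += 1.0 / len(outs)
        else:
            M[:, idx[src]] = 1.0 / n
    return M

# Hypothetical web: two hosts, mostly intra-host links plus a few cross links.
web = {
    "w1/a": ["w1/b", "w1/c"], "w1/b": ["w1/a"], "w1/c": ["w2/x"],
    "w2/x": ["w2/y"], "w2/y": ["w2/x", "w1/a"],
}
host = lambda page: page.split("/")[0]  # page -> its host (block)
hosts = sorted({host(p) for p in web})
pages = sorted(web)

# Step 1: local PageRank inside each host, using only intra-host links.
local = {}
for h in hosts:
    block = {p: [d for d in web[p] if host(d) == h]
             for p in web if host(p) == h}
    nodes = sorted(block)
    r = power_iterate(transition(block, nodes),
                      np.full(len(nodes), 1.0 / len(nodes)))
    local.update(zip(nodes, r))

# Step 2: PageRank over the graph of hosts (the "BlockRank").
host_links = {h: [] for h in hosts}
for p, outs in web.items():
    host_links[host(p)] += [host(d) for d in outs if host(d) != host(p)]
blockrank = dict(zip(hosts, power_iterate(
    transition(host_links, hosts), np.full(len(hosts), 1.0 / len(hosts)))))

# Step 3: combine into a starting vector and finish with global PageRank.
start = np.array([local[p] * blockrank[host(p)] for p in pages])
start /= start.sum()
global_rank = dict(zip(pages, power_iterate(transition(web, pages), start)))
print(global_rank)
```

The time saving comes from step 3 starting close to the answer, so the expensive global iteration needs far fewer passes over the full web graph.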

percentages

11:46 pm on May 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I doubt they have attempted to implement "Block Rank" this soon after the idea was first developed at Stanford.

I would have thought that such a drastic change would have needed considerable testing before attempting to use it on real data. I suspect "Block Ranking" may be something they want to do, but I wouldn't expect to see it for several months.

But, who knows, some people ride by the seat of their pants!

hitchhiker

11:48 pm on May 18, 2003 (gmt 0)

10+ Year Member



Yep, the only thing is SJ has shown us that Google is apparently willing to test this thing live.

And why shouldn't they? There's no test-drive arena more appropriate for them than themselves.

BroadProspect

6:08 pm on May 18, 2003 (gmt 0)

10+ Year Member



Hi Everyone,

It's about time I contributed something, so here it is.

Algorithm tweaks could boost Google's speed article:

[newscientist.com...]

Hope this helps

BroadProspect

olias

7:54 pm on May 18, 2003 (gmt 0)

10+ Year Member



Interesting stuff.

Have to say I'm not getting too concerned about the personalised results any time soon, but I'm not looking forward to the time when we have different SERPs for different people. We will all be setting up multiple generic profiles to see how well we are targeting various groups. Oh boy.

I think the most significant point is the idea that the PageRank calculation is being improved so much - it fits in with a more fluid update policy really.

Oh, and how long before we get the 'How do I check my site's BlockRank?' threads? ;)

Yidaki

3:21 pm on May 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Today Golem News (German IT news) [golem.de] has a short story about the Stanford team working on improvements to PR calculations. They refer to some papers that I haven't seen mentioned here at WebmasterWorld.

Google's translation of the article: Researchers want to accelerate Google [translate.google.com]

rmjvol

5:39 pm on May 22, 2003 (gmt 0)

10+ Year Member



Latest story on this:

[osopinion.com...]

rmjvol

 
