Subject gateway sites and search engine ranking
Online Information Review, Vol 26, Issue 2
"Abstract" (not sure about posting this, oh well)
The spread of subject gateway sites can have an impact on the other major Web information retrieval tool: the commercial search engine. This is because gateway sites perturb the link structure of the Web, something used to rank matches in search engine results pages. The success of Google means that its PageRank algorithm for ranking the importance of Web pages is an object of particular interest, and it is one of the few published ranking algorithms. Although highly mathematical, PageRank admits a simple underlying explanation that allows an analysis of its impact on Web spaces. It is shown that under certain stated assumptions gateway sites can actually decrease the PageRank of their targets. Suggestions are made for gateway site designers and other Web authors to minimise this.
Heh. What the heck does "Gateway Guy" care about the other site's PR?
I've been getting a lot of this lately. I currently have about 130 pages in the Zeal directory (i.e. the Looksmart directory). In addition to MSN now outperforming Google by about 10 to 1 for traffic, I'm getting the occasional hit (10 a day or so) from little portal sites that mirror the L$ directory. Unfortunately, several of these sites are PR0 or PR1 up front, and 99% are PR1 as deep as the results pages I'm shown on.
Unfortunately, there isn't a heck of a lot I can do about this without just yanking my pages out of Zeal. I'd hate to imagine that Google would slap a major penalty on me for being listed at L$, but you never know. (And it's way too soon for me to see what the exact effect will be - time will tell, though the waiting is a killer.)
A major practical problem with continuing the voting indefinitely is that all sites that host links will expend any votes garnered on them and may eventually run out and be unranked. In response to this, new votes can be continually added to the system...
In reality, the received and given votes in each iteration of the PageRank calculation are completely independent of each other. A page cannot "expend its votes" and lose its own ranking because of that. Therefore there is no need to "add new votes" to the system, and no reason why Google would do so.
They suggested the figure of 85 percent, so that at any voting stage at each URL, 15 percent of its votes would be allocated to its link targets and 85 percent were distributed evenly to all URLs in the system.
In reality, the Google algorithm distributes 85% of each page's PageRank between its link targets; the remaining 15% is the "random jump" share, which the published formula spreads evenly over all pages. The paper has the two figures the wrong way round.
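Both corrections are easy to check with a few lines of code. Here's a minimal sketch of the published PageRank power iteration over a made-up three-page graph (page names and links are purely illustrative), using the Brin & Page damping factor d = 0.85:

```python
# Toy link graph (invented for illustration): A links to B and C,
# B links to C, C links back to A.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
d = 0.85                          # damping factor from the published paper
n = len(links)
pr = {p: 1.0 / n for p in links}  # start from a uniform distribution

for _ in range(50):
    new_pr = {}
    for p in links:
        # Each page's new score is rebuilt from scratch every iteration:
        # a page never "expends" its own rank by voting for others.
        inbound = sum(pr[q] / len(links[q]) for q in links if p in links[q])
        # 85% of each source page's rank flows along its links;
        # the other 15% arrives as the uniform (1 - d) / n "random jump" term.
        new_pr[p] = (1 - d) / n + d * inbound
    pr = new_pr
```

Run it and the total rank stays at 1.0 from iteration to iteration, no matter how many times each page "votes" - which is exactly why no new votes ever need to be added to the system. (C ends up ranked highest here, since it collects links from both A and B.)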
I read through the paper very quickly, so I assume those are only the most obvious and glaring of its errors. Personally, I wouldn't base my promotion strategy on its conclusions just yet...