Forum Moderators: open
Has anyone else noticed a PageRank pass-through penalty for a large # of outbound links?
By this I mean that if a page with a large # of outbound links has a link to your site, the PageRank benefit of that link is completely eliminated (or reduced).
I am not talking just about the normal PR drop due to the natural effect of a large # of links; I am well aware of that.
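For anyone unfamiliar with that "natural effect": in the classic PageRank model, the PR a page passes on is divided evenly among its outbound links. A minimal sketch, assuming the standard 0.85 damping factor and a made-up page PR of 4.0:

```python
# Classic PageRank share model: each outbound link passes an equal
# fraction of the page's PR, scaled by the damping factor.
DAMPING = 0.85  # standard damping factor from the original PageRank model


def pr_passed_per_link(page_pr: float, num_links: int) -> float:
    """PR contributed to each target by one link on this page."""
    return DAMPING * page_pr / num_links


# Hypothetical PR 4.0 page: doubling the link count halves each link's share.
print(f"{pr_passed_per_link(4.0, 50):.4f}")   # prints 0.0680
print(f"{pr_passed_per_link(4.0, 100):.4f}")  # prints 0.0340
```

That dilution is the normal, expected drop; the question here is whether there is an additional penalty on top of it.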
I can see why Google might have implemented such a thing, which could be part of a new link farm / FFA filter.
I also happened to notice that Google's latest guidelines now recommend that you have no more than about 100 outbound links on a page; they evidently considered this advice important enough to repeat it twice!
See -
[google.com...]
(Design and Contents Guidelines section)
If there is a set # of outbound links that would cause such an effect to kick in, this would be important to know, both from a link exchange and an internal site design standpoint.
Has anyone else noticed this lately?
Some discussions here and in the threads mentioned:
[webmasterworld.com...]
If this were the case, then the effect would be the exact opposite: the PR passed to the pages above the "cut-off" would actually increase, due to the outbound PR being divided up among fewer links. You would also see some links getting PR and others not.
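To make that reasoning concrete: under a hypothetical "spider stops counting at N links" model, the surviving links would each carry more PR, not less. A quick sketch, assuming the classic even-division model, a made-up cutoff of 100, and a hypothetical PR 4.0 page:

```python
from typing import Optional

DAMPING = 0.85  # standard PageRank damping factor


def pr_per_link(page_pr: float, num_links: int,
                cutoff: Optional[int] = None) -> float:
    """PR passed per counted link. With a cutoff, only the first
    `cutoff` links are counted (hypothetical spider-stops model)."""
    counted = num_links if cutoff is None else min(num_links, cutoff)
    return DAMPING * page_pr / counted


# Hypothetical PR 4.0 page with 200 outbound links:
print(f"{pr_per_link(4.0, 200):.4f}")              # prints 0.0170
print(f"{pr_per_link(4.0, 200, cutoff=100):.4f}")  # prints 0.0340
```

Under this model the first 100 links would each get double the PR, and links 101-200 would get none, which is not what is being reported here.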
However, that is not what I am seeing.
Also, this effect is being seen on internal links. It also appears to be a fairly recent change.
Has anyone else noticed this?
As for the threshold, I'm going by Google's Webmaster guideline of 100 links maximum.
I would think that if they don't spider or index a page, then PR can't be passed.
WHY? Only a minority of webmasters have read the Guidelines. Most of them don't really care about them. Penalizing such sites is equivalent to penalizing the majority of the web.
One of our sites has many more than 100 links on some pages (all internal links, not outbound). The thing is that, for usability's sake, I'm not sure I want to change it. Surely DMOZ/directory.google etc. have more than 100 links on many pages?
:) J.
Does anyone know if this 100 link rule is particularly hard and fast?
I don't think anybody has found anything that is particularly hard and fast concerning the web.
I think in a case such as yours I would check to see which pages on your site Google has indexed. If most of them are in, fine. If not, go a bit deeper and see if the pages not indexed are further down than the 100-link limit. If they are -- and if the page with more than 100 links is the only way from which a spider can reach these pages -- then consider some changes.
BigDave, I would agree that they would have left a safety margin. The danger zone may go down to the 150 area, I don't know yet.
I have now come across two different cases of this (about 200-300 link range), both of which show target pages of PR0. I'm not yet sure if this is a complete elimination of PR pass-thru for these links, or simply a large reduction.
WHY? Only a minority of webmasters have read the Guidelines.
I can see this being used as part of a link farm / FFA filter. I doubt that Google would penalize someone just for not reading guidelines!
jimbettle, this does not appear to be the case of a maximum # of links spider cut-off, as that does not explain the results I am seeing.
The 100 link rule does not seem to be enforced, because I am getting backward links from sites where I am number 150 on the links page, and they are still registering as backward links.
I think Google stops spidering a page at a certain size rather than at a certain number of links.
Size is important :)
I had the opportunity to speak with another forum participant; they also ran into this on a 200-link page, with a PR0 result.
Anyone else had any experience with this?