Forum Moderators: open
In times past, Google penalties were obvious; now, most of the time, people don't know they have them.
This is a question I've often been asked; in my experience the 'can have PR but not pass it on' filter is quite common among sites selling PR.
Note that I'm not accusing 'PR sellers' of any deception. Mostly, people seem merely to be renting a link from a page with PR, rather than paying for the PR they thought they'd receive. Also, I doubt that the 'PR sellers' have any more understanding of PR transfer in 2003 than their customers (I have questioned more than one).
[google.com...]
That's not to say it's 'better', just that it gives PageRank.
If the site with good related content has traffic, then expect real people to click through to you.
If you spend your time getting links from good, relevant sites then I think it will do more good in terms of your Google traffic than one high PR link. Of course this depends on some things to do with your site. For example, if you have 5,000 pages and PR1, then a decent PR injection may help your lower level pages to get listed.
I would also be curious to know what the PR is of the sites that dominate your clients space.
For example, if you are a PR3 competing in a space that has a max PR of 5, there is a strong possibility (based on many examples I've been shown) that your site will never climb past a PR5, regardless of the number of PR 7 or 8 links you have.
I've always found that you are better off requesting/purchasing links from sites that are within your PR range. They are much cheaper, and tend to produce better results.
If the pages only have the one link then I'll just crawl back into the woodwork.
OK, here are some more specific details. My client has three separate domains on three separate topics (totally unrelated; only one is an online pharm affiliate). The only points of commonality are the ownership and that we purchased one link for each of them from various pages of www.widgets.com. All text links from PR 7 or 8 pages. When we bought them, the pages averaged 10-14 outbound links, but thanks to Sullen (good catch!) I went back and counted again. Apparently they've added a bunch more outbound links, for a total of 51.
Perhaps I misunderstood, but I am under the impression that 99 outbound links was the limit. Am I confused on this point?
As for competitors' PR, the top dogs in two of the three categories are no more than five. I expected that links from a bunch of sevens and eights ought to get him up to a five or even a six.
> Perhaps I misunderstood, but I am under the impression that 99 outbound links was the limit. Am I confused on this point?
Googlebot has been known to crawl as many as a thousand links on a page, although Google's guidance suggests having no more than a hundred (this may be PR dependent). The PR transferred from a page is distributed amongst the various links - if you're the only link then you get all the PR available; if you're one of 51 links then you get 1/51 of the PR available.
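The split described above can be sketched in a couple of lines. The PR figure below is purely illustrative - nobody outside Google knows the real raw values:

```python
def pr_passed_per_link(pr_available: float, num_links: int) -> float:
    """Each equally weighted link gets an equal share of whatever
    PR the page has available to pass on."""
    if num_links < 1:
        raise ValueError("a page must have at least one link to pass PR")
    return pr_available / num_links

# Sole link on the page: you receive all the available PR.
print(pr_passed_per_link(10.0, 1))   # 10.0
# One of 51 links, as with the pages discussed above: 1/51 each.
print(pr_passed_per_link(10.0, 51))
```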
This is probably the simple and straight-forward solution to your question. :)
Let's assume that the base is 10 and a PR X page has ten outward links. Let's say that this turns ten PR 0 pages into PR X-2 pages. If a further 90 outward links are added, those pages will be demoted to PR X-3.
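The X-2 to X-3 step in that example is just the log scale at work: with a notional base of 10, splitting the PR across ten times as many links knocks exactly one point off the toolbar value. A sketch under those same assumptions:

```python
import math

BASE = 10  # the notional base assumed in the example above

def notional_toolbar_drop(num_links: int) -> float:
    """How many toolbar points the linked pages sit below the linking
    page due to the split alone, ignoring damping, on a log scale."""
    return math.log(num_links, BASE)

print(notional_toolbar_drop(10))   # 1.0
print(notional_toolbar_drop(100))
```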
However, if a sensible amount of cross-linking is used, then the pages may be able to get an additional boost. If excessive cross-linking is used, an alarm might be tripped and a penalty could be applied.
So far as paid back-links are concerned, I'm certain that if you can find a site willing to sell backlinks, then the techs at Google can find the site. Given that the sale of backlinks is just plain cheating, I imagine Google are likely to nullify any PR that might result. I cannot say with any certainty that they have this technology but it is likely.
Kaled.
For a long time, a page with 'too many' links (often believed to be 100) would pass far less than (PR-d)/n (where PR is the raw PR, d is the decay due to the 'rank source' and n is the number of links on the page). Note that in this context, 'outbound links' are all links coming out from that page, whether on the same domain or not.
This hasn't been the case for a while though. A page with 200 links does now seem to pass (PR-d)/n. This may change again.
kaled's point about PR being logarithmic is essential when thinking about raw PR and Toolbar PR.
kaled:
> However, I have never seen it stated what notional base is used.
If you search the site for "log base" you'll see some discussion. Most people say about six or seven; some say about four; I say about twenty.
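If Toolbar PR really is (roughly) the log of raw PR, the sensitivity to the assumed base is easy to see. A minimal sketch - the raw PR value is made up for illustration, and the bases are the ones people guess at above:

```python
import math

def toolbar_pr(raw_pr: float, base: float) -> int:
    """Model Toolbar PR as the integer part of log_base(raw PR)."""
    return int(math.log(raw_pr, base)) if raw_pr >= 1 else 0

# The same (made-up) raw PR maps to very different toolbar values
# depending on which base you assume:
for base in (4, 6, 20):
    print(base, toolbar_pr(1_000_000, base))
```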
I may find myself back here again, waving my tin cup and asking for answers, but you have all been a great help. Thanks again.
> For a long time, a page with 'too many' links (often believed to be 100) would pass far less than (PR-d)/n (where PR is the raw PR, d is the decay due to the 'rank source' and n is the number of links on the page). Note that in this context, 'outbound links' are all links coming out from that page, whether on the same domain or not.
ciml - how is the decay computed? When you say "rank source", are you referencing the referring site's PR? Is there somewhere I can read up on this formula?
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
"The PageRank Citation Ranking: Bringing Order to the Web" by the same authors (Google's inventors), provides a deeper explanation of rank source and convergence. 99% of webmasters would not benefit from understanding it.
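For anyone who does want to see it work, the formula quoted above can be iterated to convergence. A minimal sketch - the three-page link graph is hypothetical, and the 0.85 damping factor is the commonly cited value, not a confirmed one:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iterate PR(A) = (1-d) + d*(PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)),
    where C(T) is the count of links going out from page T.
    'links' maps each page to the list of pages it links to."""
    pr = {page: 1.0 for page in links}
    for _ in range(iterations):
        pr = {
            page: (1 - d) + d * sum(
                pr[t] / len(links[t]) for t in links if page in links[t]
            )
            for page in links
        }
    return pr

# Hypothetical three-page graph: A and B link to each other and to C.
graph = {"A": ["B", "C"], "B": ["A", "C"], "C": []}
ranks = pagerank(graph)
print(ranks)
```

C ends up with the highest PR here because it collects links without passing anything back - the same reason a link from a page with few outbound links is worth chasing.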
(2) Are they real links to your site and not redirect links or javascript links? If you go to View > Source in IE and can't find <a href="www.mydomain.com"> somewhere in the html text, then Google might not give you benefit for the link.
(3) How much did the links cost? I might want to buy some if they work!
Dave
No. The way to do it would be to set the lowest PR (that being a page with just the seed PR and no inbound links) at PR0, and the page with the highest PR in the index at PR11. (Google rounds fractional PR down; the actual range is 0-11.) The base then is whatever the math works out to be.
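Under that reading, the base falls out of the arithmetic. A back-of-the-envelope sketch, assuming (as Small_Website_Guy does below) that the highest raw PR scales with the roughly 3-billion-page index - an assumption, not a known figure:

```python
# If the highest raw PR in the index maps to toolbar PR11, the
# implied log base is max_raw_pr ** (1/11). The 3-billion figure
# is the index size cited in this thread, not a published PR value.
max_raw_pr = 3_000_000_000
base = max_raw_pr ** (1 / 11)
print(round(base, 2))  # roughly 7.27
```

Which lands in the "about six or seven" range most people quote.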
I'm trying to find the "decay" from "rank source"....
The two PageRank papers from way back when get into some iterative computation around the damping factor and inbound link effects, but I'm not familiar with "rank source" as a term. Is this a new determination for the effect of outbound links?
I'm seeing some relatively recent stuff on European sites changing around, and some of it mentions new calculations of how the PR of the linked site propagates back...
wmburke, part 2.4 of "Bringing Order to the Web" mentions rank source.
Small_Website_Guy, 3 billion to the 1/11th power is compelling, but if Google were using the PageRank 'rank positions' for the Toolbar instead of the PageRank values, then there would be no need for a log scale. If PageRank works well (it does), then the end result would be very similar (think Pareto).