| 10:07 am on Oct 25, 2003 (gmt 0)|
You didn't say how your expected PR transfer was calculated, but judging from your description I'm 82% sure that your client is wasting his/her money.
In times past Google penalties were obvious, now most of the time people don't know they have them.
This is a question I've often been asked; in my experience the 'can have PR but not pass it on' filter is quite common among sites selling PR.
Note that I'm not accusing 'PR sellers' of any deception. Mostly, people seem merely to be renting a link from a page with PR, rather than paying for the PR they thought they'd receive. Also, I doubt that the 'PR sellers' have any more understanding of PR transfer in 2003 than their customers (I have questioned more than one).
| 12:15 pm on Oct 25, 2003 (gmt 0)|
ciml - suppose I were going to pay for premium listings on good portals, placed in a strategic location on pages with good PR and not that many external links.
Would that be better than, say, a link from a site that is full of good 'related content' but only has a PR of 2?
| 1:50 pm on Oct 25, 2003 (gmt 0)|
layer8, you will get high PageRank from a page with high PageRank, that doesn't have too many links (on-domain and off-domain), and isn't affected by the 'can have PR but not pass it on' filter.
That's not to say it's 'better', just that it gives PageRank.
If the site with good related content has traffic, then expect real people to click through to you.
If you spend your time getting links from good, relevant sites then I think it will do more good in terms of your Google traffic than one high PR link. Of course this depends on some things to do with your site. For example, if you have 5,000 pages and PR1, then a decent PR injection may help your lower level pages to get listed.
| 2:56 pm on Oct 25, 2003 (gmt 0)|
>> bought text links
In order to get an idea about whether this phenomenon concerns the linking site or the link itself - what is the format of the href?
Is it "www.example.com" or "www.ad-provider.com/ads/redir?ID=313242"
| 5:14 pm on Oct 25, 2003 (gmt 0)|
Good question. And while I'm at it, thanks to all who've responded. I checked and the link code is a straightforward www.widgets.com/index.html, with the actual link text saying "widgets".
Does this help?
| 6:55 pm on Oct 25, 2003 (gmt 0)|
I would say to visit the other links on the page.
Check to see if they have a PR that might have come as a result of the text links they have purchased.
I know, for example, that a major news network has a high-ranking web site that has PR, but none of its links are seen as transferring PR to advertisers.
| 7:05 pm on Oct 25, 2003 (gmt 0)|
A lot has to do with the industry your client is working in. There are some (online pharmaceuticals) that are monitored and capped.
I would also be curious to know what the PR is of the sites that dominate your clients space.
For example, if you are a PR3 competing in a space that has a max PR of 5, there is a strong possibility (based on many examples I've been shown) that your site will never climb past PR5, regardless of the number of PR7 or PR8 links you have.
I've always found that you are better off requesting/purchasing links from sites that are within your PR range. They are much cheaper, and tend to produce better results.
| 7:14 pm on Oct 25, 2003 (gmt 0)|
oaktown, I note that you don't say how many links (in total) are on the pages that link to your client's site. Given Google's "damping" factor, if there are any more than 5 or 6 I wouldn't expect the PR passed on to be greater than 1 - i.e. it wouldn't show up in your client's overall PR. Could it be that simple?
If the pages only have the one link then I'll just crawl back into the woodwork.
| 10:10 pm on Oct 25, 2003 (gmt 0)|
You guys are terrific!
OK, here are some more specific details. My client has 3 separate domains on three separate topics (totally unrelated, only one is an online pharm affiliate). The only points of commonality are the ownership and that we purchased one link for each of them from various pages of www.widgets.com. All text links from PR 7 or 8 pages. When we bought them the pages averaged 10-14 outbound links, but thanks to Sullen (Good catch!) I went back and counted again. Apparently they've added a bunch more outbound links for a total of 51.
Perhaps I misunderstood, but I am under the impression that 99 outbound links was the limit. Am I confused on this point?
As for competitors' PR, the top dogs in two of the three categories are no more than five. I expected that links from a bunch of sevens and eights ought to get him up to a five or even a six.
| 12:12 am on Oct 26, 2003 (gmt 0)|
there is one other important question, when did those links go up?
| 12:24 am on Oct 26, 2003 (gmt 0)|
I am guessing that these links were placed at some point over the last couple of months, and you are simply waiting for Google to *£$&*$ update PR! I am beginning to wonder if we're not on some sort of yearly cycle now.
| 12:28 am on Oct 26, 2003 (gmt 0)|
|Perhaps I misunderstood, but I am under the impression that 99 outbound links was the limit. Am I confused on this point?|
Googlebot has been known to crawl as many as a thousand links on a page, although Google's guidance suggests having no more than a hundred (the limit may be PR dependent). The PR transferred from a page is distributed amongst its links: if yours is the only link, you get all the PR available; if yours is one of 51 links, you get 1/51 of it.
This is probably the simple and straight-forward solution to your question. :)
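To see the arithmetic, here is a minimal sketch of the even split described above (the function name, the 0.85 damping factor, and the raw-PR numbers are illustrative assumptions, not Google's actual values):

```python
# Sketch of PR dilution: the PR a page passes on is split evenly
# among its outgoing links. All numbers are hypothetical.
def pr_passed_per_link(raw_pr, num_links, d=0.85):
    """PR contributed to each linked page: d * raw_pr / num_links."""
    return d * raw_pr / num_links

# One link on the page: the target gets the whole damped share.
alone = pr_passed_per_link(100.0, 1)
# The same page with 51 links: each target gets 1/51 of that share.
crowded = pr_passed_per_link(100.0, 51)

print(round(alone, 2))    # 85.0
print(round(crowded, 2))  # 1.67
```

So going from 10-14 links to 51 links cuts the per-link share to roughly a quarter of what it was.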
| 12:50 am on Oct 26, 2003 (gmt 0)|
It has often been said that PR is logarithmic. However, I have never seen it stated what notional base is used. For instance, if base 10 is used, then a PR 5 page has 10 times as many points as a PR 4 page.
Let's assume that the base is 10 and a PR X page has ten outward links. Let's say that this turns ten PR 0 pages into PR X-2 pages. If a further 90 outward links are added, those pages will be demoted to PR X-3.
However, if a sensible amount of cross-linking is used, then the pages may be able to get an additional boost. If excessive cross-linking is used, an alarm might be tripped and a penalty could be applied.
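The base-10 idea above can be sketched in a few lines. This is purely hypothetical: nobody outside Google knows the real base or mapping, so the function below just assumes toolbar PR is the floor of the log of raw PR in some assumed base:

```python
import math

# Hypothetical toolbar-PR mapping, assuming toolbar PR is simply
# the logarithm of raw PR in some base. Base 10 matches the example
# in the post above; it is not a known Google value.
def toolbar_pr(raw_pr, base=10):
    if raw_pr < 1:
        return 0
    # Small epsilon guards against floating-point error at exact powers.
    return math.floor(math.log(raw_pr) / math.log(base) + 1e-9)

# A page with base**5 "points" shows as PR5; ten times more points is PR6.
print(toolbar_pr(100_000))       # 5
print(toolbar_pr(1_000_000))     # 6
# Dividing a page's raw PR by the base drops the toolbar value one unit:
print(toolbar_pr(100_000 / 10))  # 4
```

Under this model, spreading a fixed amount of raw PR over ten times as many links knocks each recipient down exactly one toolbar unit, which is the demotion kaled describes.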
So far as paid back-links are concerned, I'm certain that if you can find a site willing to sell backlinks, then the techs at Google can find it too. Given that the sale of backlinks is just plain cheating, I imagine Google is likely to nullify any PR that might result. I cannot say with any certainty that they have this technology, but it is likely.
| 10:57 am on Oct 27, 2003 (gmt 0)|
> Perhaps I misunderstood, but I am under the impression that 99 outbound links was the limit. Am I confused on this point?
For a long time, a page with 'too many' links (often believed to be 100) would pass far less than (PR-d)/n (where PR is the raw PR, d is the decay due to the 'rank source' and n is the number of links on the page). Note that in this context, 'outbound links' are all links coming out from that page, whether on the same domain or not.
This hasn't been the case for a while though. A page with 200 links does now seem to pass (PR-d)/n. This may change again.
kaled's point about PR being logarithmic is essential when thinking about raw PR and Toolbar PR.
> However, I have never seen is stated what notional base is used.
If you search the site for "log base" you'll see some discussion. Most people say about six or seven; some say about four; I say about twenty.
| 11:22 am on Oct 27, 2003 (gmt 0)|
If I were designing a log-based PR system, I would choose a power of 2 as the base. My guess is Google use 16 or, perhaps, 8.
Having said that, it is largely unimportant. The base is only relevant when quantifying for human use. It makes no difference otherwise.
| 11:31 am on Oct 27, 2003 (gmt 0)|
|no PR appears to have been transferred |
You didn't achieve a higher ranking for keywords in the anchor text of these links?
How long ago were the links put up? As James said, Google seems to have moved to a yearly cycle (this really made me laugh :)
| 8:31 pm on Oct 27, 2003 (gmt 0)|
Thanks to all! You've given me loads of stuff to chew on (this forum is truly awesome!). Some of the answers were a bit over my head but I'll work on getting a handle on them. After reading all the responses, I'm now leaning towards the "Has PR but cannot pass it on" theory.
I may find myself back here again, waving my tin cup and asking for answers, but you have all been a great help. Thanks again.
| 9:24 pm on Oct 27, 2003 (gmt 0)|
Several PR7&8 links bought & went live in last 10 days. As of Sunday, the 2 sites linked to went from PR 6 to PR7.
No obvious improvement in SERPs at present.
| 1:36 pm on Oct 28, 2003 (gmt 0)|
|No obvious improvement in SERPs at present. |
So the pages did not rank higher for the keywords in the newly generated anchor text?
| 1:52 pm on Oct 28, 2003 (gmt 0)|
Not yet but I live in hope!
| 7:11 pm on Oct 31, 2003 (gmt 0)|
|For a long time, a page with 'too many' links (often believed to be 100) would pass far less than (PR-d)/n (where PR is the raw PR, d is the decay due to the 'rank source' and n is the number of links ont he page). Note that in this context, 'outbound links' are all links coming out from that page, whether on the same domain or not. |
ciml - how is the decay computed? When you say "rank source", are you referencing the referring site's PR? Is there somewhere I can read up on this formula?
| 7:36 pm on Oct 31, 2003 (gmt 0)|
wmburke, I suggest beginning with "The Anatomy of a Large-Scale Hypertextual Web Search Engine" by Sergey Brin and Lawrence Page. This paper explains PageRank using an easy to understand model with 'd' as the damping factor.
|PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))|
"The PageRank Citation Ranking: Bringing Order to the Web" by the same authors (Google's inventors), provides a deeper explanation of rank source and convergence. 99% of webmasters would not benefit from understanding it.
|Small Website Guy|
| 9:06 pm on Oct 31, 2003 (gmt 0)|
(1) It took TWO MONTHS before new links boosted my PR, so if those links went up less than two months ago, be patient.
(3) How much did the links cost? I might want to buy some if they work!
| 9:41 pm on Nov 1, 2003 (gmt 0)|
|It took TWO MONTHS before new links boosted my PR |
What do you mean by boost? Showing as more green in the toolbar or higher position in the SERPs?
| 3:31 am on Nov 2, 2003 (gmt 0)|
oaktown, how would you ever know what your PR is? I doubt the Google Toolbar can be relied on. My advice: forget PR and get as many links as possible from as many on-topic pages as possible. I would also not waste money trying to buy PR.
| 4:08 am on Nov 2, 2003 (gmt 0)|
>If I were designing a log-based PR system, I would choose a multiple of 2 as the base. My guess is Google use 16 or, perhaps, 8.
No. The way to do it would be to set the lowest PR (that being a page with just the seed PR and no inbound links) at PR0, and the page with the highest PR in the index at PR11. (Google rounds fractional PR down; the actual range is 0-11.) The base then is whatever the math works out to be.
|Small Website Guy|
| 3:43 pm on Nov 2, 2003 (gmt 0)|
3 billion (the number of pages ranked by Google) to the 1/11 power is 7.27
But it's been a long time since I took advanced math, so that might be irrelevant to discovering the true answer.
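For what it's worth, that arithmetic checks out. Assuming a 0-11 toolbar scale stretched logarithmically over roughly 3 billion pages (both assumptions from the posts above), the implied base is the 11th root of 3 billion:

```python
# Implied log base if toolbar PR 0-11 spans ~3 billion raw-PR "points".
# Both the page count and the linear spread are assumptions, not facts.
base = 3_000_000_000 ** (1 / 11)
print(round(base, 2))  # 7.27
```

That lands close to the "six or seven" figure some people quote, though as noted later in the thread, raw PR is not spread evenly across pages, so the real base (if there is one) could differ.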
| 4:15 am on Nov 3, 2003 (gmt 0)|
I'm trying to find the "decay" from "rank source"....
The two PageRank papers from way back when get into some iterative computation around the damping factor and inbound link effects, but I'm not familiar with "rank source" as a term. Is this a new name for the effect of outbound links?
I'm seeing some relatively recent material on European sites changing around, and some of it mentions new calculations of how the PR of the linked site propagates back...
| 6:49 pm on Nov 3, 2003 (gmt 0)|
Dave, forgetting PR and getting links from on-topic pages is sensible, but if oaktown wants to know whether PR is transferring in a particular case, it's important to understand what would happen if it were transferring. Then he can see if it isn't.
wmburke, part 2.4 of "Bringing Order to the Web" mentions rank source.
Small_Website_Guy, 3 billion to the 1/11th power is compelling, but if Google were using PageRank 'rank positions' for the Toolbar instead of the PageRank values, then there would be no need for a log scale. If PageRank works well (and it does), the end result would be very similar (think Pareto).