Good luck with that 1 to 5 ranking project. You'll need a relatively complex formula to even come close to weighting backlinks the way Google seems to. I'm still boggled by it regularly. I can see some patterns that seem to hold true, but then I run into those mysterious exceptions.
One thing Google has told us is that on their back end, they can zero out the effect of links on a site-wide basis, a page-wide basis, or even at the level of an individual link. And if you're analyzing a huge pile of backlinks, it's pretty much a wild guess which of those three is happening to any given link.
Even if you have your own independent link graph of a big chunk of the web, I still doubt you could reverse engineer everything that Google does to modify link strength. For example, there are historical factors - and they vary by the kind of site and keyword area.
Some links GROW in strength when they've been in place for a longer time. Some links WEAKEN in strength when they've been in place for a longer time. If a site has a high trust value with Google, then its links may carry more clout - especially if they use keyword anchor text.
And it seems to me that even pure PageRank calculations are now modified by where on the page a link appears. Check out this newly granted patent [seobythesea.com], which was filed back in 2004. The original idea of a random surfer is replaced by a "reasonable" surfer. In other words, a link that a page visitor is less likely to click on (for any number of reasons) will vote less power to its target website.
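To make the difference concrete, here's a minimal sketch of that idea in Python. Classic PageRank splits a page's vote equally among its outlinks; the reasonable-surfer variant weights each link by an estimated click probability. The graph, the click weights, and the function name are all made up for illustration, and the actual weighting signals Google uses are of course unknown:

```python
# Reasonable-surfer PageRank sketch. Instead of splitting a page's vote
# equally among its outlinks (the classic random surfer), each link
# carries a share proportional to a hypothetical click-probability weight.

DAMPING = 0.85
ITERATIONS = 50

# page -> list of (target, click_weight); the weights are invented
# estimates of how likely a visitor is to click each link
# (e.g. a prominent in-content link vs. a footer link).
links = {
    "A": [("B", 0.7), ("C", 0.3)],   # prominent link to B, footer link to C
    "B": [("C", 1.0)],
    "C": [("A", 1.0)],
}

def reasonable_surfer_pagerank(links, damping=DAMPING, iterations=ITERATIONS):
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            total = sum(w for _, w in outlinks)
            for target, weight in outlinks:
                # A link's share of the vote is weight/total,
                # not the uniform 1/len(outlinks) of classic PageRank.
                new_rank[target] += damping * rank[page] * (weight / total)
        rank = new_rank
    return rank

ranks = reasonable_surfer_pagerank(links)
```

With equal weights this reduces to ordinary PageRank; with the weights above, the low-probability footer link from A passes C a much smaller share of A's vote than it would under the equal-split model.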