I wondered if any other members had experience setting up crawler-inaccessible links on a page? I have a situation where a website I work on has a 'buy' option, but it uses a link of the form: <a href="#" onclick="…">Buy</a>
The page updates automatically with the information (the item is added to the checkout), but the page has around 500 of these 'buy' links. So I was hoping I could set up a piece of code that doesn't pass PageRank on, whilst maintaining the user experience.
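For anyone following along, a minimal sketch of how such a buy link is usually wired up. The /cart/add endpoint and the item ID here are illustrative, not taken from the actual site:

```javascript
// Hypothetical helper: build the add-to-cart request URL for an item.
// Note there is no crawlable product URL anywhere in the markup; the
// href stays "#" and the work happens in the click handler.
function cartUrl(itemId) {
  return '/cart/add?item=' + encodeURIComponent(itemId);
}

// Each of the ~500 buy links would then look something like:
//   <a href="#" onclick="fetch(cartUrl('sku-123'), {method: 'POST'}); return false">Buy</a>
// The "return false" stops the browser from jumping to the top of the page.
```

The point is that the anchor itself never exposes a URL for the crawler to follow.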
Another approach you could try is to place the link in an iframe whose page sits in a subdirectory blocked from Google in robots.txt. That is not the easiest thing to achieve or maintain over the long term.
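Roughly, that setup looks like this. The /cart-widget/ directory and file names are illustrative:

```
# robots.txt — block the subdirectory that hosts the buy-button pages
User-agent: *
Disallow: /cart-widget/

<!-- on each product page, embed the buy link from the blocked directory -->
<iframe src="/cart-widget/buy.html?item=sku-123" title="Buy"></iframe>
```

The maintenance cost comes from keeping 500 iframe targets in sync with the products, which is why I'd call it hard to sustain long term.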
Personally I would not worry about your situation. PageRank has been decreasing in importance and will continue to do so as Google adds more quality signals to its algorithm. Leaking PageRank through a few links on a page is not ideal, but from my perspective it would be a low priority.
Your link, <a href="#">Buy</a>, does not pass PageRank, nor does it drop PageRank on the floor à la nofollow, even with an onclick event.
You could put the URLs in JavaScript and hope Google doesn't assign any PR to them, scramble them so they don't look like URLs and Google's heuristics don't identify them, or put all the URLs in a separate JavaScript file that is blocked by robots.txt.
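As a concrete (if crude) example of the scrambling idea: store the URLs reversed so nothing in the source looks like a path, and only decode on click. The endpoint and item ID are illustrative:

```javascript
// Reverse a scrambled string back into a usable URL.
// Stored reversed, "321-uks=meti?dda/trac/" never matches a
// URL-looking pattern in the page source.
function descramble(s) {
  return s.split('').reverse().join('');
}

const url = descramble('321-uks=meti?dda/trac/');
// url === '/cart/add?item=sku-123'
// A click handler would then fetch(url) instead of embedding the URL directly.
```

Base64 would work the same way; either is trivial for a human to spot but opaque to simple URL-extraction heuristics.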
I would consider any of that to be gray hat. I don't think any of it is specifically called out by Google as black hat, but it amounts to trying to fool Google, and Google often doesn't look kindly on such things once it puts an algorithm in place to detect them.