Forum Moderators: open
First I need to explain something. Crawlers exist just to follow links. That is their sole task in life. If you have a link they can see then they will follow it. Period.
The proper way to tell it that it isn't wanted is to use the Robots Exclusion Standard protocols. Either prevent crawling of the page the links are on (using the robots.txt file) or stop them from following the links (though they will still know the links are there) using the robots META tag:
<META NAME="robots" CONTENT="nofollow">
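For completeness, the robots.txt side of that looks like the sketch below. The `/private/` path is just a placeholder, not anything from this thread:

```
# robots.txt, served at the site root. Compliant crawlers will not
# fetch anything under /private/ at all:
User-agent: *
Disallow: /private/
```

Note the difference: robots.txt stops the page being fetched in the first place, while the META tag above lets the page be crawled but asks robots not to follow the links on it.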
That said, crawlers are not browsers. Most have limitations on what they can (or choose to) interpret.
Two of the things they generally don't interpret can be used to provide links to your human visitors that the crawlers won't (currently) follow.
1. Forms
2. JavaScript
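As a rough illustration of those two techniques (the URL and link text are placeholders, not anything from this thread), a form-based link and a script-written link might look like:

```html
<!-- 1. A form: crawlers generally don't submit forms, but a human
     visitor can click the button to reach the page. -->
<form action="/members/index.html" method="get">
  <input type="submit" value="Members area">
</form>

<!-- 2. JavaScript: the link only exists once the script runs, so a
     crawler that doesn't execute JavaScript never sees it. -->
<script type="text/javascript">
  document.write('<a href="/members/index.html">Members area</a>');
</script>
```

Neither is guaranteed to hide anything forever; as noted above, these are simply things most crawlers don't currently interpret.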
But beyond all that, I have to tell you the most important thing of all. If you don't want the links detected - don't include them. ALL tricks can be detected, and once detected the engine knows you're prepared to trick and deceive it to unfairly inflate rankings or avoid earned penalties. That's a fast, one-way ticket to getting your domain or IP banned.
That is the traditional approach - the whole link will be missing if JavaScript is not enabled in the browser.
There is no trickery whatsoever.
You'd actually be better off just linking to the page (especially if, as you say, it is just one link on one part of one page). They are less likely to penalize that than to heavily penalize any detected trick.
Ammon Johns
Internet Marketing Consultant