The first option is to get the destination page blocked by robots.txt on their server, so compliant crawlers never fetch it even when they see your link. That might be easier said than done.
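If the destination site's owner agreed, a single Disallow rule in their robots.txt would cover it. The path here is just a placeholder:

```
User-agent: *
Disallow: /landing-page.html
```

Note this only keeps the page from being crawled; engines may still index the bare URL if enough other sites link to it.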
The second is to add rel="nofollow" to the link. That should keep most spiders from following it, but not all of them honor it.
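The attribute goes on the anchor tag itself; the URL below is just an example:

```html
<a href="https://example.com/landing-page" rel="nofollow">some link text</a>
```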
The third is to link to a page that is blocked by your own robots.txt, and have that page redirect the visitor to the real target. For example, create a script called redirect.php and disallow it in your robots.txt; redirect.php then issues a redirect to the real landing page. That should stop most of the legitimate bots.
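A minimal sketch of such a redirect.php, assuming the real landing page lives at /real-target.html (a placeholder), with `Disallow: /redirect.php` added to your robots.txt:

```php
<?php
// redirect.php -- disallowed in robots.txt, so compliant
// crawlers never fetch it and never see where it leads.
// 302 marks the redirect as temporary; use 301 instead
// if you want a permanent redirect.
header('Location: /real-target.html', true, 302);
exit;
```

Bots that ignore robots.txt will still follow the chain, so this filters out well-behaved crawlers only.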