As an example, many ecommerce solutions generate multiple links to the same product, which throws off the balance of the internal linking structure. If you could restrict access to the duplicate links and allow bots to crawl only a single text link (whilst not impacting usability), the internal linking structure would be a lot healthier.
For the greater good, make certain areas uncrawlable. Thoughts?
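Something along these lines in robots.txt would be one way to do it. The parameter names here are made up for illustration, and the real patterns depend on how the platform builds its duplicate URLs (also worth noting that wildcards are honoured by Google and Bing, not necessarily by every crawler):

    User-agent: *
    # Block parameterised duplicates so only the clean product URL gets crawled
    Disallow: /*?sort=
    Disallow: /*?sessionid=
    Disallow: /*?ref=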
All my websites (except where a customer requests otherwise) block or remove links to certain pages (e.g. contact forms) if a bot of any kind is detected, in addition to the page being blocked in robots.txt. If the bot hits the unlinked page anyway, it gets a 405 back.
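I'm not saying this is exactly how that setup works, but a rough sketch of the idea in Python/Flask, assuming a simple user-agent check (the bot signatures and the /contact route are purely illustrative):

    from flask import Flask, abort, render_template_string, request

    app = Flask(__name__)

    # Illustrative list of crawler user-agent fragments; a real list would be longer
    # and probably combined with other detection signals.
    BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")

    def is_bot(user_agent):
        ua = (user_agent or "").lower()
        return any(sig in ua for sig in BOT_SIGNATURES)

    @app.context_processor
    def bot_flags():
        # Templates can check {{ show_contact_link }} and simply omit the link for bots.
        return {"show_contact_link": not is_bot(request.headers.get("User-Agent", ""))}

    @app.route("/contact")
    def contact():
        # The page is also disallowed in robots.txt; if a bot reaches it anyway,
        # refuse it with a 405 instead of serving the form.
        if is_bot(request.headers.get("User-Agent", "")):
            abort(405)
        return render_template_string("<h1>Contact form</h1>")

The point is that human visitors still see and use the contact link as normal, while bots never get shown the link and get turned away if they request the URL directly.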