Following on from a post about spider-friendly JavaScript for collapsible content, I wondered whether anybody had intentionally used uncrawlable pieces of JS or Ajax to prevent search engines from crawling specific links.
As an example, many ecommerce solutions generate multiple links to the same product, which skews the balance of the internal linking structure. If you could restrict access to the duplicate links and allow bots to crawl only a single text link (whilst not impacting usability), the internal linking structure would be a lot healthier.
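Roughly the sort of thing I have in mind (a minimal sketch only; the URL, class name, and markup are made up for illustration): keep one plain text anchor per product for the bots, and let JavaScript make the duplicate elements (product image, "view details" button, etc.) clickable for users without adding extra href attributes to the page.

<!-- Hypothetical markup: the single link we want crawled is a plain anchor -->
<a href="/product/blue-widget">Blue Widget</a>

<!-- Duplicate "links" carry no href, so there's nothing extra for a bot to follow -->
<span class="js-nav" data-target="/product/blue-widget">
  <img src="/images/blue-widget.jpg" alt="Blue Widget">
</span>

<script>
  // Progressive enhancement: make the href-less elements navigate on click
  // for users, while the crawlable HTML still shows only one link per product.
  document.querySelectorAll('.js-nav').forEach(function (el) {
    el.style.cursor = 'pointer';
    el.addEventListener('click', function () {
      window.location.href = el.dataset.target;
    });
  });
</script>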
For the greater good, make certain areas uncrawlable. Thoughts?