Have you ever run a spider routine on your site? Can you honestly say that you know "all about" the internal linking structure of your website and everything that is going on? Are you sure you didn't miss a few redirects here and there?
I've been coming across sites that are serving the bots additional redirect layers. Not intentionally, but out of a lack of quality control. For example, one site I recently reviewed had six 301 redirects firing from its main navigation alone. "Oh, we redirected those pages a few months ago and forgot to update those links." Really?
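One quick way to catch this is to crawl your own navigation links and flag any that don't resolve in a single hop. Here's a minimal sketch; the `find_redirect_chains` function and the `fetch` callable are hypothetical illustrations, not a specific tool mentioned in this post. In practice, `fetch` would wrap an HTTP HEAD request and return the status code and `Location` header.

```python
from urllib.parse import urljoin

def find_redirect_chains(base_url, links, fetch):
    """Follow each internal link and report any redirect hops it takes.

    `fetch(url)` is assumed to return (status_code, location_or_none),
    e.g. a thin wrapper around an HTTP HEAD request.
    Returns a dict mapping each redirecting href to its list of hops.
    """
    chains = {}
    for href in links:
        url = urljoin(base_url, href)
        hops = []
        status, location = fetch(url)
        # Keep following 3xx responses until we hit a final page.
        while status in (301, 302, 307, 308) and location:
            hops.append((status, location))
            url = urljoin(url, location)
            status, location = fetch(url)
            if len(hops) > 10:  # guard against redirect loops
                break
        if hops:
            chains[href] = hops
    return chains

# Simulated site: /old 301s to /new, /home resolves directly.
pages = {
    "https://example.com/old": (301, "/new"),
    "https://example.com/new": (200, None),
    "https://example.com/home": (200, None),
}
result = find_redirect_chains(
    "https://example.com/", ["/old", "/home"], lambda u: pages[u]
)
print(result)  # {'/old': [(301, '/new')]}
```

Run something like this against every link in your templates and any non-empty result is a link that should be updated to point at the final URL.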
Do you think that having those extra redirect layers in your internal linking could cause problems?