TheOptimizationIdiot - 4:55 pm on Mar 3, 2013 (gmt 0)
Not sure I "get" the "cannot noindex A due to technical reasons" part?
Sounds like you need someone more "technically proficient" internally to me, because you can:

* Noindex A via an X-Robots-Tag header in httpd.conf or .htaccess, or via a server-side scripting header on the pages themselves.
* 301 redirect A to B for external requests only, in httpd.conf, .htaccess, or a server-side scripting language.
* 301 redirect all requests for A except those carrying a specific user-agent string (via httpd.conf, .htaccess, or server-side scripting), which still lets site staff in via a custom user-agent.
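To make the first and last options concrete, here's a rough .htaccess sketch. The filenames and the "StaffBot" user-agent string are made up for illustration, and it assumes mod_headers and mod_rewrite are enabled:

```apache
# Option 1: send a noindex header for page A only
# (hypothetical filename; requires mod_headers)
<Files "page-a.html">
    Header set X-Robots-Tag "noindex"
</Files>

# Option 3: 301 page A to page B for every request EXCEPT
# ones sending the custom staff user-agent string
# (hypothetical names; requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !StaffBot [NC]
RewriteRule ^page-a\.html$ /page-b.html [R=301,L]
```

Staff would then set their browser's user-agent to the custom string to reach the original page while everyone else (including search engine bots) gets the 301.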
I'm almost positive there's a way to get the noindex directive across for those pages, or at least to redirect them for certain requests only if they need to stay present and accessible for some internal reason.
I have one dynamic site I work on that saves a static version for public access, where the dynamic version to be saved is only accessible via internal request or a custom user-agent string on a specific subdomain. To make it work, the "bot" that saves the HTML pages either requests them internally via a full file path (bypassing the .htaccess that does the redirecting) or requests them externally using the custom user-agent string, depending on the exact assembly process for the pages.
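A rough sketch of that kind of gating, with placeholder names (the "BuildBot" user-agent and the example.com hostnames are made up, and it assumes mod_rewrite):

```apache
RewriteEngine On
# Only act on the dynamic subdomain
RewriteCond %{HTTP_HOST} ^dynamic\.example\.com$ [NC]
# Let the saving bot through by its custom user-agent string...
RewriteCond %{HTTP_USER_AGENT} !BuildBot [NC]
# ...and bounce everyone else to the public static site
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Internal requests that read the files by full filesystem path never hit .htaccess at all, which is why the bot can also assemble pages that way.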
If you have enough access to put a <link rel> in the <head>, you have enough access to redirect or noindex one way or another, because in a "worst case" you could 0-second meta refresh page A to page B, and to the best of my knowledge those are still treated almost exactly the same as 301s.
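For reference, that "worst case" is just one tag in page A's <head> (page-b.html is a placeholder):

```html
<meta http-equiv="refresh" content="0; url=/page-b.html">
```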