TheMadScientist - 3:52 am on Nov 7, 2012 (gmt 0)
Unless you purposely separate the www and non-www hostnames (which you would know you did, because you have to deliberately change the server settings), they serve exactly the same content, including robots.txt, because by default they run out of the same directory on the server in any hosting account I know of where you don't have to 'do it yourself'. So how can you 'screw that up' or 'confuse a bot'?
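As a sketch of that default setup (the hostnames and paths here are made up for illustration): a typical Apache virtual host maps both hostnames to a single DocumentRoot via ServerAlias, so a request for /robots.txt on either hostname reads the same physical file:

```apache
# Hypothetical default vhost: both hostnames share one docroot,
# so GET /robots.txt returns the identical file for either host.
<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    DocumentRoot /var/www/example/public_html
</VirtualHost>
```

To serve two different robots.txt files you would have to split this into two separate vhosts with two separate docroots, which is exactly the kind of deliberate change you couldn't make by accident.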
In other words, you cannot have two different robots.txt files for www and non-www unless you purposely make it so you can, meaning: canonicalization should not matter a bit in this situation.
You're not going to 'confuse a bot' by having the same robots.txt file for both the www and non-www versions of the domain, and you can't accidentally 'screw it up' yourself either; if you could, you would know to check both, because it takes a higher knowledge level to serve separate files from each than it does for them to serve duplicates of each other. AND if you had set them up to not serve duplicates of each other, you would NOT want to canonicalize them, because that would defeat the purpose of serving different files from each. (Say you know something about site speed and want to serve your 'cookieless' files from the non-www and files requiring a cookie from the www, to keep the upstream requests from the browser down, or something crazy like that.)
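To make the point concrete, here's a small sketch using Python's standard urllib.robotparser (the hostnames and the robots.txt body are made up for illustration): feed the same file to a parser for each hostname, as a crawler effectively does when both hosts serve one file, and the rules come out identical, so there's nothing for a bot to be 'confused' about:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body: on a default host setup, both
# http://example.com/robots.txt and http://www.example.com/robots.txt
# return this same file, because both hostnames map to one docroot.
ROBOTS_TXT = [
    "User-agent: *",
    "Disallow: /private/",
]

def parser_for(host: str) -> RobotFileParser:
    """Build the parser a crawler would end up with after fetching
    robots.txt from `host` (both hosts serve the identical file)."""
    rp = RobotFileParser(url=f"http://{host}/robots.txt")
    rp.parse(ROBOTS_TXT)
    return rp

for host in ("example.com", "www.example.com"):
    rp = parser_for(host)
    # Same file in, same answers out for either hostname.
    print(host, rp.can_fetch("*", f"http://{host}/private/page.html"))
```

Whichever hostname the bot fetched from, it ends up applying the exact same Disallow rules.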
Two separate robots.txt files for www and non-www are like serving a 410 error: they don't 'just happen'. You can't even 'accidentally upload' one with an error and one without on the different versions of www / non-www, unless you've purposely made it so you can, and if you made it so you could, there would be a reason behind it.
Sorry, but canonicalization is neither the issue nor the answer for this one.