lucy24 - 10:08 pm on Mar 24, 2013 (gmt 0)
but will google search for a robots.txt for ssl when changing from http to https?
It doesn't need to. The rule is set up as a rewrite, so Googlebot doesn't know it's getting a different file.
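A minimal sketch of that kind of rewrite, assuming Apache with mod_rewrite and a separate file named robots_ssl.txt (both names hypothetical, adjust to your setup):

```apache
# .htaccess: serve robots_ssl.txt transparently when
# /robots.txt is requested over HTTPS. The crawler still
# sees the URL /robots.txt; only the file behind it changes.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
```

Because it's a rewrite (internal) rather than a redirect (external), no second URL is ever exposed.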
But, again, this is only necessary if you want to serve different crawling rules for http and https. If you've got your 301 redirects in place, that should be all you need in the long run.
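For reference, a typical http-to-https 301 in the same Apache/mod_rewrite setup might look like this (a sketch, not a drop-in rule; the robots.txt exclusion only matters if you're still serving separate crawl rules per protocol):

```apache
# .htaccess: permanently redirect all http requests to https,
# leaving /robots.txt alone so each protocol can keep its own rules.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```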
when indexing it might recognize it as duplicate content and treat them as one
I thought the essence of Duplicate Content was that Google doesn't recognize two pages as one-- even in cases where an ordinary human can tell at a glance that they're the same.