Forum Moderators: phranque
Are you sure you want a robots.txt solution, or would you be better off invoking a 301 redirect for some requests?
In many cases, the SSL and non-SSL "sites" are served from different directories on the server. In this case, it's a simple matter of putting two robots.txt files on the server, one in each directory.
However, if the SSL and non-SSL content are both served from the same directory, then you'll need one of two things: a dynamically-generated robots.txt (e.g. a script that tests whether the request arrived via SSL), or a secondary robots.txt file with a different name, such as robotsSSL.txt, plus a RewriteRule that rewrites requests for the robots.txt URL to that secondary file when the request is made via SSL.
Jim
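The dynamically-generated option mentioned above could be sketched as a small CGI script. This is a minimal illustration, not code from this thread; the crawl policies shown and the HTTPS environment check are assumptions based on a typical Apache CGI setup, where Apache would be configured to route robots.txt requests to the script.

```python
#!/usr/bin/env python3
# Hypothetical sketch: generate robots.txt content dynamically,
# varying it by whether the request came in over SSL.
# Under CGI, Apache sets the HTTPS environment variable to "on"
# for SSL requests.
import os

def robots_body(is_ssl: bool) -> str:
    """Return robots.txt content appropriate for the request type."""
    if is_ssl:
        # Example policy: block all crawling of the SSL "site".
        return "User-agent: *\nDisallow: /\n"
    # Example policy: allow normal crawling of the non-SSL site.
    return "User-agent: *\nDisallow:\n"

if __name__ == "__main__":
    is_ssl = os.environ.get("HTTPS", "").lower() == "on"
    print("Content-Type: text/plain")
    print()
    print(robots_body(is_ssl), end="")
```

The same idea works in any language the server can run; the only essential step is branching on whether the request was made via SSL.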
I think I did find some code. I'm not extremely technical so can you tell me if this is what I need?
RewriteEngine on
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
What is this doing? Is the secure server always on port 443? Does this then rewrite robots.txt to robots_ssl.txt if it is on the secure server?
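Port 443 is the conventional HTTPS port, so the condition in the snippet above will match on most setups, but it can fail when SSL runs on a non-standard port or is terminated elsewhere. A variant sometimes used instead tests Apache's HTTPS variable directly; this is a sketch assuming Apache 2.x with mod_rewrite and mod_ssl, not a tested configuration for this particular server:

```apache
RewriteEngine On
# Serve robots_ssl.txt in place of robots.txt when the request
# arrived over SSL (the HTTPS variable is "on" for such requests).
RewriteCond %{HTTPS} =on
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
```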
@jdMorgan, no, the SSL files are not in a separate directory; they are served from the same place.
Note the use of the terms "URL" and "file" above. Making this distinction is quite important to avoiding confusion.
Jim