I'm looking for a way to keep search engines from crawling the HTTPS versions of my HTML pages. Currently my best bet is to do it via mod_rewrite, but maybe there's an easier way via robots.txt? Thanks in advance!
8:45 pm on Apr 9, 2005 (gmt 0)
Yes, you can do that, but keep in mind that crawlers fetch robots.txt separately for each protocol, so the HTTPS site needs to serve its own robots.txt. Make sure that file is located in the root of [example.com...]
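For the mod_rewrite route, one possible sketch (assuming Apache with mod_rewrite enabled; the file name robots_ssl.txt is purely illustrative) is to serve a disallow-everything robots file only when the request arrives over HTTPS:

```apache
# Hypothetical .htaccess sketch: when the request comes in over HTTPS,
# answer /robots.txt with a separate file that blocks all crawling.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots_ssl.txt [L]
```

Here robots_ssl.txt would simply contain:

```
User-agent: *
Disallow: /
```

That way the plain-HTTP site keeps its normal robots.txt, while the HTTPS version tells crawlers to stay out.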