Forum Moderators: goodroi
So is it possible to have a robots.txt file use the following code:
User-agent: *
Disallow: <our UK domain>/
Disallow: <our AU domain>/
Disallow: <our CA domain>/
Or does robots.txt ignore any domain information and just look at what comes after the /? It's very important that we don't ruin our US rankings.
Yes, robots.txt ignores any domain information; only server-local URL-paths can be specified.
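For example, a robots.txt served from any one of your hosts can only use path-only rules like this (the path here is hypothetical); it cannot name another domain:

```
User-agent: *
Disallow: /private/
```

So to block crawling of the UK/AU/CA sites entirely while leaving the US site alone, each of those hosts would need to be served its own robots.txt containing Disallow: / rather than one shared file listing domains.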
If you have the technology to "change the prices" between domains, you likely also have the technology to serve a different robots.txt per domain... I suspect the right questions are not being asked.
Jim
I'll bet you can easily find someone who *can* do individual files -- Good help is cheap in an economic downturn, something that "no-can-do" people should bear in mind... ;)
There are many ways to do it:

Use mod_rewrite or ISAPI Rewrite to internally rewrite robots.txt URL requests to different files, based on the Host header sent with the client HTTP request.

Or again, use a rewrite engine to pass all robots.txt requests to a Perl or PHP script which can generate different robots.txt content, again based on the Host header sent with the HTTP request.

Or build this function into the script you use to generate your custom 404 error page contents, and let the robots.txt requests activate that script as well, with that script producing the robots.txt content (and a proper 200-OK server status response).
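As a sketch of the script-generation option (using Python rather than Perl/PHP purely for illustration; the hostnames below are hypothetical stand-ins for the real domains), a rewrite rule would hand every robots.txt request to a script that keys its output on the Host header, something like:

```python
#!/usr/bin/env python3
# Sketch: emit different robots.txt content per requesting hostname.
# The hostnames here are hypothetical stand-ins for the real domains.
import os

DISALLOW_ALL = "User-agent: *\nDisallow: /\n"   # keep all engines out
ALLOW_ALL = "User-agent: *\nDisallow:\n"        # empty Disallow = allow everything

ROBOTS_BY_HOST = {
    "www.example.co.uk": DISALLOW_ALL,
    "www.example.com.au": DISALLOW_ALL,
    "www.example.ca": DISALLOW_ALL,
}

def robots_for(host: str) -> str:
    """Return the robots.txt body for a given Host header value."""
    # Strip any :port suffix and lowercase before the lookup;
    # unknown hosts (e.g. the US site) fall through to allow-all.
    return ROBOTS_BY_HOST.get(host.split(":")[0].lower(), ALLOW_ALL)

if __name__ == "__main__":
    # In a CGI context the client's Host header arrives as HTTP_HOST.
    body = robots_for(os.environ.get("HTTP_HOST", ""))
    print("Content-Type: text/plain\r\n\r\n" + body)
```

The script answers with a normal 200-OK and text/plain body, so crawlers see an ordinary robots.txt on every domain while you maintain the rules in one place.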
Jim