Forum Moderators: phranque


robots.txt files for subdomains


magnum

2:21 pm on Jul 29, 2005 (gmt 0)

10+ Year Member



Hi,

I host subdomains with the following setup in my httpd.conf file:
==============================
<VirtualHost *:80>
RewriteEngine on
RewriteMap lc int:tolower
RewriteCond %{HTTP_HOST} !\.domain\.com [NC]
RewriteRule .* - [L]
RewriteCond %{HTTP_HOST} ^www\.domain\.com [NC]
RewriteRule .* - [L]
RewriteCond %{HTTP_HOST} ^([^.]+)\.domain\.com [NC]
RewriteRule ^/(.*) ${lc:%1/$1} [C]
RewriteRule ^([^/]*)/(.*) /$1/$2 [L]
</VirtualHost>
=============================

but the subdomains are not being indexed by popular search engines such as Google. How can I fix this?

Regards

jdMorgan

3:58 pm on Jul 29, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you would mark up your code with comments stating what you want/expect each directive to do, it would be much easier to discuss.

But I'm not sure why you think this is affecting your search results, or how to interpret the contents of your post in the context of its title.

Ticking off the possibilities one at a time, this post does not appear to be about robots.txt. If it is, then place a robots.txt file in each subdomain's subdirectory.
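For example, with your rewrite rules a request for http://sub1.domain.com/robots.txt maps to the file /sub1/robots.txt under the document root, so each subdomain's directory can carry its own file (directory name and contents here are hypothetical):

==============================
# /sub1/robots.txt -- example for one subdomain
User-agent: *
Disallow: /private/
==============================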

If you want your subdomains' pages to show up in search results, then you must get the search engines to spider those pages by getting links to them from your sites and from other sites.

In your posted code, the only line that looks suspect is the last rule: I'm not sure why it would ever be needed for normally-formed client requests. It just adds a leading slash if one is missing, and it shouldn't 'hurt' anything to have it there.
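For illustration, here is a quick trace of your posted rules against a sample request (hostname hypothetical):

==============================
# Request: http://sub1.domain.com/page.html
# 1. Host contains .domain.com, so the first cond fails
#    and the first pass-through rule does not fire.
# 2. Host is not www.domain.com, so the second rule does not fire.
# 3. Third cond captures "sub1"; the rule rewrites /page.html to
#    ${lc:sub1/page.html} = sub1/page.html (lowercased, no leading
#    slash), then chains [C] to the next rule.
# 4. The last rule restores the leading slash: /sub1/page.html [L]
==============================

So the last rule only exists to put back the slash dropped by the RewriteMap substitution in step 3.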

Jim