Forum Moderators: goodroi

Separate robots.txt file for secure site

How do I block one and not the other?


Vimes

6:18 am on Jun 9, 2005 (gmt 0)

10+ Year Member



Hi,

I'm on IIS 6 and I need to stop my secure site from being crawled via robots.txt.
Does anyone know a way of having a separate robots.txt file for my https site?
At the moment I've got a duplicate-content problem, as Gbot is indexing my secure site, but the same robots.txt file is served for both http:// and https://, so I can't use
User-agent: *
Disallow: /
as that would destroy my non-secure pages' SERPs.

Anyone have any ideas on how I can do this?

Vimes.

Lord Majestic

3:29 pm on Jun 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Create a script named robots.txt that prints the correct copy of robots.txt depending on whether the request was made for the secure version of the site or not.

I also thought Googlebot was not indexing secure pages.
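[Editor's note: the suggestion above can be sketched in Python as a minimal WSGI handler — a hypothetical illustration of the technique, not the poster's actual IIS 6 setup, where the same idea would typically be done with a classic ASP script mapped to /robots.txt. The handler returns a blocking robots.txt only when the request came in over HTTPS.]

```python
def robots_app(environ, start_response):
    """Serve a scheme-dependent robots.txt (hypothetical sketch).

    Blocks all crawling on the HTTPS mirror while leaving the
    plain-HTTP site fully crawlable.
    """
    if environ.get("wsgi.url_scheme") == "https":
        # Secure mirror: keep all crawlers out to avoid duplicate content.
        body = b"User-agent: *\nDisallow: /\n"
    else:
        # Non-secure site: an empty Disallow permits full crawling.
        body = b"User-agent: *\nDisallow:\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

On the server, /robots.txt would then be routed to this handler instead of a static file, so each scheme sees its own version at the same URL.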

Vimes

3:34 am on Jun 10, 2005 (gmt 0)

10+ Year Member



Thank you,

sometimes I can't see the wood for the trees..

Vimes.