
slurp ignoring disallow?

-or have i got it wrong?


tallis

10:18 am on Jun 20, 2005 (gmt 0)

10+ Year Member



I have a link to make a comment on every page, which I'd prefer not be indexed, and I've been using this:

User-agent: *
Disallow:
Disallow: comment.php

But I've noticed that Slurp is still visiting those pages. Have I got it wrong, or is there some other way to prevent them from being indexed?

They're mostly in different directories from each other, so I'm not sure if I'm supposed to be making that clear somehow. Any ideas?

tallis

Clint

10:34 am on Jun 20, 2005 (gmt 0)



I think you must put a slash (/) before the path, and you have Disallow twice. It should look like this:

User-agent: *
Disallow: /comment.php
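For what it's worth, this is easy to check with Python's standard-library robots.txt parser (an illustrative sketch, not part of the original post): a Disallow path without the leading slash never matches any URL, so the rule is effectively ignored.

```python
from urllib.robotparser import RobotFileParser

# The original rule, with no leading slash on the path
broken = RobotFileParser()
broken.parse(["User-agent: *", "Disallow: comment.php"])

# The corrected rule
fixed = RobotFileParser()
fixed.parse(["User-agent: *", "Disallow: /comment.php"])

# Without the slash, the path prefix never matches, so the page
# stays crawlable:
print(broken.can_fetch("*", "http://example.com/comment.php"))  # True
# With the slash, the root-level comment.php is blocked:
print(fixed.can_fetch("*", "http://example.com/comment.php"))   # False
```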

tallis

11:49 am on Jun 20, 2005 (gmt 0)

10+ Year Member



Oops, that second Disallow is a pasting error. ;-/

Thanks for that. Do you know if having a slash in front of the filename stops it from being spidered no matter how deep it is in the directories? Or do I need to specify somehow that it's several levels deep?

tallis

Clint

12:09 pm on Jun 20, 2005 (gmt 0)



In that example, it will stop the bots from spidering ONLY root/comment.php. It's just like any other path: if that page is located anywhere besides yourdomain.com/comment.php and you don't want it spidered, then you have to add each path.

User-agent: *
Disallow: /comment.php
Disallow: /whatever/comment.php
Disallow: /whatever/whatever/comment.php
Disallow: /whatever2/comment.php

etc.
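The prefix-matching behaviour described above can be verified with Python's standard-library robots.txt parser (a sketch for illustration; like the original robots.txt rules it treats Disallow as a plain path prefix, with no wildcards):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /comment.php
Disallow: /whatever/comment.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The root-level comment.php is blocked...
print(rp.can_fetch("*", "http://example.com/comment.php"))           # False
# ...and so is the explicitly listed nested copy...
print(rp.can_fetch("*", "http://example.com/whatever/comment.php"))  # False
# ...but a copy in a directory that isn't listed stays crawlable,
# which is why each path has to be added separately.
print(rp.can_fetch("*", "http://example.com/other/comment.php"))     # True
```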