Forum Moderators: goodroi
A couple of months ago I decided to set up a robots.txt file to block all bots from a couple of pages on a couple of sites I run. I set it up as follows:
User-agent: *
Disallow: terms.htm
Disallow: contact.htm
However, I find that all of these pages are still being crawled. I have done a bit more research and have now changed the file to read:
User-agent: *
Disallow: /terms.htm
Disallow: /contact.htm
The robots.txt file is in the root folder and can be accessed at www.mysite/robots.txt
Was the missing leading / the problem, or would that not normally matter? Any other suggestions?
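One way to sanity-check the rules without waiting for crawlers to revisit is Python's standard-library robots.txt parser. This is a minimal sketch: it feeds the corrected rules straight into the parser and asks whether a given URL may be fetched (the example.com domain here is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Parse the corrected robots.txt rules directly, without fetching them
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /terms.htm",
    "Disallow: /contact.htm",
])

# Blocked page: parser says it may NOT be fetched
print(rp.can_fetch("*", "http://www.example.com/terms.htm"))   # False

# Any other page is still allowed
print(rp.can_fetch("*", "http://www.example.com/index.htm"))   # True
```

Running the same check against the original file (with `Disallow: terms.htm`, no leading slash) shows the rule failing to match, since Disallow paths are compared against the URL path, which always starts with /.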
Cheers for any advice,
Paul