Disallow pages in a subdirectory - Basic help please

pjuk msg:1527355 9:49 am on Jun 14, 2004 (gmt 0)

Hi - I'm new to the forum and new to robots.txt files. This is maybe a stupid question, but I want to be sure before I upload the robots.txt file.
I want to disallow robots from three pages that 'live' in a subfolder called 'pages'. I have set up the robots.txt file as follows:
Disallow: /pages/paypalconfirmed.htm
Disallow: /pages/booking form.htm
Is this correct? I do not want to disallow all files in 'pages', only the listed ones.
Also, I have saved the robots.txt file to the root directory as a plain-text (.txt) file. Is this correct?
Thanks for any advice
goodroi msg:1527356 7:02 pm on Jun 24, 2004 (gmt 0)
Yes, that is correct. Make sure to use a robots.txt validator - I've caught many typos that way. Also, as an extra precaution, you can place a noindex, nofollow robots meta tag within the HTML of those pages.

jdMorgan msg:1527357 8:40 pm on Jun 24, 2004 (gmt 0)
The code posted above may not work as expected, because the Standard for Robots Exclusion [robotstxt.org] specifies that a blank line is to be interpreted as an end-of-record indicator. Delete the blank line, and use:

User-agent: *
Disallow: /pages/paypalcanceled.htm
Disallow: /pages/paypalconfirmed.htm
Disallow: /pages/booking form.htm

As goodroi suggests, validate your robots.txt file [searchengineworld.com] before using it.
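A later note for readers: besides an online validator, you can sanity-check rules like these locally with Python's standard-library `urllib.robotparser`, which parses robots.txt rules and reports whether a given URL is crawlable. This is a minimal sketch; the `example.com` host is a placeholder, and the rules are the corrected file from the post above:

```python
# Sketch: check that only the listed pages are disallowed, using
# Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# The corrected robots.txt from the post above (no blank line
# between the User-agent record and its Disallow lines).
rules = """\
User-agent: *
Disallow: /pages/paypalcanceled.htm
Disallow: /pages/paypalconfirmed.htm
Disallow: /pages/booking form.htm
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The listed pages are blocked for all user agents...
print(rp.can_fetch("*", "http://example.com/pages/paypalconfirmed.htm"))  # False
# ...but other files in /pages/ remain crawlable.
print(rp.can_fetch("*", "http://example.com/pages/index.htm"))  # True
```

Because `Disallow` rules are prefix matches, this also confirms the original worry: only the named files are blocked, not the whole /pages/ folder.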