Help with Disallow

BillyS msg:1529306 3:56 am on Nov 14, 2005 (gmt 0)

Not sure if I got this right... I just consolidated several sections, and I don't want crawlers to index those (now removed) pages anymore.
The old URL was of the form:
If I put in this:
That should stop indexing of these pages (right?)
MarkHutch msg:1529307 4:16 am on Nov 14, 2005 (gmt 0)
From your example, your robots.txt file should say:

That is, if the section is a real subdirectory. If it's not, then you are correct.
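For reference, a minimal robots.txt blocking a subdirectory, as MarkHutch describes, looks like this (using the /section/7 path mentioned later in the thread; the poster's original example did not survive, so the path here is illustrative):

```
User-agent: *
Disallow: /section/7/
```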
Dijkgraaf msg:1529308 8:52 pm on Nov 14, 2005 (gmt 0)
MarkHutch is correct, but you would be even better off with Disallow: /section/7 without the trailing slash, just in case there is a link pointing to that directory that serves the default page. That is, as long as there isn't another directory, e.g. /section/71, that you still want indexed :-)
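Dijkgraaf's caveat follows from robots.txt rules being plain prefix matches. A quick sketch with Python's standard urllib.robotparser shows both effects, dropping the trailing slash also catches the bare directory URL, but it additionally blocks any sibling path sharing the prefix (the /section paths are the hypothetical ones from this thread):

```python
from urllib import robotparser

# Build a parser from an in-memory robots.txt with no trailing slash.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /section/7",
])

# Pages under the directory are blocked.
print(rp.can_fetch("*", "http://example.com/section/7/page.html"))  # False
# The bare directory URL (no trailing slash) is blocked too.
print(rp.can_fetch("*", "http://example.com/section/7"))            # False
# Side effect: /section/71 shares the prefix, so it is also blocked.
print(rp.can_fetch("*", "http://example.com/section/71/page.html")) # False
# Unrelated sections remain crawlable.
print(rp.can_fetch("*", "http://example.com/section/8/"))           # True
```

If a sibling like /section/71 must stay indexable, keeping the trailing slash (Disallow: /section/7/) and adding a second rule for the slash-less URL is the safer pattern.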