Welcome to WebmasterWorld
Help with Disallow
Not sure if I got this right...

BillyS
I just consolidated several sections, and I don't want crawlers to index the removed pages anymore.
The old URL was of the form:
If I put in this:
That should stop indexing of these pages (right?)
MarkHutch
From your example, your robots.txt file should say...
That applies if the section is a real subdirectory. If it's not, then you are correct.
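For concreteness, the kind of robots.txt rule being described might look like this (using the /section/7 path mentioned later in the thread; swap in your actual section path):

```text
User-agent: *
Disallow: /section/7/
```

With the trailing slash, the rule only matches URLs under that directory.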
MarkHutch is correct, but you would be even better off with Disallow: /section/7, without the trailing slash, just in case a link pointed at the bare directory URL and it served the default page. That, of course, holds only as long as there isn't another directory, e.g. /section/71, that you still want indexed :-)
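The trailing-slash behavior can be checked with Python's standard-library robots.txt parser, which uses the same prefix matching most crawlers apply. A small sketch (example.com and the /section/7 paths are placeholders; substitute your own site):

```python
# Compare the two Disallow variants discussed above using the stdlib parser.
from urllib import robotparser

def blocked(rules, url):
    """Return True if a generic crawler ('*') would be disallowed from url."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules)  # rules is a list of robots.txt lines
    return not rp.can_fetch("*", url)

with_slash = ["User-agent: *", "Disallow: /section/7/"]
no_slash = ["User-agent: *", "Disallow: /section/7"]

# Both variants block pages inside the old section:
print(blocked(with_slash, "http://example.com/section/7/page.html"))  # True
# Only the version without the trailing slash also blocks the bare
# directory URL that might serve the default page:
print(blocked(with_slash, "http://example.com/section/7"))  # False
print(blocked(no_slash, "http://example.com/section/7"))    # True
# The caveat: prefix matching means /section/71 gets caught too:
print(blocked(no_slash, "http://example.com/section/71/index.html"))  # True
```

This is why the no-slash rule is safer for catching the bare directory URL, but wrong if sibling paths share the prefix.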