topr8 - 3:22 pm on May 30, 2010 (gmt 0)
Of course there are different reasons to disallow robots from accessing pages.
But in the case of 'permanently' disallowed pages, i.e. ones you would never want indexed, rather than ones you just don't want indexed right now ...
I have a disallow prefix in robots.txt, e.g.
etc. etc., so anything I want disallowed starts with that prefix:
This way I can add disallowed pages to the site without thinking about the delay before crawlers re-fetch robots.txt and pick up a new rule.
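The original example didn't survive in the post, but a minimal sketch of the idea might look like this (the `/private-` prefix is hypothetical, not from the poster):

```
# robots.txt - block everything under a reserved prefix
User-agent: *
Disallow: /private-
```

With a rule like that in place, any new page whose path starts with `/private-` is covered from the moment it goes live; there is no window where the page exists but the matching Disallow rule hasn't propagated to crawlers yet.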