Forum Moderators: goodroi
I know how to block individual pages from being crawled, but since I have more pages to block than to allow, I figured it would be easier to do the opposite: block everything by default and only allow a few pages.
thanks in advance
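One way to do this is with an `Allow:` directive, which is a sketch of the "deny everything, allow a few" approach. Worth noting: `Allow:` was not part of the original robots.txt standard, but the major crawlers (Googlebot, Bingbot) honor it, and Google resolves conflicts by most-specific match rather than rule order. The paths here are just placeholders for your own:

```
User-agent: *
Disallow: /
Allow: /index.html
Allow: /about.html
Allow: /products/
```

Test it against the specific crawlers you care about before relying on it, since older or niche bots may not understand `Allow:` and could treat the whole site as blocked.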
If you do want to list individual pages in your robots.txt file, be careful that the file doesn't get too big. I once had a client whose robots.txt was several hundred KB, and the spiders had a hard time reading it. Avoid extreme sizes and you'll be OK.
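One way to keep the file small is to group blocked pages under a few directories and use prefix rules instead of listing every page. A single directory rule covers everything beneath it (the directory names below are made up for illustration):

```
User-agent: *
Disallow: /private/
Disallow: /drafts/
Disallow: /tmp/
```

Three lines like these can replace hundreds of per-page entries, which keeps the file well under any size limits the crawlers might have.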