Forum Moderators: goodroi
Typically, pages are generated thusly:
and so on...
I want all four of the 'id=1' pages to be crawled, but only the first 'id=2' page. Can this be done, and if so, how would it be written in a robots.txt file?
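Since the example URLs above didn't survive, here is a sketch assuming a hypothetical URL structure like `/index.php?id=2&page=1`, `/index.php?id=2&page=2`, and so on. It relies on the `Allow` directive and the `$` end-of-URL anchor, which are extensions honored by major crawlers such as Googlebot and Bingbot but are not part of the original robots.txt standard, so smaller bots may ignore them. Adjust the paths to match your actual URLs.

```
User-agent: *
# Block every id=2 page (prefix match)...
Disallow: /index.php?id=2
# ...then re-allow only the first one. The $ anchors the match
# at the end of the URL; the more specific Allow rule wins for
# crawlers that follow Google's precedence rules.
Allow: /index.php?id=2&page=1$
```

Nothing is needed for the 'id=1' pages: anything not disallowed is crawlable by default. It's worth verifying the result in Google Search Console's robots.txt tester before relying on it.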
PS: it is nice to say please and thank you when you are asking people to help you write your robots.txt.