|Robots.txt and dynamic pages|
| 7:58 pm on Jan 5, 2010 (gmt 0)|
I have inherited responsibilities for a dynamically driven site. It was not set up with robots.txt in mind, and changes cannot be made to correct that.
Typically, pages are generated like this:
and so on...
I want all four of the 'id=1' pages to be crawled, but I only want the first 'id=2' page crawled. Can this be done, and if so, how would that be written in a robots.txt file?
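[Editor's note: the original URL examples were lost from this post. Assuming a hypothetical layout like /view.php?id=N&page=M, a sketch of the kind of rules major crawlers that support wildcards (e.g. Googlebot) would interpret might look like this; the exact patterns depend on the real parameter names and order.]

```
User-agent: *
# Block every id=2 URL... (hypothetical path/parameter layout)
Disallow: /*id=2
# ...except the first page; '$' anchors the end of the URL.
# For Googlebot, the more specific (longer) rule wins, and Allow
# wins on a tie, so page 1 stays crawlable.
Allow: /*id=2&page=1$
```

Note that `*` wildcards and `Allow` are extensions beyond the original 1994 robots.txt standard; crawlers that only implement the original spec will not honor them.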
| 1:52 pm on Jan 7, 2010 (gmt 0)|
The answer to your question is yes, it can be done. If you research the numerous posts already covering this here on WebmasterWorld, you will learn how to write your own robots.txt.
P.S. It is nice to say please and thank you when you are asking people to help you write your robots.txt.
| 8:03 am on Feb 14, 2010 (gmt 0)|
Yeah, it can be done; actually it's pretty easy. Just do what goodroid said.
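[Editor's note: a minimal, self-contained sketch of how wildcard rules like the ones discussed above resolve, assuming Google-style semantics (longest matching pattern wins, Allow wins a tie) and the hypothetical /view.php?id=N&page=M layout. The rule patterns and URLs here are illustrative, not from the original thread.]

```python
import re

def rule_to_regex(path):
    # Convert a robots.txt path pattern to a regex:
    # '*' matches any character sequence, a trailing '$' anchors the end.
    anchored = path.endswith("$")
    body = path[:-1] if anchored else path
    pattern = ".*".join(re.escape(part) for part in body.split("*"))
    return re.compile("^" + pattern + ("$" if anchored else ""))

# (is_allow_rule, pattern) pairs -- hypothetical URL layout
RULES = [
    (False, "/*id=2"),           # Disallow all id=2 pages...
    (True,  "/*id=2&page=1$"),   # ...but Allow the first one.
]

def is_allowed(url_path):
    # Google-style resolution: longest matching pattern wins;
    # on a tie, Allow beats Disallow. No match at all means allowed.
    best_len, best_allow = -1, True
    for allow, pattern in RULES:
        if rule_to_regex(pattern).match(url_path):
            n = len(pattern)
            if n > best_len or (n == best_len and allow):
                best_len, best_allow = n, allow
    return best_allow
```

With these rules, `/view.php?id=1&page=3` and `/view.php?id=2&page=1` come back allowed, while `/view.php?id=2&page=2` is blocked, which matches the behavior the original poster asked for.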
All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld ® and PubCon ® are registered trademarks of Pubcon Inc.
© Pubcon Inc. 1996-2012 all rights reserved