Forum Moderators: goodroi
I want to block this page: http://www.example.com/cgi-bin/pseek/dirs.cgilv=2&ct=category_widgets
But want to keep this page: http://www.example.com/cgi-bin/pseek/dirs2.cgi?cid=147
Would this work to block the first URL without hurting the second one?
User-agent: *
Disallow: /cgi-bin/pseek/dirs.cgilv
Or would it be better to write out the full URL for each page I want to block, like this:
User-agent: *
Disallow: /cgi-bin/pseek/dirs.cgilv=2&ct=category_widgets
I need to be very careful not to block the second URL (dirs2.cgi). Is there any danger that either of the above robots.txt Disallow rules would block it?
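One way to check this before going live is Python's standard urllib.robotparser module, which applies the same prefix matching most crawlers use for plain Disallow lines. This is a quick sketch assuming the shorter rule from the first example; the two test URLs are the ones from the post.

```python
from urllib import robotparser

# The shorter prefix rule from the first example above.
rules = [
    "User-agent: *",
    "Disallow: /cgi-bin/pseek/dirs.cgilv",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# URL that should be blocked:
blocked = "http://www.example.com/cgi-bin/pseek/dirs.cgilv=2&ct=category_widgets"
# URL that must stay crawlable:
kept = "http://www.example.com/cgi-bin/pseek/dirs2.cgi?cid=147"

print(rp.can_fetch("*", blocked))  # False -> first URL is disallowed
print(rp.can_fetch("*", kept))     # True  -> dirs2.cgi is still allowed
```

Because Disallow does plain prefix matching, "/cgi-bin/pseek/dirs.cgilv" only matches paths that begin with that exact string; "dirs2.cgi" differs at the character after "dirs", so it never matches.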