Forum Moderators: goodroi
The site has a search function that generates URLs like this:
example.com/Products?search=term
How do I disallow this?
The site also has a sort function that sorts products on a page. The URL generated is as follows:
example.com/Products?search=term&sort1desc=F&sort1=Item_NAME&range=
How do I disallow robots from crawling these pages? Please let me know what I should add to the robots.txt file. I would be really obliged for the help.
I had a similar problem. Not sure if this will help, but check out this thread:
[webmasterworld.com...]
sort1desc=F&sort1=Item_NAME&range=productname1
sort1desc=F&sort1=Item_NAME&range=productname2
sort1desc=F&sort1=Item_NAME&range=productname3 etc.
Because if you just disallow sort1desc=F&sort1=Item_NAME&range=, you will still have URLs like the ones above, I think, so you might have to try the solution in the link I posted.
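If the crawlers you care about support wildcards in robots.txt (Googlebot and Bingbot do; the original robots.txt standard does not), one pattern-based sketch for this site would be:

```
User-agent: *
Disallow: /Products?search
Disallow: /*sort1=
```

The /*sort1= rule matches any URL containing sort1=, regardless of which product name follows in range=, so you don't have to list every variant.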
Disallow: /Products?search will block any URL starting with that string (presumably, that would cover all search result pages).
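A quick way to sanity-check a prefix rule like that is Python's urllib.robotparser (note it only does literal prefix matching, not wildcards); the example.com URLs are just the ones from the question:

```python
from urllib import robotparser

# The rule suggested above, as a minimal robots.txt
rules = [
    "User-agent: *",
    "Disallow: /Products?search",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Search-result URLs fall under the Disallow prefix...
print(rp.can_fetch("*", "http://example.com/Products?search=term"))  # False
# ...while the plain listing page stays crawlable.
print(rp.can_fetch("*", "http://example.com/Products"))              # True
```
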