Hi,
I have an .asp page that I want to block, as Google is listing it as a soft 404 with duplicate content. The page is referenced, for example, as:
review_form.asp?model1
review_form.asp?model2
review_form.asp?model3 ... and so on.
Each of these pages is being flagged by Google as a duplicate, so I want to block robots from accessing the page entirely.
I've searched for info on these boards and am not sure whether I should be using an asterisk after the "?".
I'm currently using the following robots.txt file:
User-agent: *
Disallow: /review_form.asp?
Is this correct, or should I be using something like:
User-agent: *
Disallow: /review_form.asp?*
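In case it helps anyone checking their own rules: here is a rough sanity check of the first variant using Python's standard-library urllib.robotparser. Note that robotparser implements the original robots.txt convention (simple prefix matching, no wildcard support), so it tests the intent of the rule rather than Google's extended wildcard handling; the example.com domain is just a placeholder.

```python
from urllib import robotparser

# The first robots.txt variant from above (no trailing asterisk)
robots_txt = """\
User-agent: *
Disallow: /review_form.asp?
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Query-string variants of the form page should be disallowed
print(rp.can_fetch("*", "https://example.com/review_form.asp?model1"))  # False
# Unrelated pages should remain crawlable
print(rp.can_fetch("*", "https://example.com/index.asp"))  # True
```

Since Disallow rules are prefix matches, everything beginning with the disallowed path is blocked, which suggests the trailing asterisk may be redundant, but I'd appreciate confirmation from someone who knows how Googlebot treats it.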
Any help will be appreciated.
Regards to all.
Chris