| Welcome to WebmasterWorld |
|disallowing .asp pages|
| 6:51 pm on Dec 8, 2010 (gmt 0)|
I have an .asp page which I want to block, as Google is listing it as a soft 404 with duplicate content. The page is referenced, for example, as:
review_form.asp?model3..... and so on.
Each variation is being flagged by Google as a duplicate, so I want to block robots from accessing this page entirely.
I've searched for info on these boards and am not sure if I should be using an asterisk after the "?".
I'm currently using the following robots file:
Is this correct, or should I be using something like:
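(The two files being compared would presumably differ only in the trailing wildcard; the paths below are illustrative, not the poster's actual file:)

```
# Variant 1: plain path prefix
User-agent: *
Disallow: /review_form.asp

# Variant 2: explicit wildcard after the query separator
User-agent: *
Disallow: /review_form.asp?*
```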
Any help will be appreciated.
Regards to all.
| 9:28 am on Dec 9, 2010 (gmt 0)|
robots.txt patterns are matched left-to-right as path prefixes, so your current usage is sufficient.
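the prefix matching can be demonstrated with Python's standard urllib.robotparser (hostname and query values here are made up):

```python
from urllib import robotparser

# A minimal robots.txt with a plain prefix rule -- no wildcard needed.
rules = [
    "User-agent: *",
    "Disallow: /review_form.asp",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The rule matches any URL whose path begins with /review_form.asp,
# so every query-string variation is blocked as well.
print(rp.can_fetch("*", "http://www.example.com/review_form.asp?model3=foo"))  # False
print(rp.can_fetch("*", "http://www.example.com/review_form.asp"))             # False
print(rp.can_fetch("*", "http://www.example.com/index.asp"))                   # True
```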
however, excluding a URL with robots.txt will not prevent that (snippetless) URL from being indexed, nor from accumulating PageRank.
depending on your particular application, it may be better to use either a meta robots noindex or a 301 redirect to the canonical URL.
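for reference, the two alternatives look roughly like this (a sketch only; the canonical URL below is a placeholder, and the noindex tag only works if robots are NOT blocked from crawling the page):

```
<!-- Option 1: in the page <head> -- let robots crawl it, but keep it out of the index -->
<meta name="robots" content="noindex">

<%
' Option 2 (classic ASP / VBScript): 301 the duplicate to the canonical URL
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/review_form.asp"
Response.End
%>
```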
| 4:25 pm on Dec 9, 2010 (gmt 0)|
Many thanks phranque - I've used the first robots file, and the number of duplicate pages listed in Google Webmaster Tools has dropped from 180 to around 120, so it appears the rule is working fine.
© Webmaster World 1996-2014 all rights reserved