
disallowing .asp pages


6:51 pm on Dec 8, 2010 (gmt 0)

New User

5+ Year Member

joined:Feb 8, 2010
posts: 20
votes: 0


I have an .asp page that I wish to block, as Google is listing it as a soft 404 with duplicate content. The page is referenced, for example, as:

review_form.asp?model3..... and so on.

Each page is being flagged by Google as duplicate, so I wish to block robots from accessing this page entirely.

I've searched for info on these boards and am not sure if I should be using an asterisk after the "?".

I'm currently using the following robots.txt file:

User-agent: *
Disallow: /review_form.asp?

Is this correct, or should I be using something like:

User-agent: *
Disallow: /review_form.asp?*

Any help will be appreciated.

Regards to all.

9:28 am on Dec 9, 2010 (gmt 0)


phranque

WebmasterWorld Administrator

10+ Year Member

joined:Aug 10, 2004
votes: 115

The robots.txt pattern matches left-to-right (simple prefix matching), so your current rule is sufficient; the trailing asterisk adds nothing.

However, excluding a URL with robots.txt will not prevent the snippetless URL from being indexed, nor stop it from accumulating PR.

Depending on your particular application, it may be better to use a meta robots noindex or a 301 redirect to the canonical URL. (Note that a meta noindex only works if the page remains crawlable, so you would have to remove the robots.txt block in that case.)
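If you want to sanity-check the prefix matching yourself, Python's standard-library urllib.robotparser implements the same left-to-right prefix rule as the original robots.txt spec (it does not support Google's * and $ wildcard extensions, which is another reason the trailing asterisk is unneeded here). The domain and query string below are placeholders, not from the thread:

```python
import urllib.robotparser

# The rule exactly as posted in the question.
robots_txt = """\
User-agent: *
Disallow: /review_form.asp?
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Any URL beginning with the disallowed prefix is blocked for all robots...
print(rp.can_fetch("*", "https://example.com/review_form.asp?model3=abc"))  # False

# ...while unrelated pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/index.asp"))  # True
```

Keep in mind this only approximates Googlebot's behavior; for an authoritative check, use the robots.txt tester in Google's webmaster console.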

4:25 pm on Dec 9, 2010 (gmt 0)

New User

5+ Year Member

joined:Feb 8, 2010
posts: 20
votes: 0

Many thanks, phranque. I've used the first robots.txt file, and the number of duplicate pages listed in Google's webmaster tools has dropped from 180 to around 120, so the rule appears to be working fine.

Best regards

