Forum Moderators: goodroi

Message Too Old, No Replies

How to prevent search pages being soft 404

         

meelosh

7:20 am on Feb 16, 2016 (gmt 0)

10+ Year Member



On one of our sites we recently had a big spike in soft 404 pages. All of these pages are "search" pages; an example would be mysite.com/search/page9. All search pages are currently set to noindex, follow, and this is set up with the Yoast plugin. My question here is: would it be best to add the below to robots.txt

User-agent: *
Disallow: /?s=
Disallow: /search/

to prevent these pages from being crawled? Really, I just want to know the safest way to stop these showing up in my Webmaster Tools.

thank you

oligalma

12:32 am on Feb 18, 2016 (gmt 0)



According to the following link [builtvisible.com...]

Disallow: /search?s=* stops any crawler from crawling search parameter pages. Maybe this is what you are looking for.

meelosh

7:15 am on Feb 18, 2016 (gmt 0)

10+ Year Member



Thanks oligalma. Any other suggestions are welcome, especially how to get these empty search pages to return a 404 instead of a 200, if possible. Thank you.

lucy24

7:48 pm on Feb 18, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



/search/page9
...
yoast plugin

You may want to take the question next door to the WordPress subforum and see if someone knows how to return a blanket 404 for invalid search URLs. Do any of your site searches really lead to 9 pages of results?
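One possible approach, sketched here as an assumption (this would go in a theme's functions.php or a small plugin, and has not been checked against how Yoast handles search templates): hook WordPress's template_redirect action and force a 404 when a search query returns no results, so empty search pages stop answering with a 200.

add_action( 'template_redirect', function () {
    // Only act on search result pages that found no posts.
    if ( is_search() && ! have_posts() ) {
        global $wp_query;
        $wp_query->set_404();   // switch the main query into 404 state
        status_header( 404 );   // send the 404 HTTP status code
        nocache_headers();      // keep the error response out of caches
    }
} );

Searches that do return results are untouched; only genuinely empty result pages would 404.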

User-agent: *

This will only work if you don't have a separate category for the Googlebot. If you do, any Google-specific directives have to go there.
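To illustrate (using the same paths from the original question as an assumption): a crawler obeys only the most specific user-agent group that matches it, so if a Googlebot group exists, the disallows have to be repeated there — Googlebot will ignore the * group entirely.

User-agent: Googlebot
Disallow: /?s=
Disallow: /search/

User-agent: *
Disallow: /?s=
Disallow: /search/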