I meant "password protected" as in a "401 status code" response, but I would be interested in knowing the answers for both...
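For reference, here is a minimal sketch of the kind of setup that produces that 401: HTTP Basic Auth via an Apache .htaccess file. The paths and realm name are hypothetical, just for illustration:

```apache
# Hypothetical .htaccess in the protected folder
# Any request without valid credentials gets a 401 Unauthorized response
AuthType Basic
AuthName "Members Area"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Googlebot never has credentials, so every fetch of such a page ends in a 401.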
If a page comes with an X-Robots-Tag: noindex HTTP header (as some of my non-HTML pages do), will this directive also be honored for HTML pages that don't have a meta robots tag?
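For non-HTML pages, that directive can be sent as an X-Robots-Tag HTTP header set at the server level. A sketch assuming Apache with mod_headers enabled (the file pattern is just an example):

```apache
# Hypothetical: send a noindex directive for PDF files via the HTTP header,
# since PDFs cannot carry a meta robots tag
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

The same header works on HTML responses too, so it can substitute for the meta tag entirely.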
Ok, so let's say you wanted to remove these URLs that are in a subfolder. In GWT, would you use the following syntax?
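If you go the robots.txt route for a whole subfolder, the standard syntax looks like this (the folder name here is hypothetical):

```
User-agent: *
Disallow: /subfolder/
```

If I remember right, GWT's URL removal tool also lets you submit the directory URL and request removal of everything under it, rather than entering each URL one by one.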
As far as I know, Google does not index any 4xx status code responses.
My question: is there any differential treatment of password-protected pages vs. robots.txt-excluded pages in Google SERPs? We know that robots.txt-excluded pages show up in SERPs as link-only stubs, as described by the OP. But what about password-protected pages? If they don't show up in SERPs at all, why the differential treatment?
Hence pages that are roboted out may still show in SERPs: Google was only told not to crawl them, so it does not know about any other directive or response code that might result in different handling.
Why aren't they showing the password-protected URLs in the SERPs with a boilerplate description, like they do for roboted-out URLs?
But doesn't the same hold true for password-protected pages? By password-protecting a page, we are telling them they are not allowed to crawl it.
Note that is a subfolder.
Hence it is probably not good for Google to have such a page in the index: if a click from the SERPs leads to a password prompt, it is most likely a bad experience for visitors coming from the SERPs.