Forum Moderators: goodroi
User-agent: *
Disallow: /
Disallow: /private/
Disallow: /user/
Does the line "Disallow: /" override the rest and block crawlers from accessing the entire site? Or does listing specific subdirectories override the top line and enable crawlers to access the rest of the site?
The Disallow syntax of the Robots Exclusion Protocol matches URL paths by prefix, left to right. A more specific line never overrides a broader one; each Disallow is just another prefix to test.
In your example the second and third Disallow lines are redundant: "Disallow: /" already matches every path, so it blocks the entire site, including everything in and under the root directory.
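As a sanity check, you can run the original file through Python's standard-library urllib.robotparser, which implements the same prefix matching (the example.com URLs below are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the question.
rules = """\
User-agent: *
Disallow: /
Disallow: /private/
Disallow: /user/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# "Disallow: /" is a prefix of every path, so nothing is fetchable.
print(parser.can_fetch("*", "https://example.com/"))         # False
print(parser.can_fetch("*", "https://example.com/public/"))  # False
```

Both checks come back False, confirming that "Disallow: /" on its own blocks the whole site.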
The default, when no Disallow matches, is to allow everything. If instead you want to whitelist a specific bot and keep all other crawlers out of those two directories:
User-agent: specific-bot
Disallow:
User-agent: *
Disallow: /private/
Disallow: /user/
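You can verify that this version behaves as intended with the same standard-library parser (again, "specific-bot" and the example.com URLs are placeholders from the example above):

```python
from urllib.robotparser import RobotFileParser

# The whitelist version: specific-bot may crawl everything,
# all other bots are kept out of /private/ and /user/.
rules = """\
User-agent: specific-bot
Disallow:

User-agent: *
Disallow: /private/
Disallow: /user/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow means "allow everything" for that agent.
print(parser.can_fetch("specific-bot", "https://example.com/private/x"))  # True
# Other bots fall through to the * group and its two prefixes.
print(parser.can_fetch("*", "https://example.com/private/x"))             # False
print(parser.can_fetch("*", "https://example.com/blog/"))                 # True
```

Note the blank line between the two groups: each User-agent record is a separate block, and a bot uses the most specific record that names it, falling back to * otherwise.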