Forum Moderators: goodroi


Is there a way to do this?

inverted robots


Steviebone

2:11 pm on Mar 4, 2015 (gmt 0)

10+ Year Member



I am trying to reconfigure a robots.txt file. I know this approach may be impossible, but I want to exclude everything except certain specified directories (instead of allowing everything except certain paths/files).

Consider this block:

User-agent: *
Disallow: /
Allow: /Dir1/
Allow: /Dir2/
Allow: /Dir3/
Allow: /Dir4/


This works except for one fatal flaw. It blocks the use of the default home page referenced by the url domain name alone, such as:

www.domainname.com


Since the 'index.htm' (or whatever default file the web server returns) is implied rather than explicit, the rule fails for the domain name by itself. I don't care much for the idea of allowing everything by default and then having to hunt down everything I don't want indexed/crawled. Whoever came up with this scheme must have been writing crawlers.

I know you can allow subdirs after a disallow statement but how then can you handle anything in the root? Hell, that's the one place I want to limit. It seems like it would be much simpler to be able to just list areas of a site you want crawled, not the other way around. Am I crazy? Or is this just stupid?

Any workarounds I can't see?
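One workaround, for crawlers that support both "Allow" and the "$" end-of-URL wildcard (Googlebot and Bingbot document both; neither is part of the original robots.txt standard, so other bots may ignore them), is an Allow rule that matches only the bare root URL:

User-agent: *
Disallow: /
Allow: /$
Allow: /Dir1/
Allow: /Dir2/
Allow: /Dir3/
Allow: /Dir4/

"Allow: /$" matches the root URL and nothing else, so www.domainname.com/ stays crawlable while everything outside the listed directories remains blocked.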

Propools

2:19 pm on Mar 4, 2015 (gmt 0)

10+ Year Member



The "Disallow: /" blocks bots from going anywhere on the site. The "Allow:" statements are useless. The only thing that should be in the Robots.txt file is what is NOT allowed.

From - [robotstxt.org...]
To exclude all files except one

This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
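The approach the quote describes — gather everything to be blocked into a single directory — might look like this (the "stuff" directory name comes from the quote; the rest of the layout is hypothetical):

User-agent: *
Disallow: /stuff/

Everything outside /stuff/, including the one file left at the level above it, stays crawlable, with no "Allow" rule needed.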

not2easy

2:33 pm on Mar 4, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Google does recognize "Allow:" but only to modify a prior "Disallow:" setting. Other bots may be left out.
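A minimal sketch of that behavior, assuming Google's documented rule that the most specific (longest) matching path wins:

User-agent: *
Disallow: /
Allow: /Dir1/

For a URL like /Dir1/page.htm, "Allow: /Dir1/" is the longer match, so Googlebot crawls it; every other path matches only "Disallow: /" and is blocked. Crawlers that implement only the original standard ignore "Allow:" entirely and see a site-wide block.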