Forum Moderators: goodroi
User-Agent: some bot
Disallow: /one/
Disallow: /two/
Disallow: /tree/
Disallow: /for/
Disallow: /five/
These are not nested (tree) directories. They all reside in the root directory and each has its own structure,
i.e. the public HTML root contains all 5 directories.
I apologize, but my caffeine is not working yet and my brain is still a little slow. Are you trying to block files like:
http://www.example.com/one/
http://www.example.com/one/index.html
http://www.example.com/one/abc.html
or
http://www.example.com/level1/one/
http://www.example.com/level1/level2/one/abc.html
User-Agent: some bot
Disallow: /one/1.2/1.3/
Disallow: /two/
Disallow: /tree/
Disallow: /for/
Disallow: /five/scripts/
Disallow: /five/images/
Disallow: /five/includes/
Disallowing the /five/ subdirectories like that still leaves the root of /five/ and all its other subdirectories open for spidering. Likewise, disallowing /one/1.2/1.3/ only blocks that directory and leaves everything else in /one/ open.
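You can verify that prefix behavior with Python's standard `urllib.robotparser`. This is just an illustration of the second robots.txt example above; the host name and bot name are taken from the thread, not from a real site.

```python
# Check which URLs the robots.txt rules above actually block.
from urllib.robotparser import RobotFileParser

rules = """\
User-Agent: some bot
Disallow: /one/1.2/1.3/
Disallow: /two/
Disallow: /five/scripts/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "http://www.example.com"
# The root of /five/ stays open; only the listed subdirectory is blocked.
print(rp.can_fetch("some bot", base + "/five/"))              # True
print(rp.can_fetch("some bot", base + "/five/scripts/x.js"))  # False
# /one/ itself stays open; only the /one/1.2/1.3/ branch is blocked.
print(rp.can_fetch("some bot", base + "/one/index.html"))     # True
print(rp.can_fetch("some bot", base + "/one/1.2/1.3/a.html")) # False
```

Each Disallow line is a simple path prefix, so anything not under a listed prefix remains crawlable.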
The last time I did not limit the biggest American search engines' access, I got torn apart by unwanted traffic and it cost me a fortune.
So what I have done is expose the index page without the geoip block, therefore allowing all search engines to access it, while keeping the rest of the site in other directories with the geoip block in place.
This way only the first page shows up on search engines, and my notice that visitors are out of my service area only needs to be displayed on the front page.
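The logic described above can be sketched roughly as follows. This is only an illustration, not the poster's actual setup: `country_of()` is a hypothetical stand-in for a real GeoIP lookup (e.g. a MaxMind database query), and the service-area country codes and sample IPs are invented for the example.

```python
# Minimal sketch: the front page is served to everyone, while every
# other path is subject to the geoip block.
SERVICE_AREA = {"US", "CA"}  # assumed service countries, for illustration

def country_of(ip):
    # Hypothetical lookup table standing in for a real GeoIP database.
    return {"203.0.113.5": "DE", "198.51.100.7": "US"}.get(ip, "??")

def allowed(path, ip):
    """Front page is always allowed; everything else needs an in-area IP."""
    if path == "/":
        return True
    return country_of(ip) in SERVICE_AREA

print(allowed("/", "203.0.113.5"))            # True: index is open to all
print(allowed("/five/page.html", "203.0.113.5"))   # False: out of area
print(allowed("/five/page.html", "198.51.100.7"))  # True: in area
```

Since search engine crawlers only ever see the open front page, only that page gets indexed, which matches the effect described above.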