
Forum Moderators: goodroi


Will this work?

Robots.txt

     
12:12 am on Mar 1, 2009 (gmt 0)

5+ Year Member



If I want to block search engines from a particular directory, can I do this?

Let's say I want the www.example.com/widgets directory to be indexed, and also www.example.com/blue, but not www.example.com/blue/widgets. Is this acceptable?


User-agent: *
Disallow: /blue/widgets/

The reason I ask is that I can't find an example that has two directories together. I don't want to block www.example.com/widgets or www.example.com/blue at all, just the combination.

Will this work?
Thanks.

12:19 am on Mar 1, 2009 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Robots.txt uses prefix-matching, so your directive will do what you say you want -- assuming that the rest of the file (if any) is structured properly.

Jim
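
For anyone who wants to sanity-check this locally, here is a small sketch using Python's standard-library robots.txt parser (urllib.robotparser), which applies the same left-anchored prefix match; the example.com URLs are just the placeholders from this thread:

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the exact rules proposed above, without fetching anything.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /blue/widgets/",
])

# Only the combined path is blocked; each directory on its own stays crawlable.
print(rp.can_fetch("*", "http://www.example.com/widgets/"))       # True
print(rp.can_fetch("*", "http://www.example.com/blue/"))          # True
print(rp.can_fetch("*", "http://www.example.com/blue/widgets/"))  # False
```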

12:24 am on Mar 1, 2009 (gmt 0)

5+ Year Member



OK, just making sure, because I couldn't find any example that had two directories on one line.

10:08 pm on Mar 1, 2009 (gmt 0)

5+ Year Member



A followup question:

Would not

User-agent: *
Disallow: /blue/wi

accomplish the same thing as CWebguy desires -- with the added 'feature' of not revealing the actual sub-directory name(s) to the more scurrilous robots and spiders out there?

9:13 am on Mar 16, 2009 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



It would.

With robots.txt you are disallowing all folder names that match the pattern from the left.
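
A quick sketch of that left-anchored match, again with Python's urllib.robotparser. Note the trade-off of truncating the prefix: it also blocks any other path that happens to start with /blue/wi (the /blue/winter/ folder here is purely hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The truncated prefix suggested earlier in the thread.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /blue/wi",
])

print(rp.can_fetch("*", "http://www.example.com/blue/widgets/"))  # False
print(rp.can_fetch("*", "http://www.example.com/blue/winter/"))   # False (also matched)
print(rp.can_fetch("*", "http://www.example.com/blue/"))          # True
```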

4:25 am on Mar 19, 2009 (gmt 0)

5+ Year Member



I think it would work.
 
