
Forum Moderators: goodroi


Does this work?

Disallow: /somepath/*/file/*



11:41 am on Mar 11, 2004 (gmt 0)


User-agent: *
Disallow: /*/file/*/

Does this work? And is it allowed?


5:31 am on Mar 12, 2004 (gmt 0)


Maybe with a robot that was set up to use patterns like that. But normal robots won't understand it.
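You can see the prefix-matching behaviour for yourself with Python's standard-library `urllib.robotparser`, which implements the original robots.txt rules: the `*` in mid-path is treated as a literal character, not a wildcard, so the rule never matches a real URL. (The URLs below are just examples.)

```python
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /*/file/
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The wildcard line is read as the literal prefix "/*/file/",
# so a prefix-matching parser does NOT block this URL:
print(rp.can_fetch("*", "http://example.com/abc/file/pic.jpg"))  # True

# A plain prefix rule works as expected:
print(rp.can_fetch("*", "http://example.com/private/pic.jpg"))   # False
```

So a robot that only implements the original spec will happily crawl everything the wildcard rule was meant to block.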


7:57 am on Mar 12, 2004 (gmt 0)


Then what would be the best way to accomplish this?

url: /*1/file/*2/

Taking into account that:

- "*1" can be practically anything, and a lot of different values are possible
- "*2" will be short, although it can also take a lot of different values.

What would be the best way to prevent indexing by bots?
I can't use the meta-tag, as these are images and other files (non-html).


5:49 am on Mar 13, 2004 (gmt 0)


Well, if you're going to disallow everything within file, you can just shorten it to /*1/file/.

There's no pretty way to deal with the different values of *1, though. I just write out each case. If you're gonna end up with lots of lines of Disallows, you could probably write a script that will traverse your directory structure and output a robots.txt for you.
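That generator script can be very small. Here is a sketch, assuming each top-level directory under the document root may contain a `file` subdirectory whose contents should stay unindexed (the `DOCROOT` path is a made-up example):

```python
import os

DOCROOT = "/var/www/html"  # assumption: replace with your document root

def build_robots_txt(docroot):
    """Emit one literal Disallow line per <dir>/file/ found on disk."""
    lines = ["User-agent: *"]
    for name in sorted(os.listdir(docroot)):
        if os.path.isdir(os.path.join(docroot, name, "file")):
            lines.append("Disallow: /%s/file/" % name)
    return "\n".join(lines) + "\n"

if os.path.isdir(DOCROOT):
    print(build_robots_txt(DOCROOT))
```

Run it whenever the directory structure changes and save the output as robots.txt.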


8:51 am on Mar 13, 2004 (gmt 0)


hmmm, too bad

Something that needs improvement if you ask me :)

oh well, I'll live with it.

I'll probably make a script that generates the robots.txt on the fly when it's requested; a simple check in the database will give me the answer :)
*1 changes so often that it's too much work to update the file by hand on every change.
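That on-the-fly approach can be sketched in a few lines. This is only an illustration: the table and column names (`sections`, `slug`) are made-up stand-ins for whatever holds the changing *1 values, and an in-memory SQLite database stands in for the real one.

```python
import sqlite3

def robots_txt_from_db(conn):
    """Build a robots.txt body with one Disallow line per stored *1 value."""
    lines = ["User-agent: *"]
    for (slug,) in conn.execute("SELECT slug FROM sections ORDER BY slug"):
        lines.append("Disallow: /%s/file/" % slug)
    return "\n".join(lines) + "\n"

# In-memory demo standing in for the real database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sections (slug TEXT)")
conn.executemany("INSERT INTO sections VALUES (?)", [("news",), ("products",)])
print(robots_txt_from_db(conn))
```

Serve the returned string as `/robots.txt` with a `text/plain` content type and it stays current with the database automatically.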

tnx for the input!


5:36 pm on Mar 13, 2004 (gmt 0)


Yeah, it's a pain in the you know what. :-)

You're welcome, DoppyNL.

