Forum Moderators: goodroi
url: /*1/file/*2/
Taking into account that:
- "*1" can be practically anything, and many different values are possible
- "*2" will be short, although it too can take many different values.
What would be the best way to prevent indexing by bots?
I can't use the robots meta tag, as these are images and other non-HTML files.
There's no pretty way to deal with the different values of *1, though. I just write out each case. If you're gonna end up with lots of lines of Disallows, you could probably write a script that will traverse your directory structure and output a robots.txt for you.
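The traversal script mentioned above could look something like this minimal sketch: it walks the document root, finds every directory that contains a `file/` subdirectory, and emits a `Disallow` line for each. The function and parameter names (`build_robots_txt`, `docroot`, `subdir`) are illustrative, not from this thread. Note that some major crawlers (e.g. Googlebot) also support a nonstandard `*` wildcard in `Disallow` patterns, which could replace the whole generated list, but it isn't part of the original robots.txt standard and not every bot honors it.

```python
import os

def build_robots_txt(docroot, subdir="file"):
    """Sketch: emit one Disallow line per <dir>/file/ path under docroot.

    docroot and subdir are hypothetical names for illustration.
    """
    lines = ["User-agent: *"]
    for name in sorted(os.listdir(docroot)):
        # Only block directories that actually contain the file/ subdir.
        if os.path.isdir(os.path.join(docroot, name, subdir)):
            lines.append("Disallow: /%s/%s/" % (name, subdir))
    return "\n".join(lines) + "\n"
```

Run it from a cron job (or after each content change) and write the result to your web root's robots.txt.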
Something that needs improvement if you ask me :)
oh well, I'll live with it.
I'll probably make a script that generates the robots.txt on the fly when it's requested; a simple check in the database will give me the answer :)
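The on-the-fly approach could be sketched as a tiny WSGI handler, assuming a hypothetical `fetch_dynamic_dirs()` standing in for the database check (neither name comes from this thread; swap in your real query):

```python
def fetch_dynamic_dirs():
    # Hypothetical placeholder for the database check described above;
    # replace with a real query that returns the current *1 values.
    return ["example-dir"]

def application(environ, start_response):
    # Build robots.txt fresh on each request, so it tracks the database.
    body = "User-agent: *\n"
    for d in fetch_dynamic_dirs():
        body += "Disallow: /%s/file/\n" % d
    payload = body.encode("ascii")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(payload)))])
    return [payload]
```

You'd then map the `/robots.txt` URL to this handler in the server config so crawlers never see the static file.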
*1 changes so often that it would be too much work to update robots.txt by hand on every change.
Thanks for the input!