This would be very useful if it had some form of admin page where you could specify a user-agent and check a box for allow or disallow, or specify directories that are to be protected with robots.txt.
To be honest I don't even know if this is possible, but it would be a very useful tool for the web developer.
Both are certainly possible, and it shouldn't be hard to write.
For the first case I think it's probably best to just learn the theory needed and do it by hand, which gives maximum control and minimal bugs :) ... at least if it's just one or a few files that need to be edited. It's not much to learn; a day of reading should teach you all you need to know.
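For reference, the syntax itself is simple: robots.txt is just blocks of User-agent lines followed by Disallow rules, one block per spider (the directory names below are only placeholders):

# keep all spiders out of /admin/, and additionally keep Googlebot out of /drafts/
User-agent: Googlebot
Disallow: /admin/
Disallow: /drafts/

User-agent: *
Disallow: /admin/

Keep in mind robots.txt is purely advisory; only well-behaved spiders read and obey it.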
The second case could save some time, i.e. blocking rogue spiders automatically.
I know of one script that does part of that: it can automatically block an IP by adding it to an .htaccess file.
It's called Apache Guardian, from xav.com. The feature is called 'blacklist'. It's not mentioned as one of the main features of the script on the site, I guess because blocking IPs automatically has some drawbacks: blocking innocent IPs, blocking proxy servers, blocking spiders you do not want to block, .htaccess files growing very large...
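For what it's worth, the lines such a script appends are just the standard Apache access rules you could also add by hand; something like this in .htaccess blocks two addresses and lets everyone else through (the IPs are placeholders):

# deny a couple of addresses, allow everything else
order allow,deny
allow from all
deny from 192.0.2.10
deny from 192.0.2.0/24

You can see how a file like that gets long fast if a script keeps appending to it.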
This script is not 'industrial strength', though.
If you know some Perl you could probably tweak it to do exactly what you need in your case: clear the .htaccess files automatically as needed, build in some better proxy detection, write to robots.txt instead of an .htaccess file, etc.
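Just to sketch that last idea (this is not Apache Guardian's code, and the path, user-agent and directory below are made up), appending a Disallow rule to robots.txt from Perl is only a few lines:

#!/usr/bin/perl
use strict;
use warnings;

# hypothetical helper: add a Disallow rule for one user-agent to robots.txt
my $robots = '/var/www/html/robots.txt';

sub block_agent {
    my ($agent, $dir) = @_;
    open my $fh, '>>', $robots or die "can't append to $robots: $!";
    print $fh "\nUser-agent: $agent\nDisallow: $dir\n";
    close $fh;
}

block_agent('BadSpider', '/');    # ask 'BadSpider' to stay out of the whole site

Of course that only helps with spiders that actually honour robots.txt; a truly rogue spider still needs the .htaccess ban.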
Maybe others know of better solutions. I don't really know, though; it's just something I thought about.