I have approximately 15,000 folders like this, and I want to block file.php in every folder. What is the best way to block it? I don't want to make the robots.txt file too lengthy by putting all the URLs inside it. Is there any other quick solution?
help appreciated in advance :-)
[fixed confusing typo in title]
[edited by: goodroi at 1:54 pm (utc) on Aug. 13, 2007]
If yes, then this would be really helpful for me.
Please remember that most smaller search engines do not support this in robots.txt. Also if you want to test other combinations for Googlebot you can go to Google's Webmaster Central and use their robots.txt analysis tool.
... and that only works for URLs in the ROOT, i.e. URLs that BEGIN with that path.
You need the * to make it work for folders.
This rule MUST go in the User-agent: Googlebot section.
Other bots do not understand the *.
If you have a User-agent: Googlebot section, then ALL of your rules for Googlebot must go in that section as Googlebot will then completely IGNORE the User-agent: * section.
You do this even if it means duplicating a lot of stuff into both sections.
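Putting the advice above together, a robots.txt along these lines would do it, assuming the 15,000 folders sit directly under the document root (the `/private/` rule is just a hypothetical placeholder for whatever rules you already have):

```
# Googlebot understands the * wildcard, so one rule covers every folder.
# Googlebot will IGNORE the "User-agent: *" section entirely once this
# section exists, so repeat any rules from it here as well.
User-agent: Googlebot
Disallow: /*/file.php
Disallow: /private/

# All other bots -- most do not support wildcards, so only plain
# path-prefix rules belong here.
User-agent: *
Disallow: /private/
```

You can verify how Googlebot interprets this against sample URLs with the robots.txt analysis tool in Google's Webmaster Central, as mentioned above.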