Forum Moderators: goodroi
Do I have to disallow all of them 1 by 1, i.e.
User-agent: bot
Disallow: /folder1/
User-agent: bot
Disallow: /folder2/
User-agent: bot
Disallow: /folder3/
User-agent: bot
Disallow: /folder4/
etc., or is there an easier way?
I only need it to check www.mydomain.com/midi and be excluded from every other place.
TIA,
Javi
It is imperative to keep in mind that robots.txt is only a SUGGESTION to bots; compliance is voluntary.
Many bad bots don't even read robots.txt.
With the above in mind...
User-agent: *
Disallow: /
denies all bots.
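As a quick sanity check, the deny-all record above can be exercised with Python's standard-library robots.txt parser (the bot name and URLs here are just placeholders):

```python
from urllib import robotparser

# Parse the two-line deny-all robots.txt discussed above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every path is blocked for every (compliant) user agent.
print(rp.can_fetch("SomeBot", "http://www.mydomain.com/midi"))      # False
print(rp.can_fetch("SomeBot", "http://www.mydomain.com/folder1/"))  # False
```

Of course, this only tells you what a well-behaved parser would do; as noted above, bad bots ignore robots.txt entirely.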
I seem to recall that Allow is not part of the original robots.txt protocol.
So your alternatives are either to deny all or to list each folder "1 by 1"; either should work.
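If you do go the "1 by 1" route, note that a single record can carry several Disallow lines under one User-agent line, so there is no need to repeat the User-agent for each folder. Something like:

```
User-agent: *
Disallow: /folder1/
Disallow: /folder2/
Disallow: /folder3/
Disallow: /folder4/
```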
It is better to disallow "/folder", as some search engines assume they are allowed to try "/folder" if you only disallow "/folder/", especially if you have internal or incoming links pointing to "/folder".
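That trailing-slash behavior shows up in Python's stdlib parser too (bot name and domain are placeholders). One caveat: disallowing "/folder" is a prefix match, so it would also catch paths like "/folder2":

```python
from urllib import robotparser

# With only "/folder/" disallowed, the bare "/folder" URL slips through.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /folder/"])
print(rp.can_fetch("SomeBot", "http://www.mydomain.com/folder"))    # True
print(rp.can_fetch("SomeBot", "http://www.mydomain.com/folder/"))   # False

# Disallowing "/folder" (no trailing slash) covers both forms,
# but as a prefix rule it also blocks "/folder2", "/foldername", etc.
rp2 = robotparser.RobotFileParser()
rp2.parse(["User-agent: *", "Disallow: /folder"])
print(rp2.can_fetch("SomeBot", "http://www.mydomain.com/folder"))   # False
print(rp2.can_fetch("SomeBot", "http://www.mydomain.com/folder/"))  # False
print(rp2.can_fetch("SomeBot", "http://www.mydomain.com/folder2"))  # False
```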
Weesnich