Suppose I want to exclude the subfolder "/bad" in my robots.txt file, and suppose this subfolder can occur on my site in different ways.
I have put in:
in my robots.txt file, thinking it would disallow bots from all those instances above. But it doesn't seem to be working. Looking at my log files, Googlebot continues to crawl the disallowed folders/pages. I made this change almost a month ago, so I'm sure Googlebot must have refreshed its cached copy of my robots.txt by now.
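For reference, Google's crawler supports `*` wildcards in Disallow patterns (this is a Google extension, not part of the original robots.txt standard). The actual Disallow lines were not shown in the post, but a pattern covering "/bad" wherever it appears in the path might look something like this sketch:

```
User-agent: Googlebot
Disallow: /bad/
Disallow: /*/bad/
```

Note that wildcard rules are ignored by crawlers that only implement the original standard, which matches plain path prefixes.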
Am I doing something wrong?
The URLs are dynamically generated, so there is no way for me to produce a complete list of Disallow statements as you suggested.
So, for example:
I want this to be disallowed, but the "variable" part of the URL could literally be anything; it's a unique ID coming from a database.
If it helps you to understand my problem, I'll give you a specific example:
So, in this case, the "595" is the variable, and "/language" is the subfolder that I want to disallow. There is no way for me to create a zillion Disallow statements to cover every possible products_id in my system.
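To sanity-check whether a wildcard Disallow pattern would cover every products_id, one option is to test the pattern against sample URLs offline. The sketch below is a simplified reimplementation of Google-style matching (`*` matches any run of characters, a trailing `$` anchors the end); the example paths are assumptions modeled on the "595" / "/language" example, since the full URLs were not shown in the thread:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check whether a Google-style robots.txt Disallow pattern matches a path.

    Simplified sketch of Google's matching rules: '*' matches any sequence
    of characters, a trailing '$' anchors the match at the end of the path,
    and otherwise the pattern matches as a prefix.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Hypothetical paths (the real URL structure was not shown in full):
print(robots_pattern_matches("/*/language/", "/product/595/language/page"))  # True
print(robots_pattern_matches("/*/language/", "/product/595/"))               # False
```

A single `Disallow: /*/language/` line would then cover every products_id, rather than one line per ID.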
Hope that makes sense?
As you may be able to tell, the URL is actually generated through an Apache mod_rewrite rule. It is normally:
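The actual rewrite rule and underlying URL were not included in the post. Purely as an illustration, a mod_rewrite rule mapping a friendly path to a dynamic script might look like the following sketch; the script name `product.php` and the parameter names are assumptions, not taken from the thread:

```
# Hypothetical rule; the real one was not shown in the post.
RewriteEngine On
# Map /product/595/language/... to the underlying dynamic script
RewriteRule ^product/([0-9]+)/language/(.*)$ product.php?products_id=$1&language=$2 [L,QSA]
```

Since robots.txt matches against the public (rewritten) URL that Googlebot requests, the Disallow patterns need to target these friendly paths, not the underlying query-string form.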