Forum Moderators: goodroi
Please help me correct these errors if someone can .... I created this using a tool.
User-Agent: *
Disallow: /cgi-bin/
Disallow: /pic/
Disallow: /images/
Disallow: /netforward/
Disallow: /reciprocal/
Disallow: /autorank/
Disallow: /members/
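If you want to check how a standards-compliant parser reads a block like the one above, Python's standard library ships `urllib.robotparser`. This is just a sketch; the agent name and paths are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# The rules from the post above, pasted as-is.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /pic/
Disallow: /images/
Disallow: /netforward/
Disallow: /reciprocal/
Disallow: /autorank/
Disallow: /members/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Anything under a disallowed directory is blocked for every agent;
# everything else stays fetchable.
print(parser.can_fetch("SomeBot", "/images/logo.gif"))  # False
print(parser.can_fetch("SomeBot", "/index.html"))       # True
```

Handy for testing a file before you upload it, since a parser won't silently "guess" the way a human reader might.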
[edited by: oilman at 11:55 pm (utc) on Mar. 13, 2004]
[edit reason] no urls please [/edit]
Second, what you've posted doesn't match the actual robots file.
Try User-agent: *
Instead of User-Agent: *
Delete this line:
Disallow:User-agent: *
I may be wrong, but I think you need to dump these three lines also:
Disallow: *.gif
Disallow: *.jpg
Disallow: *.bmp
A question now arises .... what is Mozilla? Is this a robot or what?
The capital "A" in User-Agent was definitely a problem. I guess I missed that.
AFAIK, the use of wildcards in Disallow lines is only allowed as an extension to the robots.txt standard. Googlebot, for example, has an extension that makes Disallow lines like this legal:
Disallow: /*.gif$
webapache:
The / at the end of each directory is not allowed, as it is considered a second entry.
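To keep such patterns from confusing crawlers that don't support them, the extension rules can be put in a group addressed only to Googlebot. A sketch of what that might look like (whether other crawlers honour `*` and `$` varies by bot):

```
# Standard rules for everyone
User-agent: *
Disallow: /cgi-bin/

# Pattern rules only Googlebot is asked to follow
User-agent: Googlebot
Disallow: /*.gif$
Disallow: /*.jpg$
Disallow: /*.bmp$
```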
Putting the / at the end of each directory is allowed and legal.
"Mozilla" in your logs is usually a human visitor's browser, but there are also automated visitors that include it in their UA string.
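Since nearly every browser's UA string starts with "Mozilla", a log script that wants to tell humans from robots usually has to look for known bot tokens instead. A rough sketch; the token list here is illustrative, not complete:

```python
# Both of these real-world-style user-agent strings contain "Mozilla",
# but only the second belongs to a crawler.
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/115.0"
BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Tokens that commonly identify automated visitors (illustrative list only).
BOT_TOKENS = ("googlebot", "bingbot", "slurp", "crawler", "spider")

def looks_like_bot(user_agent: str) -> bool:
    """Case-insensitive check for a known bot token in the UA string."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

print(looks_like_bot(BROWSER_UA))  # False
print(looks_like_bot(BOT_UA))      # True
```

So "Mozilla" by itself tells you almost nothing; it's the rest of the string that identifies the visitor.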