Forum Moderators: goodroi
Sorry if this question has been asked many times before, but I honestly couldn't find a definitive answer. I have checked it against the SEO validator, but I'd prefer to hear it from another human.
Does this part of robots.txt conform to the standards:
User-agent: 216.167.97.169
Disallow: /blah
Disallow: /blah
Disallow: /blah
Many thanks,
-gs
Apart from that, it looks okay, but you might want to run it through the Robots.txt Validator [searchengineworld.com] just to make sure.
- Tony
I did have the name of the robot in there; no harm in saying its name (LinkChecker), but it just doesn't obey my rules! Last month it hogged nearly 1 GB of bandwidth! Any ideas on how to only allow it access to my links dir?
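For what it's worth, a sketch of what such a record might look like under the original robots.txt spec, which keys User-agent on the robot's name (not an IP address) and only defines Disallow, so everything you don't want crawled has to be listed explicitly (the /other-dir entry below is a placeholder for your real directories):

```
# Match the robot by its name token, not by IP address.
User-agent: LinkChecker
Disallow: /blah
Disallow: /other-dir
```

This only helps if the bot actually honors robots.txt, of course; a misbehaving client has to be blocked at the server instead.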
[added]Also, the validator said all was fine, which is why I wanted to validate it with another human.[/added]
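As a second opinion that isn't a web form, Python's standard urllib.robotparser applies the same matching rules a compliant crawler would (a sketch; the rules string below mirrors the record from this thread, with the robot's name in User-agent rather than an IP):

```python
from urllib import robotparser

# The record under discussion, keyed on the robot's name.
rules = """\
User-agent: LinkChecker
Disallow: /blah
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant LinkChecker should skip /blah but may fetch elsewhere.
print(rp.can_fetch("LinkChecker", "/blah/page.html"))    # False
print(rp.can_fetch("LinkChecker", "/links/index.html"))  # True
```

If the parser disallows the paths you expect and the bot still fetches them, the problem is the bot (or a faked UA), not your file.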
Thx again,
-gs
Sorry if that sounds a little vague, but I'm an IIS + ASP person, so Apache isn't my forte! If you'd like to know how to block at the ASP level, that I do know :) Sticky me if you'd like to chat about that...
In either case, a site search for "blocking" plus your server type will produce a list of possible answers.
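For Apache specifically, one common sketch is a mod_setenvif rule in .htaccess (assuming the bot really does send LinkChecker in its User-Agent header; Apache 2.2-style Order/Deny directives shown):

```
# Tag requests whose User-Agent contains "LinkChecker"
SetEnvIfNoCase User-Agent "LinkChecker" bad_bot
# Allow everyone except tagged requests
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

Unlike robots.txt, this blocks the bot whether or not it chooses to be polite.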
<added>
LinkChecker seems to be a link-checker program (duh @ me); the source is available via SourceForge. It claims to be robots.txt compliant, so you might just have someone faking that UA, or perhaps it's an older version.
</added>
- Tony