Forum Moderators: DixonJones

I need help

Treddy85au

10:39 am on Apr 26, 2006 (gmt 0)

10+ Year Member



Can anyone help me? I used a robots.txt generator to create a robots.txt file, but through Google I get a 404 error. Someone told me to go to this site to use a robots.txt tester, but it keeps failing. The link I use is <EDIT> No URL Drops please </EDIT>. I have copied and pasted the contents of my robots.txt file below.

User-agent: asterias
Disallow: /
User-agent: BackDoorBot/1.0
Disallow: /
# Long List Snipped
User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /

Please Help!

[edited by: Receptional at 10:59 am (utc) on May 1, 2006]
[edit reason] (specifics) [/edit]

Receptional

10:58 am on May 1, 2006 (gmt 0)



If Google sees a 404, it doesn't really matter what is in the file - this means Google doesn't see the file at all.

We did hear in Boston that Google's crawl strategy is changing: Googlebot now keeps its own cache of robots.txt to avoid over-accessing a site. My guess is that it will fix itself in time.

Dixon.

encyclo

1:58 pm on May 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Are you doing any URL rewriting? If so, it can interfere with access to files - for example if you redirect every request through a script, then calls for your robots.txt file will be redirected (to a non-existent file) unless you make an exception.
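As a sketch of what that exception might look like (assuming Apache with mod_rewrite, and a hypothetical front script `index.php?path=` standing in for whatever your site actually routes through):

```apache
RewriteEngine On
# Skip the rewrite for robots.txt so it is served as a plain file...
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# ...and send everything else through the front script (hypothetical example).
RewriteRule ^(.*)$ /index.php?path=$1 [L,QSA]
```

Without a condition like that first one, a request for /robots.txt gets handed to the script too, and if the script doesn't know about it, the crawler sees a 404.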

You also need to check the file name and capitalization - it must be "robots.txt" not "Robots.txt" or "robot.txt", and make sure it is in the root directory: http://www.example.com/robots.txt.

tedster

3:08 pm on May 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One additional note: there was a bug in the Sitemaps program that was giving false errors when reading some robots.txt files. The Google team said recently that the trouble was identified and fixed -- so check again and see if things are already OK.