Forum Moderators: open
So, for the king of bots... what are we talking about here regarding the significance of a 403 error versus a 404 error when G requests the robots.txt file?
That is, does it make a difference?
Kahuna.
So I am "hoping ++++" that the bots just ignore the difference... Boy, this sure would screw up somebody who didn't know about these things... what rookie uploads a robots.txt file? I just uploaded a blank robots.txt file until the company fixes the problem.
Thanks again group.
While Powdork's description makes sense, this was a big enough problem for Google to switch their behaviour and crawl domains where /robots.txt returned 403.
That was some time ago, but if it's still the case, then a 403 for /robots.txt should be no problem.
I don't use a robots.txt file, so the error should be a 404 error.
You should deal with the most urgent problem first.
Before doing anything else, upload a blank (empty) file called robots.txt to your document root - that way, the bots won't get any error and your site will be indexed. Only then should you worry about incorrect error codes, probably by emailing your hosting company to complain about their setup.
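If you want to verify what the bots actually see, a minimal sketch like the following checks the status code returned for /robots.txt. The helper names (`robots_status`, `robots_verdict`) are my own for illustration, and "example.com" is a placeholder for your domain; the 403/404 interpretation follows the behaviour described above.

```python
# Minimal sketch: see what HTTP status a crawler gets for /robots.txt.
import urllib.request
import urllib.error

def robots_status(domain):
    """Return the HTTP status code for http://<domain>/robots.txt."""
    url = f"http://{domain}/robots.txt"
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 403 or 404

def robots_verdict(status):
    """Rough interpretation of the status code, per the thread above."""
    if status == 200:
        return "parse rules"          # file exists, follow it
    if status in (403, 404):
        return "crawl allowed"        # treated as "no robots.txt"
    return "unknown"

# Usage (requires network access):
# robots_verdict(robots_status("example.com"))
```

An empty 200 response and a 404 end up meaning the same thing to the crawler: no restrictions.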
I got this back from the host: "This is because the folder is access-controlled. If you look for the same file in an uncontrolled folder you will get the 404 error: www.mydomain.com/images/robots.txt"
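If the host's explanation is right and the document root is access-controlled, one common Apache fix is to exempt robots.txt from the protection so crawlers get a 200 instead of a 403. This is only a hypothetical .htaccess sketch (Apache 2.4 syntax, paths made up); the exact setup depends on how the host has configured the protection.

```apache
# Hypothetical .htaccess in the document root: the directory stays
# password-protected, but robots.txt is exempted from authentication.
AuthType Basic
AuthName "Restricted"
AuthUserFile /path/to/.htpasswd
Require valid-user

<Files "robots.txt">
    Require all granted
</Files>
```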
Anyway... this is starting to head in a more technical direction about Apache servers, so I'll stop posting here...
Thanks again group.