Forum Moderators: goodroi
I have a question about the robots.txt file.
First of all, does it HAVE to be there? What if it's not there? By my best logic, everyone can browse through the site then and the spiders can see EVERYTHING on my site (which I don't mind for now).
Also, does the robots.txt go in the ROOT or the WWW folder?
Thank You, all answers should help me. I am a learner.
Tx, Again.
Neh
If there is no robots.txt, all spiders will feel welcome to index all of your files. The only bad effect of not having a robots.txt file is that your error log will have a lot of 404-Not Found errors cluttering it up as a result of spiders requesting robots.txt.
To avoid this, place a simple robots.txt file in your web root:
User-agent: *
Disallow:
You can then use this robots.txt checker [searchengineworld.com] on the WebmasterWorld sister site Search Engine World to help make sure that it is correct.
HTH,
Jim
Hi neh2008,
No, the robots.txt file does not have to be there. But you might save yourself some future questions (Why are there so many file-not-found errors for robots.txt?) if you simply put a blank robots.txt file on the site.
Just open a file in Notepad or similar text editor and save it as robots.txt.
Where does it go? The exact location depends on your server, but in all cases it should be at the same level as your home page (index or default).
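For example, on a typical shared host where the web root folder is called public_html (a common but host-specific name, so check your own setup), the layout might look like this:

```
public_html/
    index.html
    robots.txt    <-- spiders fetch this as http://www.example.com/robots.txt
```

Whatever the folder is named, the test is the same: the file should answer at the top-level URL /robots.txt of your domain.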
As you develop your site you might find instances where you would not want a robot indexing your pages. That's when you'll put the robots.txt to use.
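When that time comes, a sketch of what the file might look like (assuming a hypothetical /private/ directory you want kept out of the indexes):

```
User-agent: *
Disallow: /private/
```

Disallow matches URL prefixes, so this one line blocks everything under /private/ for all well-behaved spiders.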
When you need to, do a "site search" above to learn more than you really want to know about robots.txt.
Jim