Forum Moderators: goodroi
I submitted my site to something called "Scrubbing the web" and it offered an analyzer tool which said my robots.txt file wasn't there.
Can anybody explain this to me in simple terms, please? Are there different codes you have to put in the meta tags for different sites?
I just clicked on some sort of robots test and it gave hundreds of errors, and I don't have a clue what it's talking about.
You don't need a robots.txt file. It is used to disallow spiders from parts of your site, if they are programmed to follow the robots.txt guidelines.
A few examples of what you may put in a robots.txt file:

This indicates that nothing is disallowed and the spider can follow all links:
User-agent: *
Disallow:
To allow a single robot complete access and exclude all others:
User-agent: Googlebot (change this to the robot you require)
Disallow:
User-agent: *
Disallow: /
This would prevent your entire web site from being indexed:
User-agent: *
Disallow: /
If you do not want certain directories to be spidered, or any directories which are private:
User-agent: *
Disallow: /cgibin (change this to what you require)
Disallow: /sitestats (change this to what you require)
You would create the file with, say, Notepad and upload it by FTP in ASCII mode to your site's root directory.
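If you want to double-check that your rules do what you expect before uploading, you can test them locally. As a quick sketch (not from the original post, and using the directory names above purely as examples), Python's standard urllib.robotparser module can parse the rules and tell you whether a given robot may fetch a path:

```python
from urllib.robotparser import RobotFileParser

# Example rules: allow everything except two private directories
rules = """\
User-agent: *
Disallow: /cgibin
Disallow: /sitestats
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A generic spider may fetch normal pages...
print(parser.can_fetch("*", "/index.html"))       # True
# ...but not anything under the disallowed directories
print(parser.can_fetch("*", "/cgibin/script"))    # False
print(parser.can_fetch("*", "/sitestats/june"))   # False
```

This is the same matching logic spiders that follow the robots.txt convention apply, so it is a handy sanity check that a typo hasn't accidentally blocked your whole site.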
hope this helps
Thank you for your reply. So does this mean that if I want the search engine to include all my pages, I really don't have to put any kind of robots.txt file on my site, and it will do it automatically?
..and that you only have to put that file there if you DON'T want it to index certain pages?