I'm new to writing robots.txt files; I've read a few tutorials and I'm still not sure about mine. My question is: do I still need this code [User-agent: * Disallow:] in the robots.txt file to give the major search engines permission to index my site, and does having it in there give them permission to ignore the rest of the file below it?
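For clarity, here's that allow-all block written out one directive per line (a minimal example; the empty Disallow value is what makes it allow-all):

User-agent: *
Disallow:

An empty Disallow value matches nothing, so every path stays crawlable. And as I understand the standard, a crawler obeys the single record that best matches its user-agent, so a record naming a specific bot would take precedence over this * record rather than being ignored because of it.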
And to bicycling, who wrote:
====================================
Google has a robots.txt checker at [google.com...] you might want to try that. And I think your first line is best advised removed.

User-agent: *
Disallow:

The above means to disallow all bots from indexing your site. Unless you don't care about getting ranked in the SERPs, you can leave that there :)
=====================================
Thank you for your reply, bicycling, but I believe the block above doesn't ask the robots not to index the site; it permits indexing. This is the block that disallows:
User-agent: *
Disallow: /
There has to be a / after Disallow: to block spiders from indexing; that's what everyone else on the net is saying. Also, thanks for pointing me to that tool.
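For anyone who wants to verify the difference, Python's standard urllib.robotparser module can evaluate a set of rules directly; here's a minimal sketch (example.com is just a placeholder):

from urllib.robotparser import RobotFileParser

# Allow-all: an empty Disallow value matches nothing.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])
print(allow_all.can_fetch("Googlebot", "http://example.com/page.html"))  # True

# Disallow-all: "Disallow: /" matches every path on the site.
block_all = RobotFileParser()
block_all.parse(["User-agent: *", "Disallow: /"])
print(block_all.can_fetch("Googlebot", "http://example.com/page.html"))  # False

The first check prints True (everything allowed) and the second prints False (everything blocked), which matches what everyone is saying about the /.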