<meta name="keywords" content="keys here">
<meta name="description" content="desc">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta name="ROBOTS" content="INDEX,FOLLOW">
Here is our robots.txt file:
# Allow all
User-agent: *
Disallow:
Every day, the Google spider comes by, grabs our robots.txt, then grabs our index.html, and goes no further.
Can anyone think of why this might happen?
Meanwhile, the Yahoo Slurp spider is hammering our site and spidering everything just fine.
If you are allowing everything for everyone, you could just upload an empty file; I don't think that syntax is quite right. As with all robots.txt directives, you only tell crawlers what they can't do. The file runs on the assumption that the base rule grants access to everything, and then excludes from there.
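For what it's worth, the empty `Disallow:` form does parse as allow-all in at least one standards-based parser. A quick sketch using Python's stdlib `urllib.robotparser` (the domain and paths are placeholders, not the poster's site):

```python
from urllib.robotparser import RobotFileParser

# Feed the exact robots.txt from the thread into the parser,
# line by line, instead of fetching it over HTTP.
rp = RobotFileParser()
rp.parse([
    "# Allow all",
    "User-agent: *",
    "Disallow:",
])

# An empty Disallow value excludes nothing, so any agent
# may fetch any path under this ruleset.
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True
print(rp.can_fetch("Slurp", "http://example.com/somepage.html"))   # True
```

So the file itself should not be blocking Googlebot; if Google still stops at index.html, the cause is more likely elsewhere (the spidering schedule, the site's link structure, or something in the pages) than the robots.txt syntax.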