I'm stuck on some basics and need any help that anyone can provide. I have a robots.txt file which, in theory, disallows search-engine spiders from various pages on my site, with different rules for different spiders. I've used a robots.txt syntax checker to verify that my syntax is OK.
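For context, the per-spider blocks look roughly like this (the Scooter user-agent and the /members/ path are just placeholders, not my actual file):

```
User-agent: Scooter
Disallow: /members/

User-agent: *
Disallow:
```

The empty Disallow under `*` is supposed to mean "everything else is allowed to crawl the whole site."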
I've included meta tags with index, follow options on my pages.
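The tag I'm using is the standard robots meta tag, in the head of each page:

```
<meta name="robots" content="index, follow">
```

As I understand it, this explicitly tells spiders to index the page and follow its links, which is also the default behavior.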
One of three things seems to happen when a spider visits my site:
1. They visit my home page and go away. I've tried changing the size of my home page by adding 100-200 blanks at the end of the page. Didn't help.
2. They get the robots.txt file, go away and never index the site.
3. Occasionally, a spider will pick up one page in addition to the robots.txt file.
If this were just one engine or spider, I would think it was something unique to that spider. But since it happens with virtually all of them, I assume it must be something I'm doing wrong.
Oh, I should mention that when I posted the latest buddy links pages, AV visited the site, picked up the pages, got the robots.txt file, visited my home page, went back to the robots.txt file, and then disappeared.
Can anyone suggest what I might be doing wrong?
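For anyone who wants to reproduce the check, here's a quick way to test robots.txt rules outside of a live spider, using Python's standard urllib.robotparser (the Scooter user-agent and /members/ path are placeholder rules, not my actual file):

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules mirroring the per-spider blocks described above
rules = """\
User-agent: Scooter
Disallow: /members/

User-agent: *
Disallow:
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The named spider should be blocked from the disallowed path...
print(rp.can_fetch("Scooter", "/members/page.html"))   # False

# ...while any other spider falls through to the "*" block, which allows everything
print(rp.can_fetch("SomeOtherBot", "/members/page.html"))  # True
print(rp.can_fetch("Scooter", "/index.html"))              # True
```

If the parser allows what you expect it to allow, the syntax itself probably isn't the problem.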