Forum Moderators: open
My site has been online for 3 weeks now, but it is not yet indexed by Google.
How can I check whether my complete site has been crawled? I am afraid that only my index page has been crawled.
Until now, my robots.txt was:
User-agent: *
Disallow:
I was advised to remove the Disallow line. Is this OK?
What is the difference between these bots?
Thanks for any input.
[edited by: WebGuerrilla at 6:23 pm (utc) on June 13, 2003]
[edit reason] no specifics please [/edit]
There are lots of threads here about the differences between the Googlebots and about robots.txt files. Try the site search at the top of this screen.
This thread [webmasterworld.com] will get you started. Note that a Disallow line blocks nothing unless it is followed by a path such as "/"; an empty Disallow allows everything. If you don't want to disallow any bots, you can skip the robots.txt file entirely, since the default is to allow all.
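To see the difference in practice, here is a small sketch using Python's standard-library robots.txt parser. It checks whether Googlebot may fetch a page under the two rule variants discussed above; the hostname example.com and the page URL are placeholders, not the poster's real site.

```python
# Compare an empty "Disallow:" with "Disallow: /" using the
# standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

def allowed(robots_txt: str, url: str) -> bool:
    """Return True if Googlebot may fetch `url` under `robots_txt`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# "Disallow:" with no path blocks nothing -- crawling is allowed.
print(allowed("User-agent: *\nDisallow:", "http://example.com/page.html"))

# "Disallow: /" blocks the whole site.
print(allowed("User-agent: *\nDisallow: /", "http://example.com/page.html"))
```

The first call prints True (empty Disallow allows all), the second prints False (a "/" path blocks everything), which is why removing or emptying the Disallow line makes no practical difference to crawling.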