WebJoe - 6:27 pm on Jan 6, 2004 (gmt 0) (I hope this is the right forum)
I have a robots.txt that has not changed since Oct 10, 2003, with several directories disallowed for all user agents. I validated the robots.txt with the robots.txt validator [searchengineworld.com], and yet I find a page inside one of those disallowed directories indexed by Google. Is this normal? Do I have to use the robots meta tag to keep these pages out of the index?
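For reference, the relevant part of my robots.txt looks roughly like this (the directory names here are placeholders, not my actual paths):

```
User-agent: *
Disallow: /private/
Disallow: /internal/
```

And the meta tag I'm asking about would be something like this in the page's head section:

```
<meta name="robots" content="noindex">
```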