Forum Moderators: goodroi
If it has been a while, or if Slurp has already fetched the new robots.txt, you may have a syntax error or some other problem with the file. Have you tried validating [searchengineworld.com] it?
What exact syntax are you using to block Slurp?
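For comparison, a minimal robots.txt rule that blocks Yahoo's crawler looks like this (Yahoo's bot matches on the user-agent token "Slurp"; adjust the Disallow path if you only want to block part of the site):

```
# Block Yahoo's crawler (Slurp) from the entire site
User-agent: Slurp
Disallow: /
```

Note that the file must live at the site root (e.g. /robots.txt), the directives are case-insensitive but the path in Disallow is not, and a blank line should separate this record from any other User-agent records in the same file.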