Forum Moderators: open
The first deep crawls were reported yesterday [webmasterworld.com], so it may just be a coincidence.
If your robots.txt validates using Brett's robots.txt validator [searchengineworld.com], then you should be OK. Two non-obvious things I know of that can cause problems are CR/LF at the end of each line instead of a bare newline (LF), and a missing terminal newline. Most robots will ignore these errors, but some don't.
Brett's robots.txt checker catches everything except the missing terminal newline. In case that's not clear: the last line of the robots.txt file should end with a newline character, so the file finishes with a properly terminated line.
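If you want to check for those two problems yourself, here's a minimal sketch of a lint function (the name robots_lint and the reported messages are just my own invention) that reads the file as raw bytes and flags CR/LF line endings and a missing terminal newline:

```python
def robots_lint(data: bytes) -> list:
    """Flag the two byte-level robots.txt problems described above."""
    issues = []
    # CR/LF ("\r\n") endings instead of bare LF ("\n")
    if b"\r\n" in data:
        issues.append("CR/LF line endings (use bare LF)")
    # File must end with a newline character
    if data and not data.endswith(b"\n"):
        issues.append("missing terminal newline")
    return issues

# Example: both problems present
bad = b"User-agent: *\r\nDisallow: /cgi-bin/"
print(robots_lint(bad))

# Example: clean file
good = b"User-agent: *\nDisallow: /cgi-bin/\n"
print(robots_lint(good))  # prints []
```

Reading in binary mode matters here: opening the file in text mode would silently translate the line endings and hide exactly the problem you're looking for.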
Jim
"Disallow: /manual" disallows anything beginning with /manual, so it does forbid crawling of /manual/, /manual/whatever and even /manualwhatever.
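You can see that prefix-matching behaviour with Python's standard urllib.robotparser (the example.com URLs are just placeholders):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /manual",
])

# "Disallow: /manual" matches any path that *begins* with /manual
print(rp.can_fetch("*", "http://example.com/manual/"))          # False
print(rp.can_fetch("*", "http://example.com/manualwhatever"))   # False
print(rp.can_fetch("*", "http://example.com/other"))            # True
```

If you only want to block the directory itself, use "Disallow: /manual/" with the trailing slash.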
EAHunt, I agree with Jim. It's bound to be a coincidence.