Are you sure the crawling isn't happening because the robots.txt file is malformed in some way?
In particular, if your file contains both a
User-agent: * section and a
User-agent: Googlebot section, Googlebot completely IGNORES the
User-agent: * section: a crawler obeys only the most specific group that matches it.
That is, you must duplicate all the rules from the
User-agent: * section into the
User-agent: Googlebot section if you want Google to see them.
The same holds true for other search engines.
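For example, in a file like the following (the paths are just placeholders), Googlebot would skip the first group entirely and obey only its own group, so the shared rules have to be repeated there:

```
# Rules for all other crawlers
User-agent: *
Disallow: /private/
Disallow: /tmp/

# Googlebot matches this more specific group and ignores the group above,
# so every rule it should follow must be repeated here
User-agent: Googlebot
Disallow: /private/
Disallow: /tmp/
```

If the Googlebot group omitted the Disallow lines, Google would treat the site as fully crawlable, even though other crawlers remain blocked.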