>>I have a client who has the following within their robots.txt file:
Really? That should not be in there. What you have there belongs in the head of your document, not in robots.txt. Think of robots.txt as an exclusion protocol; it won't make the spidering process any quicker.
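For example, a revisit directive is written as a meta tag in the page head, like this (a sketch; note that most major engines, Google included, ignore this tag):

```html
<head>
  <!-- Belongs in the page head, not robots.txt;
       most crawlers ignore it anyway -->
  <meta name="revisit-after" content="1 days">
</head>
```

Even where it is parsed, it is a hint at best, which is why fresh content and links matter far more.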
>>and we want the spiders to revisit every day or even more frequently.
You mention every ten days, but I am afraid you can't really dictate how often the spiders come. The simplest way to get crawled more regularly is to add fresh content on a regular schedule and get a few links pointing into that section.
>> benefit of PageRank / rating from the home page link. 2 questions:
Forget PageRank; what you are after is being crawled. If you have enough beef in your page, you will rank regardless of it. :)
If you want Google and others to crawl all your pages, you don't really need a robots.txt file at all. It is best to have one though, even if it is just blank.
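An effectively blank robots.txt that lets every crawler in would look something like this (an empty Disallow line means nothing is blocked):

```
# Allow all crawlers access to the whole site
User-agent: *
Disallow:
```

Having the file there, even empty like this, also stops your error logs filling up with 404s every time a spider requests it.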
I would be tempted to lose the top and bottom of the two you have, but put them where they should be :). I don't know what the global one does; I've never used it.