Are you using session IDs in your URLs? If so, Slurp often gets lost and crawls the same pages repeatedly as the ID changes. If that's what's happening to you, just ban him temporarily to make him stop.
If no one has said it already, welcome to WebmasterWorld.
Thank you BlueSky. My site consists of about 50 static pages, and they all link to an osCommerce shopping catalog (www.mydomain.com/catalog/index.php). The catalog contains approx. 100 pages, and yes, I believe session IDs are used in the catalog URLs. What would be the best solution for me:
1. Ban Slurp temporarily from visiting www.mydomain.com
2. Ban Slurp permanently from visiting www.mydomain.com/catalog/index.php
3. Something else
Another question: if I ban Slurp, will it come back?
You just need to get him out of your catalog temporarily so he stops eating up bandwidth going around in circles. Once you get him to stop, you can let him back in after you get rid of those SIDs. If you cannot turn them off for bots in the control panel and no one here knows how to do it, then do a search or ask for instructions over at osCommerce's site. It should be a pretty simple change to make the script check for known bots and serve them pages without any SIDs.
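The idea is simple enough to sketch. osCommerce itself is PHP, but here is a minimal Python sketch of the logic: check the visitor's User-Agent against a list of known bots, and only append the session ID to generated links for regular browsers. The bot list and the `osCsid` parameter name are illustrative assumptions, not osCommerce's actual implementation.

```python
# Sketch only (not osCommerce code): decide whether to append a session
# id to generated links based on the visitor's User-Agent string.
KNOWN_BOTS = ("slurp", "googlebot", "msnbot")  # illustrative list

def is_bot(user_agent: str) -> bool:
    # Case-insensitive substring match against the known-bot list.
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

def make_link(path: str, session_id: str, user_agent: str) -> str:
    # Bots get clean URLs; browsers keep the session id in the query string.
    if is_bot(user_agent):
        return path
    return f"{path}?osCsid={session_id}"
```

With that in place, Slurp sees the same stable URL on every visit instead of a fresh one per session, so he stops going around in circles.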
Not sure how often Slurp updates robots.txt, but you can try putting this in that file:
User-agent: *
Disallow: /catalog/
If you have other disallowed directories/files, just add them to this. Might as well keep all bots out of there until you turn off the SIDs. Let me know if he stops with that. If not, what kind of server is your site on -- Apache or something else?
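If it does turn out you're on Apache and Slurp keeps ignoring robots.txt, a temporary user-agent ban is another option. Here is a hypothetical .htaccess snippet, assuming Apache with mod_setenvif enabled; remove it once the SIDs are gone so Slurp can come back and index the catalog.

```apache
# Temporarily block Yahoo! Slurp by User-Agent (illustrative only).
SetEnvIfNoCase User-Agent "Slurp" block_bot
Order Allow,Deny
Allow from all
Deny from env=block_bot
```

This returns a 403 to Slurp instead of serving pages, which stops the bandwidth drain immediately while you sort out the session IDs.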