Yes, but Slurp never was any good with 404s -- it just keeps asking for them, sometimes for more than a year. A better bet is to redirect all requests for your non-www domain to your www domain, and prevent this problem from happening in the first place. Thankfully, Slurp understands 301s better than it understands 404s and 410s.
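If you're on Apache with mod_rewrite enabled, a minimal sketch of that redirect in .htaccess might look like this (example.com stands in for your actual domain -- adjust to taste):

    RewriteEngine On
    # Match requests whose Host header is the bare (non-www) domain
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    # Send a permanent 301 redirect to the www hostname, keeping the path
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The R=301 flag is the important part: it tells the spider the move is permanent, so it consolidates on the www URLs instead of retrying the old ones.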
Jim
I complained at one point when it made >100 requests for one robots.txt file in 24 hours -- they said it was 'unavoidable' because the spider is 'distributed'. How can you reason with that? ;)