We have a situation where a site:foo.com query shows a large number of pages, but very few of those pages are actually findable with a regular search for a unique text string. The ones that are findable are cached. This amounts to "banning", since the traffic Yahoo sends to the small number of findable pages is negligible. Their spider accesses robots.txt 50 times a day but otherwise doesn't spider the site any more.
Even if you don't have a robots.txt file, Yahoo Slurp will still try to fetch robots.txt from your root directory, and you will see a 404 entry in your logs.
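If you want to confirm what Slurp is actually doing, your access log will tell you. Here's a rough sketch in Python that counts Slurp's robots.txt fetches and the status codes it got back; the log path and the Apache combined log format are assumptions, so adjust for your own server:

    # Rough sketch: count Slurp's robots.txt requests and their status codes
    # in an Apache combined-format access log (path/format are assumptions).
    import re
    from collections import Counter

    LOG_PATH = "/var/log/apache2/access.log"  # assumed location
    LINE_RE = re.compile(
        r'"(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+ '
        r'"[^"]*" "(?P<agent>[^"]*)"'
    )

    counts = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            m = LINE_RE.search(line)
            if not m:
                continue
            if "Slurp" in m.group("agent") and m.group("path") == "/robots.txt":
                counts[m.group("status")] += 1

    # e.g. Counter({'200': 50}) means 50 robots.txt fetches and nothing else
    print(counts)

If all you see is a pile of robots.txt hits and no regular page fetches, that matches the pattern described above.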
We think we have the same problem on one of our sites. It was banned. We cleaned it up to eliminate anything that might be considered abusive, such as www.foo.com and foo.com delivering the same content, and applied to be unbanned. Slurp came back and eventually we started to see more pages indexed. But now Slurp only accesses robots.txt (50 times a day) and we are down to six pages in the index, so it looks like we have been "re-banned". We are totally mystified as to what could be causing the problem.
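For what it's worth, the www/non-www fix is just a permanent redirect so that only one hostname ever serves the content. Here's a minimal sketch assuming a Python/WSGI stack (on Apache the same thing is normally done with a rewrite rule); www.foo.com is just the example hostname from above:

    # Minimal sketch: 301-redirect every request on a non-canonical hostname
    # to the same path on the canonical one, so duplicate content disappears.
    CANONICAL_HOST = "www.foo.com"  # assumed canonical hostname

    def canonical_host_middleware(app):
        def wrapped(environ, start_response):
            host = environ.get("HTTP_HOST", "").split(":")[0]
            if host and host != CANONICAL_HOST:
                location = "http://%s%s" % (CANONICAL_HOST, environ.get("PATH_INFO", "/"))
                if environ.get("QUERY_STRING"):
                    location += "?" + environ["QUERY_STRING"]
                start_response("301 Moved Permanently", [("Location", location)])
                return [b""]
            return app(environ, start_response)
        return wrapped

    # usage: wrap your existing WSGI app
    # application = canonical_host_middleware(application)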
Do you get any traffic at all from Yahoo? If your traffic went from something to ZERO, then you may have been banned. Otherwise, it's another weird search engine issue that will *most likely* work itself out in time. Just keep writing content, keep getting links, stay well away from spam, and the rest will take care of itself.
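A quick way to answer the "any traffic at all" question is to count log entries whose referer is a Yahoo search results page. Rough sketch, with the same assumptions about log location and format as before:

    # Rough sketch: count visits referred by Yahoo search results pages.
    import re

    LOG_PATH = "/var/log/apache2/access.log"  # assumed location
    referer_re = re.compile(r'"https?://[^"/]*search\.yahoo\.com[^"]*"')

    yahoo_hits = 0
    with open(LOG_PATH) as log:
        for line in log:
            if referer_re.search(line):
                yahoo_hits += 1

    print("visits referred by Yahoo search:", yahoo_hits)  # zero is the warning sign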