So last month (early Feb) I updated my site. On my development box, I keep a robots.txt that disallows everyone. I accidentally copied this "disallow all" robots.txt to the production site, right while deepbot was crawling.
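For anyone who hasn't seen one, the standard "disallow all" robots.txt is just two lines:

    User-agent: *
    Disallow: /

The harmless production version differs by a single character, the trailing slash:

    User-agent: *
    Disallow:

One slash is all it takes to tell every crawler to stay out of the entire site.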
I didn't notice it for about a week, and by then deepbot was gone. I'd had about 60,000 pages indexed, a PR5, and 1,500+ referrals a day. Life was good.
Around the middle of the month, freshbot started crawling me. I think it crawled over 100,000 pages.
But to no avail. www2 shows me with fewer than 15,000 pages indexed, many of which don't look fully indexed (URL-only listings with no title or description).
I've also disappeared completely from around 90% of my SERPs (I was #1, #2, or #3 on several hundred different ones).
Now I'm silently praying that freshbot will update the main index with its 100,000 pages sitting somewhere on some server in the Googleplex...
Don't let this simple mistake happen to you! It's only 10 AM, but I'm going to get a drink!
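If it saves anyone else's rankings: here's the kind of quick post-upload check I'll be running from now on. Just a sketch using Python's standard robotparser module; www.example.com is a placeholder for your own domain.

    import sys
    import urllib.robotparser

    # Hypothetical post-deploy sanity check: fail loudly if the live
    # robots.txt blocks Googlebot from the homepage.
    SITE = "http://www.example.com"  # placeholder -- use your own domain

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(SITE + "/robots.txt")
    rp.read()  # fetches and parses the live robots.txt

    if not rp.can_fetch("Googlebot", SITE + "/"):
        print("WARNING: robots.txt is blocking Googlebot -- fix it now!")
        sys.exit(1)
    print("robots.txt looks OK")

Run it right after every upload and you'll catch the bad file in seconds instead of a week.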