My site had been ranked #1 for years and suddenly dropped off. After a lot of panicking and hunting for black-hat attacks, I found nothing. So I started checking server responses for every file on my site, and after working through all the HTML, PDF, JPG, etc., I got to the text files. That's when I found I was getting a 324 (EMPTY RESPONSE) error for my robots.txt file. (324 is the browser's ERR_EMPTY_RESPONSE code rather than an HTTP status: the server closed the connection without sending any data.) The .txt files were the only files with a problem. The trouble seems to have started on the day my firewall software was updated; something went wrong with the way its filters were scanning .txt files in transit. When Google hit this error, it started reporting that every page on my site was unreachable.
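If you want to run the same check yourself, here is a minimal sketch of the kind of probe I'm describing: fetch robots.txt and flag an empty body or a connection error. The domain is a placeholder; substitute your own site.

```python
import urllib.request

def check_robots(base_url):
    """Fetch robots.txt and report its status, flagging empty responses."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
            if not body:
                # Server answered but sent no bytes -- the symptom I saw.
                return f"{url}: EMPTY RESPONSE (status {resp.status})"
            return f"{url}: OK (status {resp.status}, {len(body)} bytes)"
    except Exception as e:
        # Covers refused/closed connections, timeouts, DNS failures, etc.
        return f"{url}: ERROR ({e})"

# Usage (placeholder domain):
# print(check_robots("https://example.com"))
```

Pointing this at every file type on the site, one extension at a time, is how I eventually narrowed the problem down to .txt files.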
You'd think Google would handle a 324 for a robots.txt file like a 404 and just go about its business of indexing the site. It didn't. As I understand it, when Googlebot can't fetch robots.txt at all (as opposed to getting a clean 404), it assumes the site might be disallowing crawling and backs off entirely. In the time it took me to figure out the problem, my site lost its ranking completely.
Please, Google: treat a 324 for robots.txt like a 404!