yosmc - 5:57 pm on Mar 25, 2013 (gmt 0)
Thank you for all the replies - genuinely appreciated.
Just to reiterate - my site ranked well for a decade with only a couple of dozen pages indexed. So I thought it wasn't really relevant whether some of those thousands of pages had inbound links, because they weren't needed to begin with. Wrong? My conclusion now is that Google is either still upset about all the spidering errors it saw over the course of five months, or reacting to the fact that (from its point of view) 98% of the site disappeared from one day to the next.
Concerning the various redirects and status codes - well, as things stand now, Google isn't going to see those, because Googlebot is shut out via robots.txt. So the general line of recommendation is to actually let Googlebot back in, just so it can learn that the stuff it used to spider is gone? To be honest, I don't fully understand how deliberately blocking via robots.txt doesn't convey the same message as deliberately removing pages from Google's index, namely that the removal was intentional.
(...oh and yes, I do need access to /links to actually manage my directory.)
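If I understand the recommendation correctly, it boils down to something like the following - just a sketch, assuming an Apache setup, and the paths are placeholders rather than my real ones: open robots.txt back up so Googlebot can crawl, and have the server answer 410 Gone for the URLs that were removed, so the bot gets an explicit "intentionally gone" signal instead of simply being locked out.

# robots.txt - let crawlers back in so they can actually see the status codes
User-agent: *
Disallow:

# .htaccess - return 410 Gone for the removed section (placeholder path)
RedirectMatch gone ^/removed-section/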
Is the general consensus also that such a situation will eventually work itself out on its own, or only if I take the recommended actions?