aakk9999 - 9:48 pm on Jul 13, 2013 (gmt 0)
In this case there must be something else in robots.txt that blocks these pages (or robots.txt may be returning an incorrect HTTP response code). Have you got a User-agent: Googlebot section in robots.txt, and if you do, is anything listed there that could block your pages? The sketch below shows a quick way to check both.
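If you want to verify the HTTP response code outside of Webmaster Tools, a minimal Python sketch like this will print both the status code and the exact rules crawlers receive (www.example.com is a placeholder for your own domain):

import urllib.error
import urllib.request

url = "http://www.example.com/robots.txt"  # placeholder domain
try:
    with urllib.request.urlopen(url) as resp:
        print(resp.status, url)              # expect 200 for a healthy robots.txt
        print(resp.read().decode("utf-8"))   # the rules crawlers actually see
except urllib.error.HTTPError as e:
    # A 404 is generally treated as "no restrictions", while a 5xx
    # can make Google treat the whole site as temporarily disallowed.
    print(e.code, url)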
Another thing I would do is go to Google Webmaster Tools, section "Crawl" --> "Blocked URLs", and in the entry box at the bottom enter the URLs that Fetch reported as blocked.
Click on "Test" and if this URL shows there as blocked, it should tell you which line in robots.txt (seen in the top box above) blocks this URL. If the URL should not be blocked, then your rules in robots.txt are created incorrectly.
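You can also reproduce that test locally with Python's standard urllib.robotparser. This is a rough sketch only: the rules and URLs below are placeholders (paste in your own robots.txt contents and the URLs Fetch flagged), and this parser does not support Googlebot's wildcard extensions (* and $), so it will not match Google's tester exactly:

from urllib.robotparser import RobotFileParser

# Placeholder rules -- replace with the contents of your robots.txt
rules = """\
User-agent: Googlebot
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Placeholder URLs -- replace with the URLs Fetch reported as blocked
for url in ["http://www.example.com/private/page.html",
            "http://www.example.com/public/page.html"]:
    status = "ALLOWED" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)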
I would also check how many pages are blocked by robots.txt. You can get some idea by searching Google with the site: command (for example, site:example.com).
If you get the message "In order to show you the most relevant results, we have omitted some entries...", click "repeat the search with the omitted results included". Browse through the SERPs and look for entries that say "A description for this result is not available because of this site's robots.txt – learn more" – see how many of these you have on the site.
I am not saying that robots.txt is the (sole) reason for your traffic drop, but it is obvious that you have some issues in this area, and it is best to fix these first.