First, appreciate that there is no single position for a URL on any keyword search. Google's results vary according to many factors. For example, variation is introduced by which data center Google's load balancing sends a request to, by the geographic location of the user, by browser, by "search history" personalization when a user is logged in to any Google account, and by cookie-based customization when a user is logged out. More recently, some people have even reported that time of day seems to affect some results.
That said, many rank-tracking tools do exist, some free and many paid. But they all work by sending automated requests to Google, and the positions they record are subject to all of the fuzziness described above.
There is one way to get some ranking data without sending automated queries - see this current thread: WMT Now Shows ALL Keywords - with CTR and Position [webmasterworld.com]. As you'll note in the discussion, the accuracy of the data is questioned by some, and certainly, given my earlier comments, the positions reported must be averages.
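To make the "averages" point concrete, here's a minimal sketch of how a single reported position can emerge from all the variation described above. The breakdown of impressions by position is invented for illustration, and I'm only assuming WMT reports something like an impression-weighted average - the thread itself doesn't confirm the exact method.

# Hypothetical: one URL seen at different positions by different
# users (data center, personalization, location, etc.), with WMT
# assumed to average the position across all impressions.

impressions = [
    # (position shown, number of impressions at that position)
    (3, 600),   # most users, e.g. the "typical" data center
    (5, 300),   # a different data center or personalization bucket
    (8, 100),   # heavily personalized results
]

total = sum(count for _, count in impressions)
avg_position = sum(pos * count for pos, count in impressions) / total
print("Average position: %.1f" % avg_position)  # -> 4.1

No single user ever saw the URL at position 4.1, yet that's the number a report like this would show.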
In your situation, the first place I would turn is whatever analytics package you are using - it should be capturing traffic by keyword.
It seems unlikely that ALL keywords are down by 20% across the board. Instead, something like the 80-20 rule is likely in play, and it may well be more like 95-5 or even 99-1 in the case of a large website. With analytics data, the loss of traffic can probably be pinned down to a significant degree (see the sketch below), and that would give you a lot fewer keywords to look at in your diagnosis.
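If it helps to make that diagnosis concrete, here is a rough Python sketch of the idea. It isn't tied to any particular analytics package; the file names and the "keyword,visits" CSV layout are assumptions, standing in for whatever keyword-traffic export your package provides for two date ranges.

import csv
from collections import defaultdict

def load_keyword_visits(path):
    """Sum visits per keyword from a CSV with 'keyword' and 'visits' columns."""
    visits = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visits[row["keyword"]] += int(row["visits"])
    return visits

# Hypothetical exports: the same keyword report, run for a date
# range before the drop and a range after it.
before = load_keyword_visits("keywords_before_drop.csv")
after = load_keyword_visits("keywords_after_drop.csv")

# Rank keywords by absolute visits lost; with an 80-20 (or 99-1)
# distribution, the top few rows will explain most of the total drop.
losses = sorted(
    ((kw, before[kw] - after.get(kw, 0)) for kw in before),
    key=lambda item: item[1],
    reverse=True,
)

total_loss = sum(delta for _, delta in losses if delta > 0)
running = 0
for kw, delta in losses[:20]:
    if delta <= 0 or total_loss == 0:
        break
    running += delta
    print("%-30s lost %6d visits (%.0f%% of total loss so far)"
          % (kw, delta, 100.0 * running / total_loss))

The running percentage shows how quickly the losses concentrate; once it crosses, say, 90%, the remaining keywords probably aren't worth checking by hand.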