I have been checking our Webmaster Tools over the past week, drilling into the URLs causing crawl errors: 404s, unreachable pages, etc.
A menu present on many pages omitted a leading "/" in its links, so browsers resolved them relative to the current working path, resulting in thousands of 404s. The site in question has around 2,000 pages of content, and the broken links caused roughly 5,000 "not found" errors.
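The way browsers resolve a link with a missing leading slash can be sketched with Python's standard library. The page path and link target below are hypothetical, chosen only to illustrate the resolution rules:

```python
# Sketch of how a browser resolves the broken menu link.
# Assumes a page at /articles/2011/ whose menu contains
# <a href="contact.html"> where <a href="/contact.html"> was intended.
from urllib.parse import urljoin

base = "https://example.com/articles/2011/"

# Missing leading "/": resolved relative to the current path -> a 404
print(urljoin(base, "contact.html"))
# https://example.com/articles/2011/contact.html

# Leading "/": root-relative, resolves to the intended page
print(urljoin(base, "/contact.html"))
# https://example.com/contact.html
```

Because every directory level on the site produces a different (broken) absolute URL for the same menu link, a 2,000-page site can easily generate more 404 URLs than it has pages.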
Would the above affect our rankings? Is there an "acceptable" number of 404s that Google will tolerate?
Also, once the pages are updated with the correct links, how long will it take Google to clear the 404 URLs from the crawl errors list?
If the HTTP response status for your 404s is actually 200 (i.e. you have a custom 404 page that isn't correctly implemented), then this could be a problem.
I've seen a few sites lose good content pages from their index "allocation" because hundreds of non-existent custom 404 pages were taking up the space instead. Google doesn't seem to be able to weed these out if they are linked to from inside and outside the site.
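The misconfiguration described above, a "soft 404", can be demonstrated with a minimal sketch: a toy local server that serves a custom "Page not found" body but answers with status 200. The handler and URL here are illustrative assumptions, not anyone's real setup; the point is that a crawler sees the 200 status, not the error wording, and may index the error page as real content:

```python
import http.server
import threading
import urllib.request

class Soft404Handler(http.server.BaseHTTPRequestHandler):
    """Serves a 'not found' page for every path -- but with status 200."""
    def do_GET(self):
        body = b"<h1>Page not found</h1>"
        # The bug: this should be self.send_response(404)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), Soft404Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch a URL that should not exist; no HTTPError is raised because
# the server reports success, which is exactly what a crawler sees.
url = f"http://127.0.0.1:{server.server_port}/no-such-page"
status = urllib.request.urlopen(url).status
print(status)  # 200 -- a soft 404
server.shutdown()
```

Checking a deliberately nonsense URL on your own site this way (or just looking at the status line in your browser's network tools) quickly tells you whether your custom error page sends a genuine 404.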
I've not had any experience with true 404s in that number.