I have been checking our Webmaster Tools over the past week, drilling into the URLs causing crawl errors: 404s, unreachable, and so on.
A menu present on many pages left out a leading "/", so the links browsers followed included the current working path, resulting in thousands of 404s. The site in question has around 2,000 pages of content, and the broken links caused some 5,000 "not founds".
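To illustrate the mechanism: a link without a leading slash is path-relative, so the browser resolves it against the current page's directory, producing a different (and here, non-existent) URL on every path. A minimal sketch using Python's standard `urljoin` (the URLs below are hypothetical placeholders, not the actual site):

```python
from urllib.parse import urljoin

# Hypothetical current page deep in the site's path hierarchy.
current_page = "https://example.com/blog/2023/post.html"

# A menu link missing the leading "/" is path-relative:
broken = urljoin(current_page, "products/page.html")
# A root-relative link with the leading "/" resolves the same everywhere:
fixed = urljoin(current_page, "/products/page.html")

print(broken)  # https://example.com/blog/2023/products/page.html
print(fixed)   # https://example.com/products/page.html
```

Because the broken form resolves differently under every directory, a single bad menu link can generate far more unique 404 URLs than the site has pages, which matches the 5,000 "not founds" on a 2,000-page site.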
Would the above affect our rankings? Is there an "acceptable" amount of 404s that Google will tolerate?
Also, once the pages are updated with the correct links, how long will it take Google to remove the 404 URLs from the crawl errors list?
I've seen a few sites lose good content pages from their index 'allocation' because hundreds of non-existent custom 404 pages were taking up the space instead. Google doesn't seem able to weed these out if they are linked to from inside and outside the site.
I've not had any experience with true 404s in that number.
The header response is 404; I remember setting this header directive in the site code where necessary. The fact that WMT is showing the pages as 404s would indicate that it is set up correctly.
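One way to double-check this, independent of WMT: request a known-missing URL and confirm the server returns a true 404 status rather than a "soft 404" (a 200 with an error page). A hedged sketch below, which spins up a throwaway local server so it runs anywhere; against the real site you would point `status_of` at an actual missing URL instead:

```python
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler
from urllib.request import urlopen
from urllib.error import HTTPError

class Handler(BaseHTTPRequestHandler):
    """Toy server standing in for the real site: only "/" exists."""
    def do_GET(self):
        if self.path == "/":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"home")
        else:
            self.send_response(404)  # a true 404 status, not a soft 404
            self.end_headers()
            self.wfile.write(b"not found")

    def log_message(self, *args):
        pass  # silence per-request logging

def status_of(url):
    """Return the HTTP status code, treating 4xx/5xx as data, not errors."""
    try:
        return urlopen(url).getcode()
    except HTTPError as e:
        return e.code

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

print(status_of(base + "/"))         # 200
print(status_of(base + "/missing"))  # 404
```

If a missing URL comes back 200, Google treats the page as real content, which is how the "custom 404 pages taking up index space" problem described above arises.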
My thinking was more that Google might apply some sort of site-wide factor in its ranking algorithm based on the number of broken links / 404s on a site.