We run a large e-commerce site that has been online for 11 years. Around 18 months ago we took a big drop in our Google rankings; the details of this aren't really relevant here, but we have been working extremely hard to fix it and are getting close to submitting a reconsideration request (actually our third in 18 months; we aren't 100% sure what the problem was, but we now believe there aren't any issues with the site).
Because the site is old, we continually get old links appearing (mainly from external sites), and dealing with these is an ongoing job. Obviously we want everything to be squeaky clean before submitting the request, so my question is this: do the Not Found crawl errors reflect badly on us? And if so, is it worth fixing as many as possible and waiting for them to drop out of GWT before sending the request, or should we just send the request anyway?
Obviously we will fix as many as possible regardless, but should we delay the request to get them down to a minimum first?
To give you an idea of scale, we currently submit 4,059 pages in our sitemap, and GWT reports 2,435 of these in the index. GWT is currently reporting 64 Not Found errors.
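For what it's worth, we check the sitemap URLs ourselves rather than waiting for GWT to flag them, with a small script along these lines (a rough sketch using only the Python standard library; the sitemap URL is a placeholder, and the external dead links obviously still have to come from the GWT crawl errors report):

```python
# Rough sketch: fetch a sitemap and report which of its URLs return 404.
# SITEMAP_URL is a placeholder and would need swapping for the real one.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical URL
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_url):
    """Yield every <loc> URL listed in the sitemap."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    for loc in tree.iter(SITEMAP_NS + "loc"):
        yield loc.text.strip()

def status_of(url):
    """Return the HTTP status code for a HEAD request to url."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses raise, so report the code

if __name__ == "__main__":
    not_found = [u for u in sitemap_urls(SITEMAP_URL) if status_of(u) == 404]
    print(f"{len(not_found)} Not Found URLs:")
    for u in not_found:
        print(u)
```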
Thanks in advance for any input :0)