Crawl errors and their impact on a site

olly

11:39 pm on Jul 8, 2009 (gmt 0)

5+ Year Member



Hey everyone

I have been checking our Webmaster Tools over the past week and drilling into the URLs causing crawl errors: 404s, unreachable, etc.

A menu present on many pages left out the leading "/" in its links, so browsers resolved them against the current working path, which resulted in thousands of 404s. The site in question has around 2,000 pages of content, and those links caused about 5,000 "not found" errors.
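For anyone else hitting this, here's roughly what was happening. A minimal sketch using Python's standard urllib.parse (the page and link paths here are made-up examples, not our real URLs):

from urllib.parse import urljoin

# A browser resolves a relative href against the page it appears on.
current_page = "http://example.com/articles/2009/crawl-errors"

# Link missing its leading slash: resolved against the current path.
print(urljoin(current_page, "contact.html"))
# -> http://example.com/articles/2009/contact.html  (a 404 on our site)

# Root-relative link with the leading slash: resolved from the site root.
print(urljoin(current_page, "/contact.html"))
# -> http://example.com/contact.html  (the page we actually meant)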

Would the above affect our rankings? Is there an "acceptable" number of 404s that Google will tolerate?

Also, once you update the pages with the correct links, how long will it take Google to remove the 404 URLs from the crawl errors list?

fishfinger

6:19 am on Jul 9, 2009 (gmt 0)

10+ Year Member



If the header response for your 404s is 200 (i.e. you have a custom 404 page that isn't correctly implemented and returns "OK" instead of a 404 status) then this could be a problem.
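An easy way to test for this is to request a URL that definitely shouldn't exist and check the status code the server sends back. A quick sketch with Python's standard library (the URL is a hypothetical example):

import urllib.request
import urllib.error

# Request a page that should not exist and inspect the status code.
url = "http://example.com/this-page-should-not-exist"

try:
    response = urllib.request.urlopen(url)
    # Reaching here means the server answered 2xx for a missing page:
    # a "soft 404" that crawlers may treat as a real page.
    print("Problem: got", response.getcode(), "for a missing page")
except urllib.error.HTTPError as e:
    # urlopen raises HTTPError for 4xx/5xx responses; 404 is what we want.
    print("OK: server returned", e.code)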

I've seen a few sites that have had good content pages pushed out of their index "allocation" because hundreds of non-existent custom 404 pages were taking up the space instead. Google doesn't seem to be able to weed these out if they are linked to from inside and outside the site.

I've not had any experience with true 404s in that number.

olly

9:44 am on Jul 9, 2009 (gmt 0)

5+ Year Member



Hi fishfinger

The header response is 404; I remember setting that header in the site code where necessary. The fact that WMT is showing the pages as 404s would indicate it's set up correctly.
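In case it helps anyone, the point is just that a friendly error page and a real 404 status aren't mutually exclusive: the page body can be whatever you like as long as the status line says 404. A minimal sketch using Python's standard http.server, where KNOWN_PATHS is a made-up stand-in for however your site actually routes requests:

from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for real routing: only these paths return content.
KNOWN_PATHS = {"/", "/about"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PATHS:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Page content</h1>")
        else:
            # Custom error page, but with the correct 404 status,
            # so crawlers record it as "not found" rather than a page.
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Sorry, that page was not found</h1>")

HTTPServer(("localhost", 8000), Handler).serve_forever()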

My thinking was more that Google might apply some sort of site-wide factor in its ranking algorithm based on the number of broken links / 404s on a site.

 
