Forum Moderators: Robert Charlton & goodroi

Restricted by robots.txt - WMT Crawl Errors

         

seoN00B

7:06 am on Jun 14, 2010 (gmt 0)

10+ Year Member



Hi SEO Gurus,

I have about 428,623 crawl errors in my Webmaster Tools account.

My question is: do these errors matter, or are they not a factor?

-seoN00B

tedster

7:13 am on Jun 14, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you really intend to disallow crawling of those URLs with robots.txt, then just take the report as FYI. If you want any of those URLs to rank, then you've got some fixing to do.

seoN00B

7:20 am on Jun 14, 2010 (gmt 0)

10+ Year Member



Thanks tedster for the information.

Actually, I haven't done anything to the robots.txt since I started working on this site six months ago.

I was just alarmed by the massive crawl errors.

phranque

8:47 am on Jun 14, 2010 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



what type of errors are those?

seoN00B

9:02 am on Jun 14, 2010 (gmt 0)

10+ Year Member



On the robots.txt side, the error is "URL restricted by robots.txt", which is deliberate. (A stupid n00b question from me.)

The other errors are of the 400, 404 & 500 type.
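For context, "restricted by robots.txt" just means the crawler hit a Disallow rule. A minimal sketch of a deliberate block (the paths below are made up for illustration, not taken from the site in question):

```
# Hypothetical robots.txt: deliberately keep all crawlers out of these sections.
User-agent: *
Disallow: /admin/
Disallow: /search-results/
```

Every URL matched by these rules shows up in the WMT report as "restricted by robots.txt" - expected, and harmless, as long as the block is intentional.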

phranque

9:48 am on Jun 17, 2010 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



these errors will be a factor if the urls are supposed to be valid and especially if you are internally linking to those urls.

you will probably have to look at the access log to understand the 400 errors or the error log to get clues for the 500 errors.
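One way to start on that log digging is an awk one-liner over the access log; this is a sketch, and the log path, sample entries, and combined log format are assumptions, so adjust for your own server:

```shell
# Build a tiny sample access log in combined format (illustration only).
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [14/Jun/2010:07:06:00 +0000] "GET /good-page HTTP/1.1" 200 5120
1.2.3.4 - - [14/Jun/2010:07:06:05 +0000] "GET /bad%request HTTP/1.1" 400 312
5.6.7.8 - - [14/Jun/2010:07:06:09 +0000] "GET /old-page HTTP/1.1" 404 210
5.6.7.8 - - [14/Jun/2010:07:06:12 +0000] "GET /buggy-script HTTP/1.1" 500 178
EOF

# In the combined log format the status code is field 9.
# Tally every 4xx/5xx response so you can see what dominates the report.
awk '$9 >= 400 { count[$9]++ } END { for (c in count) print c, count[c] }' \
    /tmp/sample_access.log | sort
```

Swapping in a real log path and adding `$7` (the requested URL) to the print gives you the actual URLs to fix.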

sometimes a 404 is just a 404 - other times it should be a 200 ("missing" content needs replacement) or 301 (Moved Permanently to a new Location)
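For the "should be a 301" case, a minimal Apache sketch using the mod_alias Redirect directive (the URLs here are hypothetical, purely for illustration):

```
# Hypothetical .htaccess rule: the old URL has moved permanently,
# so answer 301 with the new Location instead of returning a 404.
Redirect 301 /old-page.html /new-page.html
```

Once the redirect is in place, the 404 for the old URL should drop out of the crawl error report on its own after a recrawl.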