shaunm - 11:12 am on Nov 9, 2012 (gmt 0)
I am getting a lot of errors from a directory on my website. The pages in this directory are not indexed, but the errors still show up in the Crawl section of Google Webmaster Tools.
What do I do now?
Can I just go ahead and add 'Disallow: /directoryname/' to my robots.txt file? Would you suggest that? Will it stop Google from accessing that particular directory the next time it comes to crawl? So eventually I won't see those errors anymore, right?
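To be specific, the rule I have in mind would look something like this (with /directoryname/ standing in for the actual directory):

```
User-agent: *
Disallow: /directoryname/
```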