How do you get Google not to crawl pages banned in robots.txt?
This is what I have now, but according to Webmaster Tools these pages are not being treated as blocked. They are search pages, one per section. They already have noindex, nofollow on them, but I would rather they not be crawled at all. Is there something wrong with my robots.txt syntax? Here is what I have at the end of my robots.txt:
User-agent: *
Disallow: /folder1/Search.asp
Disallow: /folder2/Search.asp
Disallow: /folder3/Search.asp
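For what it's worth, the rules do parse the way I expect when I check them with Python's urllib.robotparser (example.com is just a stand-in for my domain):

```python
from urllib.robotparser import RobotFileParser

# The rules from the end of my robots.txt
rules = """\
User-agent: *
Disallow: /folder1/Search.asp
Disallow: /folder2/Search.asp
Disallow: /folder3/Search.asp
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The search pages come back as blocked, other pages as allowed
print(rp.can_fetch("Googlebot", "http://example.com/folder1/Search.asp"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/folder1/index.asp"))   # True
```

As I understand it, Disallow matches by path prefix, so variants like /folder1/Search.asp?q=... should be covered by these rules as well.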
Thanks
Vortech