
Google and robots.txt



5:43 am on Apr 3, 2007 (gmt 0)

5+ Year Member

How do you get Google not to crawl pages that are banned in robots.txt?
This is what I have now, but according to Webmaster Tools, Google is not seeing these pages as banned. These are search pages, one for each section; they do have noindex, nofollow on them, but I would rather not have them crawled at all. Is there something wrong with my robots.txt syntax? Here is what I have at the end of my robots.txt:

User-agent: *
Disallow: /folder1/Search.asp
Disallow: /folder2/Search.asp
Disallow: /folder3/Search.asp




3:16 pm on Apr 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

You can add a noindex meta tag as well. Some people report that the double message gets through.
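For reference, a minimal sketch of that tag as it might sit in each search page's head (the robots meta tag with noindex, nofollow is the standard form; which pages to put it on is your call):

<head>
<!-- tells compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
</head>

Keep in mind a crawler can only see that tag on pages it is allowed to fetch, which may be part of why people report mixed results when combining it with a robots.txt block.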
