aristotle - 6:02 pm on Nov 9, 2012 (gmt 0)
I understand all of that. But in this case I deliberately used both robots.txt and the noindex tag. I used robots.txt to block Googlebot because I don't want it to see what's on these pages at all (they contain content that duplicates content on other pages). But I also added the noindex tag as extra insurance, in case the robots.txt file ever got accidentally deleted or corrupted.
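For what it's worth, the two pieces I'm describing look roughly like this (the /duplicate-pages/ path is just a made-up example, not my actual URL):

```
# robots.txt — keeps Googlebot from crawling these pages
User-agent: Googlebot
Disallow: /duplicate-pages/
```

and then on each of those pages, the backup tag in the <head>:

```html
<meta name="robots" content="noindex">
```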