Robert_Charlton - 9:41 am on Oct 5, 2012 (gmt 0) [edited by: Robert_Charlton at 8:28 pm (utc) on Oct 8, 2012]
Does this mean that Google still indexes pages that have been disallowed by the robots.txt file?
g1smd's post covers it, but here's a long recent thread that also goes into many nuances and nearly every possible misinterpretation. We've discussed the topic here many times.
Pages are indexed even after blocking in robots.txt
Read the above thread, which is still open. You can post there if you have further questions, but please... let's not drive this thread, about algo updates, off topic by getting into robots.txt vs noindex once again.
Google has simply added the message noted in an attempt to clarify what's an oddly paradoxical-sounding situation. It's not a search change... just an interface enhancement.
PS: londrum, you've got robots.txt vs noindex roughly backwards. Read the robots.txt thread cited above, which covers noindex as well. It's a confusing set of protocols with regard to search engine indexing.
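To illustrate the distinction (a minimal sketch, with hypothetical paths, not taken from the thread): robots.txt Disallow blocks crawling, but a blocked URL can still be indexed from external links; noindex removes a page from the index, but Googlebot must be able to crawl the page to see the tag, so the two directives work against each other if combined.

```
# robots.txt — blocks crawling only; the URL may still appear in
# results (indexed from links) with a "blocked by robots.txt" note
User-agent: *
Disallow: /private/

<!-- meta robots noindex — keeps the page OUT of the index, but only
     works if the page is crawlable, i.e. NOT disallowed above -->
<meta name="robots" content="noindex">
```

In short: to keep a page out of the index, allow it to be crawled and use noindex; disallowing it in robots.txt hides the tag from the crawler.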