tedster - 3:06 pm on Nov 22, 2012 (gmt 0)
"Close this block" is not a directive to search engines, as far as I know. Instead, it is a third party application used by, among others, Microsoft to help site builders.
If you don't want a URL crawled you still need a robots.txt disallow rule.
If you don't want a URL indexed you still need a robots meta tag noindex rule.
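As a minimal sketch of those two rules (the /private/ path and page are just example names):

```
# robots.txt - blocks crawling of matching URLs
User-agent: *
Disallow: /private/
```

```
<!-- In the <head> of a page you want kept out of the index -->
<meta name="robots" content="noindex">
```

Note these do two different things: Disallow stops the crawl, while the meta tag stops indexing.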
It does get just a bit more complicated: if you disallow crawling, then the meta tag never gets crawled or seen. So some Google SERPs (usually longer tail) might still show a link for the URL, with an explanation that the actual URL was restricted by robots.txt.
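So if the goal is to keep a URL out of the index entirely, the usual approach is to let the bot crawl it and see the noindex tag - i.e. do NOT also disallow it in robots.txt. A sketch (example path only):

```
# robots.txt - no Disallow for /old-page.html, so the bot can
# fetch the page and read its meta robots tag
User-agent: *
Disallow:
```

```
<!-- /old-page.html: crawlable, but excluded from the index -->
<meta name="robots" content="noindex">
```

Once the page has dropped out of the index, some people then add a Disallow to save crawl budget, but doing it first defeats the noindex.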