lucy24 - 11:15 pm on Sep 5, 2012 (gmt 0)
But you have to concede that g### itself willfully contributes to the misunderstanding. Consider the page-removal section of WMT:
To remove a page or image, you must do one of the following:
* Make sure the content is no longer live on the web. Requests for the page must return an HTTP 404 (not found) or 410 status code.
* Block the content using a robots.txt file.
* Block the content using a meta noindex tag.
To remove a directory and its contents, or your whole site, you must ensure that the pages you want to remove have been blocked using a robots.txt file. Returning a 404 isn't enough, because it's possible for a directory to return a 404 status code, but still serve out files underneath it. Using robots.txt to block a directory ensures that all of its children are disallowed as well.
Content removed with this tool will be excluded from the Google index for a minimum of 90 days.
Would not a person of ordinary intelligence interpret this to mean that a file in a roboted-out directory will stay out of the index, once removed?
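If it helps to see the directory rule in action, here's a minimal sketch using Python's urllib.robotparser (the domain and paths are made up for illustration): a single Disallow on the directory keeps a compliant crawler out of every file underneath it, which is exactly why the removal tool asks for the directory to be blocked rather than 404'd.

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",  # block the directory, not the individual files
])

# The directory and everything underneath it are disallowed:
print(rp.can_fetch("*", "https://www.example.com/private/"))           # False
print(rp.can_fetch("*", "https://www.example.com/private/file.html"))  # False
print(rp.can_fetch("*", "https://www.example.com/public/file.html"))   # True

Note that nothing in that check says anything about indexing. It only answers "may I crawl this?", which is the gap the wording above glosses over.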