JD_Toims - 12:55 am on Aug 1, 2013 (gmt 0) [edited by: JD_Toims at 1:28 am (utc) on Aug 1, 2013]
Adding a robots noindex to a page at the same time the response code is changed to 404/410 will not speed up dropping the page from the index, since Google will not bother with the page's HTML once it receives a 404/410 response.
I'm sorry, but that's incorrect.
Google often continues to index a page after receiving a 404 response from it, until the algo decides to treat it as "not there" anymore. In other words, they err on the side of caution, because "not found" is not definitive and does not mean "gone".
A 404 can be caused by a number of conditions, including a page being uploaded at the moment they spider it: FTP software usually deletes the remote copy and then uploads the new one, and on occasion a page gets deleted and, for some reason, the upload doesn't complete before a bot hits the URL. There have been *many* reports here over the years of Google not dropping 404 pages from their index at all quickly, but in my experience a noindex on the 404 page served at the URL solves that issue.
If you don't believe me, check your favorite SE for the many threads here over the years about how slow Google can be to drop 404 pages from their index. Eventually they do, but if the URL is noindexed when they receive the 404, they drop it nearly immediately. (It's possible they've changed something since I last tested it, but a 404 + noindex has worked well for me to get pages dropped sooner.)
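To make that concrete, here's a minimal sketch of serving a 404 whose HTML carries a robots noindex, using Python's standard-library http.server. The LIVE_PAGES table and the page contents are placeholders made up for illustration, and the X-Robots-Tag header is an extra belt-and-braces measure on top of the meta tag discussed above, not something from this thread.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of pages that still exist; anything else 404s.
LIVE_PAGES = {
    "/": b"<html><body>Home</body></html>",
}

# The 404 body carries the noindex directive in its <head>, so the
# crawler is told "drop this URL" instead of deciding on its own.
NOT_FOUND_BODY = (
    b"<!DOCTYPE html><html><head>"
    b'<meta name="robots" content="noindex">'
    b"<title>Not Found</title></head>"
    b"<body><p>Sorry, that page no longer exists.</p></body></html>"
)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = LIVE_PAGES.get(self.path)
        if body is not None:
            self.send_response(200)
        else:
            body = NOT_FOUND_BODY
            self.send_response(404)
            # Same directive as a response header, for crawlers that
            # honor X-Robots-Tag without parsing the 404 body.
            self.send_header("X-Robots-Tag", "noindex")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), Handler).serve_forever()
```

In practice you wouldn't hand-roll a server like this; you'd point your existing server at a custom 404 document (e.g. Apache's ErrorDocument directive) and make sure the noindex meta tag sits in that document's head.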
That's why some of us do it and recommend it.