I haven't seen any case of Google indexing a page that returns 404 or 410, as long as the proper response code is actually returned. Of course, if the page existed and Google indexed it previously, then once the page is removed and starts returning 404/410, it will take some time for Google to drop it from the index.
Adding a robots noindex to the page at the same time you change the response code to 404/410 will not speed up dropping the page from the index, since Google will not bother parsing the page's HTML once it receives a 404/410 response.
The only reason I can see for setting noindex on a page that returns 404/410 is to protect against your own errors: if there is a mistake in the HTTP response handling and the page suddenly starts returning 200 OK, the noindex acts as a fallback that prevents Google from indexing the page.
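As a minimal sketch of that fallback idea, here is a stdlib-only Python handler (the `/removed/` path prefix and `GoneHandler` name are hypothetical, just for illustration). It returns 410 for removed pages, but also sends an `X-Robots-Tag: noindex` header and a `noindex` meta tag in the body, so that even if the status code ever regresses to 200 OK, crawlers are still told not to index the page:

```python
import http.server

class GoneHandler(http.server.BaseHTTPRequestHandler):
    """Hypothetical handler: pages under /removed/ have been deleted."""

    def do_GET(self):
        if self.path.startswith("/removed/"):
            # Body carries a meta noindex as a second layer of protection.
            body = (b"<!doctype html><html><head>"
                    b"<meta name=\"robots\" content=\"noindex\">"
                    b"</head><body><p>This page has been removed.</p>"
                    b"</body></html>")
            self.send_response(410)  # Gone: the primary signal to crawlers
            # Fallback: even if the status code above is ever broken back
            # to 200 OK, this header still tells Google not to index.
            self.send_header("X-Robots-Tag", "noindex")
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            body = b"<!doctype html><p>Hello</p>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

if __name__ == "__main__":
    http.server.HTTPServer(("127.0.0.1", 8000), GoneHandler).serve_forever()
```

The same layered approach applies to any stack: the status code is the main signal, and the `X-Robots-Tag` header or meta tag is redundant insurance that only matters if the status code is wrong.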