TheMadScientist - 11:38 pm on Feb 8, 2010 (gmt 0)
Yeah, what tedster said. Being able to set a custom 404 error page is standard in most hosting accounts, so by creating a single page you can serve a cool site-specific 404 page to the visitors (real people) who request a non-existent URL. IMO it's a good way to do things, and I use them on almost all, if not all, of the sites I work on.
I usually include links to 'important pages' or directories visitors might be looking for. It's a single page, and it can usually be set from within your hosting account.
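For the hosting-account setting mentioned above, on Apache this usually comes down to a one-line ErrorDocument directive in .htaccess (the filename custom-404.html here is just a placeholder for whatever you name your page):

```apache
# Serve /custom-404.html (a path of your choosing) for any missing URL.
# Use a local path, not a full http:// URL -- pointing ErrorDocument at
# a full URL makes Apache redirect, so the visitor gets a 302 instead
# of the 404 status you actually want.
ErrorDocument 404 /custom-404.html
```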
Do make sure you run a header check when using one, to ensure it actually serves a 404 status... IMO the issue may have been prolonged by disallowing the content rather than serving a 404 page, even without the robots meta tag. But IMO it definitely has been prolonged by not serving a 404 page with the meta tag I posted previously, because as soon as compliant bots get the noindex,nofollow,noarchive tag on a page (URL) and that URL is processed, the page is dropped from the results.
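The header check above can be scripted. A minimal sketch, assuming a throwaway local server standing in for your site (in practice you'd request a made-up URL on your own domain and confirm the status line says 404, not 200):

```python
import http.server
import threading
import urllib.error
import urllib.request

# A tiny custom 404 page that also carries the robots meta tag
# discussed above, so compliant bots drop the URL from the results.
NOT_FOUND_BODY = (
    b"<html><head>"
    b'<meta name="robots" content="noindex,nofollow,noarchive">'
    b"</head><body>Sorry, that page does not exist.</body></html>"
)

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the friendly page, but with a real 404 status --
        # not a "soft 404" that returns 200 with an error message.
        self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(NOT_FOUND_BODY)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/no-such-page"

# urlopen raises HTTPError for 4xx responses; the error object still
# carries the status code and the body of the custom page.
try:
    urllib.request.urlopen(url)
    status, body = 200, b""
except urllib.error.HTTPError as e:
    status, body = e.code, e.read()

print(status)  # 404 means the custom page is wired up correctly
server.shutdown()
```

If this prints 200 instead of 404, the custom page is being served as a "soft 404", which is exactly the situation a header check is meant to catch.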