For error results, I'd like to confirm that I'm using HTTP status codes correctly.
If the supplied URL will never be loadable, I return 403, hoping that crawlers will never request the page again.
If the page won't load at the moment but might load later, I return 404.
My two questions are:
1) Is 404 a reasonable code when my program crashes (but might work later)? Is there a code that signals a more temporary failure to a search engine?
2) If there's a CGI parameter parsing error (e.g. foo?a=one&b=invalid), is 403 enough to convince search engines to remove the URL from their indexes?
In Perl, I'm currently sending the status like this (the header string has to be printed, and CGI->new is preferred over the indirect "new CGI" syntax):

    use CGI;
    my $q = CGI->new;
    print $q->header(-status => '403 Invalid CGI Parameters');
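For context, here is a fuller, runnable sketch of what I'm doing. It avoids CGI.pm and emits the Status header by hand; the parameter name b and the rule that it must be numeric are just placeholders for my real validation:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Parse a QUERY_STRING like 'a=one&b=invalid' into a hash.
    sub parse_query {
        my ($qs) = @_;
        return map { split /=/, $_, 2 } split /&/, $qs // '';
    }

    # Hypothetical validation rule: parameter 'b' must be numeric.
    # Returns the status line to send for the request.
    sub status_for {
        my (%params) = @_;
        my $b = $params{b} // '';
        return $b =~ /^\d+$/ ? '200 OK' : '403 Invalid CGI Parameters';
    }

    my %params = parse_query($ENV{QUERY_STRING});
    my $status = status_for(%params);
    print "Status: $status\r\n";
    print "Content-Type: text/plain\r\n\r\n";
    print "Invalid CGI parameters\n" if $status ne '200 OK';

So foo?a=one&b=invalid gets "Status: 403 Invalid CGI Parameters", and foo?a=one&b=2 gets "Status: 200 OK".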