b) Use RewriteRule to do the redirect.
c) Use the POST method.
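For option (b), an external redirect in .htaccess might look something like this; the URL pattern, query string, and target path are purely illustrative, not from the original post:

```apache
# Hypothetical sketch: externally redirect an ugly query-string URL
# to a clean URL. The trailing "?" strips the old query string.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^page=widgets$
RewriteRule ^index\.php$ /widgets/? [R=301,L]
```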
Is that how everyone else sees it or am I missing something?
Also, doing client-side validation leaves your form-submission method open to inspection by the less-than-honorable denizens of the Web, and invites spoofed submissions.
I know nothing of your specific application here, but I've never found it necessary or even desirable to get rid of query strings for simple user-option-selection submissions. When the application is more complex, I use POST.
As far as "the URL in the browser no longer reflects the current page," don't confuse URLs with filepaths. The browser does in fact show the current and correct URL, and this URL is as real as it can be. Because mod_rewrite is used (in this proposal) to map the URL to a filepath, and that filepath leads to a script which generates the associated "page," that URL is quite correct, genuine, real, etc. It is no less "accurate" than a non-mod-rewritten request for the URL "example.com/", which a typical server would convert to a filepath request for /path-to-Apache/user/widgetco/www/public/html/index.html
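To illustrate that URL-to-filepath mapping, an internal rewrite (no redirect) in .htaccess might look like the following sketch; the pattern and script name are assumptions for the sake of example:

```apache
# Hypothetical sketch: internally map a clean URL like /widgets/ to a
# script. The browser's address bar keeps showing /widgets/ -- the
# rewrite is purely a server-side URL-to-filepath mapping, so the
# displayed URL remains the "real" one.
RewriteEngine On
RewriteRule ^widgets/$ /index.php?page=widgets [L]
```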
So, I think that what you meant was that since part of the context of the page created by submitting a form is carried in the POST data and is therefore not available to search spiders, a POST URL alone does not carry all the information needed to duplicate the page a client would see after POSTing valid data. (I'm kind of talking to myself here while figuring out what you meant, and maybe this will help other readers.)
Anyway, unless there is some compelling reason that you want search engine spiders to submit forms on your site, I wouldn't do it, and I'd either use the POST method on the form or find a different way (other than a form) to accomplish the goal, in order to make the content spiderable. And yes, I am hinting at possibly using "user-agent-based content-generation" here, sometimes called "cloaking" -- but only if it's truly necessary.
I'm not interested in search engine spiders submitting my forms. I wanted to be rid of the ugly URLs, since I have used .htaccess to allow my site to work with SEO-optimised URLs. I also didn't want users bookmarking the ugly URLs. Furthermore, I thought it would be better for the SEO URLs displayed to users in their browsers to correspond to the current page as much as possible, since they are fairly intuitive.
Sounds like I should be using the POST method for the form and not worrying about URL equivalence.
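Along those lines, a minimal POST form would be something like the following; the action URL and field names are hypothetical:

```html
<!-- Hypothetical sketch: with method="post", the selected options
     travel in the request body rather than the URL, so no ugly
     query string appears in the address bar or in bookmarks. -->
<form action="/widgets/" method="post">
  <select name="sort">
    <option value="price">Price</option>
    <option value="name">Name</option>
  </select>
  <input type="submit" value="Apply">
</form>
```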