> Do I need to have my domain name on the right side of the first two rules that are redirects?
Yes, you should, in order to prevent problems if the ServerName is configured as "example.com" instead of "www.example.com" and UseCanonicalName is set to "on" -- or in case your host later changes to this configuration.
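For illustration, the problematic server configuration would look something like this (a hypothetical main-server config, not something you'd put in .htaccess):

```apache
# Hypothetical server config: with these settings, any redirect target
# the server builds for itself uses the bare domain, not the www host
ServerName example.com
UseCanonicalName on
```

With an explicit "http://www.example.com/" on the right side of the rule, the redirect target no longer depends on how ServerName is configured.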
> I saw in another thread that -f is slow. Should I instead be listing the 10 or so pages like so: RewriteCond $1 ^(page1|page2|etc)$ in the fourth rule?
Only if your list of URL-paths is fixed and not likely to ever change. Otherwise, it becomes a maintenance headache -- not worth it, and you should "just let the server do the work."
However, consider that the script is probably not set up to serve anything but "pages." So you can probably exclude non-page objects from that rule and eliminate the majority of unnecessary disk checks when the requested object is NOT something that your script can generate. Good candidates for exclusion are images, CSS files, JavaScript files, XML documents, multimedia files, and text documents (pdf, xls, doc, txt, etc.)
Since the majority of objects fetched from most servers are images, excluding just a few filetypes can eliminate a lot of unnecessary file checks.
You may also be able to exclude requests for all .php filetypes as well.
Adding
RewriteCond $2 !\.(php|gif|jpe?g|png|ico|css|js|xml|swf|flv|pdf|xls|doc|txt)$
above the "-f" RewriteCond would take care of this.
> my .htaccess could be optimized. Any tips are greatly appreciated.
OK... The "^.*" and ".*$" subpatterns are a waste of time. Just leave the pattern un-anchored to achieve the same thing.
Do not use a RewriteCond to check the requested URL-path if the RewriteRule pattern itself can be used to do this. The RewriteRule pattern is the first thing that mod_rewrite evaluates; if it does not match, then none of the RewriteConds will be processed. For efficiency, the RewriteRule pattern should always be as specific as possible.
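As a sketch of the difference, using the menu rule from this thread as the example:

```apache
# Less efficient: the pattern matches every request, so this RewriteCond
# must be evaluated on every request just to filter by URL-path
RewriteCond %{REQUEST_URI} ^/food/menu-(.+)$
RewriteRule .* menu.php?view=%1 [L]

# Better: the pattern itself rejects non-matching requests immediately,
# and no RewriteCond is needed at all
RewriteRule ^food/menu-(.+)$ menu.php?view=$1 [L]
```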
Use the power of regular expressions to avoid testing the same thing repeatedly -- for example, the leading 'dots' on filetypes or the trailing slashes on directories in this code.
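For example, factor the shared "dot" out of a series of filetype checks:

```apache
# Repetitive: "\." is re-tested in every condition
RewriteCond $1 !\.php$
RewriteCond $1 !\.css$
RewriteCond $1 !\.js$

# Factored: one condition, one shared "\." ahead of the alternation
RewriteCond $1 !\.(php|css|js)$
```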
Do not allow the client to control the initial path. If the substitution would start with a back-reference to any client-controlled path info, then precede it with a slash.
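Using the fourth rule below as the example:

```apache
# Risky: the substitution begins with client-supplied path info
RewriteRule ^([^/]+)/(.*)$ $1.php?view=$2 [QSA,L]

# Safe: a leading slash anchors the substitution to the site root
RewriteRule ^([^/]+)/(.*)$ /$1.php?view=$2 [QSA,L]
```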
Put the excluded URL-paths and filetypes in order from most- to least-frequently-requested for efficiency.
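For example, if images account for most of your traffic, put those extensions first so the regex engine can match (and exclude the request) after testing as few alternatives as possible -- the order here is purely illustrative:

```apache
# Image extensions first, assuming they dominate this site's traffic
RewriteCond $1 !\.(jpg|gif|png|css|js|php|ico|pdf|xml|txt|shtml)$
```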
Tweaking just what's here, you'd get:
RewriteRule ^index\.(html|php)$ http://www.example.com/ [R=301,L]
#
RewriteCond $1 !^(admin|images)/
RewriteCond $1 !\.(php|xml|css|pdf|js|txt|jpg|gif|ico|shtml)$
RewriteRule ^([^/]+)$ http://www.example.com/$1/ [R=301,L]
#
RewriteRule ^food/menu-(.+)$ menu.php?view=$1 [L]
#
RewriteCond $2 !\.(php|gif|jpe?g|png|ico|css|js|xml|swf|flv|pdf|xls|doc|txt)$
RewriteCond %{DOCUMENT_ROOT}/$1\.php -f
RewriteRule ^([^/]+)/(.*)$ /$1.php?view=$2 [QSA,L]
#
RewriteCond $1 !^(admin|images)/
RewriteCond $1 !\.(php|xml|css|pdf|js|txt|jpg|gif|ico|shtml)$
RewriteRule ^(.+)$ article.php?url=$1 [QSA,L]
If you have any 'infinite looping' trouble with your first rule, then exclude internally-rewritten /index.xyz requests and redirect only direct client requests for that URL-path:
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.(html|php)([?#][^\ ]*)?\ HTTP/
RewriteRule ^index\.(html|php)$ http://www.example.com/ [R=301,L]
I left the exclusion-list that I added to your fourth rule exactly as I had it in the preceding discussion just for consistency. You may wish to modify it and/or your own exclusions to one standard form -- again for consistency. :)
The object-type exclusion lists for file-checking do not need to be absolutely comprehensive. Only the most-frequently-requested object types need to be included to get the most benefit; you'll quickly reach a point of diminishing returns if you try to make this list too long.
Jim