g1smd - 10:03 am on Dec 14, 2012 (gmt 0)
Yes, Xenu's Link Sleuth will take a plain-text file list of URLs, test each one, and report the result. You need to set the scan depth to 1 (I think) so that it doesn't then go on to traverse the whole site (do that on a second, separate test starting at example.com or www.example.com).
The text file list of URLs should include URLs that don't exist, URLs with appended junk, URLs in the wrong case, URLs with malformed parts (e.g. a letter, period, or comma in a part of the URL that should be numeric), and URLs with parameters in various orders, as well as real URLs that do exist. Each URL should be listed in both www and non-www form. Testing a site then becomes a couple of clicks. Keep the text file handy and be sure to add new test URLs to it as you think of them. I have a file with about 800 URLs in it for testing one large site that I occasionally work on.
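A few entries from such a test file might look like this (the example.com domain and paths are placeholders, not from a real site):

```text
http://example.com/
http://www.example.com/
http://www.example.com/page.html
http://example.com/page.html
http://www.example.com/PAGE.HTML
http://www.example.com/page.html?foo=bar&baz=qux
http://www.example.com/page.html?baz=qux&foo=bar
http://www.example.com/page.htmljunk
http://www.example.com/product/12a4
http://www.example.com/does-not-exist
```

Each malformed or non-canonical URL should return the status code you expect (301 to the canonical form, or 404), and the report makes any mismatch obvious.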
For human readability, leave a blank line after every RewriteRule and add a comment before each block of code. I also number each block. Take the situation where I am adding a new redirect/rewrite pair of rules; the matching rewrite will sit a long way down the file from the redirect. I might add the redirect as, say, 2.14, and the matching rewrite then goes in at 3.14. When I want to modify those rules at a later date, the pairing is obvious.
My htaccess files usually have at least 4 major sections for mod_rewrite code.
0.xx - setting things up
1.xx - blocking access for bots and malicious requests
2.xx - redirects
3.xx - rewrites
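A minimal sketch of how those sections and the numbered redirect/rewrite pairing might look in a file (the rules, patterns, and URLs here are illustrative only, not taken from a real site):

```apache
# htaccess for example.com - 2012-12-14

# 0.01 - setting things up
Options +FollowSymLinks
RewriteEngine On

# 1.01 - block an obviously malicious request pattern
RewriteCond %{QUERY_STRING} proc/self/environ [NC]
RewriteRule .* - [F]

# 2.14 - redirect old product URL format to the new one
RewriteRule ^old-product/([0-9]+)$ http://www.example.com/product/$1? [R=301,L]

# 3.14 - internally rewrite the new product URL to the script
RewriteRule ^product/([0-9]+)$ /product.php?id=$1 [L]
```

The 2.14 redirect and 3.14 rewrite are the kind of pair described above: spot one number and you can jump straight to its partner.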
Make sure you add a date and the site name to your htaccess file as a comment. Keep old copies so that you can compare changes. I develop as "htaccess.sitename.01.txt" and increment the number. Once I upload that file it's a simple matter to rename it to .htaccess on the server, and just as easy to roll back one version if necessary.
I can also highly recommend using version control (Subversion, Git, etc) for development.