Hi dogweather,
I take your assertion in the post to be that it's a good idea to test before you deploy; that testing should in fact be an automated part of any reasonable software development process; and further, that development should start with a test proving that what you want works, followed by the code to do it.
I'm with you so far. It's an admirable goal.
But I have been managing software development teams since my 13-year-old son was -13, and we have had some variant of this notion in our heads the whole time. In my experience, it's common sense, but not common practice.
For example, if I had tests for all of the conditions in the rewrites for all of the pages on my site, I would have full coverage.
But doing this would negate the value of a fine logic machine attached to the top of my neck that tells me I could do better things with my time than creating tests for things "I already know" will work. The problem with the thing on the top of my neck is that it's not a perfect logic machine, and occasionally makes assumptions that are wrong. (OK, much of the time)
So in looking at your site, I like the concept, but I think good software development frameworks like Ruby on Rails have this concept baked into their very cores (others as well). Tests come nearly for free, are written in the same language as the code, and get run automatically, and ... even then, few developers actually write tests without having had the hard lesson you had.
(And instead of learning the lesson, I have had several developers say, I kid you not, "I am going to write a tool that writes tests for me automatically", which is simply an amusing anecdote revealing how deluded and grandiose we engineers can be.)
(several decades of having been beaten down by reality ... sorry.)
But what would be cool in your tool is if we could more simply declare a few example URLs to run, without yet saying what should happen, for example:
http://example.com
Rather than a strict TDD approach, really clear output saying what actually happened when the test was run could give me a simple second chance to say "Yep, that's what I expected", or not.
For example, if your tool told me that I got a single 301 redirect to http://www.example.com/ in 10ms, that would be good and I could bless the result -- your tool would then record the details as a saved test. Now you have a simple way of building a test suite.
Better yet, write a Firefox plugin or Greasemonkey script that lets me do that whole process with one click on a URL I just submitted.
Oh, and I should be able to invoke my tests from whatever environment I happened to be working in, getting a response that could be integrated into normal compile/build output.
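By "integrated" I mean something as dumb as an exit code: any mismatch fails the run, so make or whatever build tool I'm using fails with it. A minimal sketch, again with made-up names and a made-up file layout:

```python
# Sketch of running a suite of blessed expectations so the exit code feeds
# straight into a build: zero on success, nonzero on any failure. The data
# shapes here are assumptions, not your tool's real format.
import sys


def run_suite(observed_results, expectations):
    """Compare each observed result against its saved expectation.
    Returns a list of (url, observed) pairs that did not match."""
    failures = []
    for observed, expected in zip(observed_results, expectations):
        if any(observed.get(k) != v for k, v in expected.items()):
            failures.append((expected.get("url"), observed))
    return failures


def main(observed_results, expectations):
    failures = run_suite(observed_results, expectations)
    for url, got in failures:
        print(f"FAIL {url}: got {got}")  # one line per failure, grep-friendly
    print(f"{len(expectations) - len(failures)}/{len(expectations)} passed")
    return 1 if failures else 0  # pass to sys.exit() in a real script
```

Wire `sys.exit(main(...))` into the build and the test suite gates deploys for free.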
Now that would be cool. For me (and 26 years of development teams), your tool as it stands is a few steps too many.
But on the right track!
That's my feedback.
(Except for one other thing: my first reaction was that you were using the WebmasterWorld forums to promote your product. I talked myself out of that idea. But, believe it or not, I am a lot less cynical than some others here.)
So welcome to the embittered, curmudgeonly WebmasterWorld :-)
Tom