Google, Alta, and MSN all three requested/suggested that I take it offline. I can't say I blame them, because rogue spiders are one of the biggest problems with popular sites right now. Since I was unable to get search engine endorsement of said tool without paying approximately $0.10-$0.50 per search, I had to take it offline. Otherwise, it would have had to become a pay service.
SEs are cracking down on most of these utils, and you'll notice there has been a wave of rogue metasearch engines and link checkers taken down over the last couple of months.
On the other hand, Goto gave me full permission for the Goto keyword interrogator. Not quite a full endorsement, but a virtual handshake and an "ok for now".
In other tool news: I put up a new HTML validator [searchengineworld.com] based on a similar core to the one the W3C themselves use. I'm using SP/NSGMLS by James Clark, while they are using a modified form of it from the WDG (I believe). I am using the W3C's check script to drive NSGMLS. It is so nice to have one handy like that. I will be doing some more work on it.
Also put up a webpage size checker [searchengineworld.com] with an HTML-to-text ratio.
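For the curious, the basic idea behind an HTML-to-text ratio check can be sketched in a few lines of Python. This is a generic illustration, not my actual code; it just strips markup with the standard library's HTMLParser and compares character counts:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text only, skipping <script>/<style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def html_to_text_ratio(html):
    """Return (html_chars, text_chars, text/html ratio) for a page."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    html_len = len(html)
    text_len = len(text)
    return html_len, text_len, (text_len / html_len if html_len else 0.0)

page = "<html><head><style>b{color:red}</style></head><body><b>Hello</b> world</body></html>"
print(html_to_text_ratio(page))
```

A low ratio suggests a page that is mostly markup with little actual content, which is the sort of thing the checker is meant to flag.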
Other offline stuff I have lying around that is getting converted: a robots.txt checker, a link checker, and a new meta keyword suggester with some unique twists.
As for code - the old problem with my code is that my library routines worm their way into everything I write. They are highly customized, and it would take forever for me to convert things into generic, run-anywhere scripts.
Thanks for Noticing.