1script - 8:54 pm on Jul 30, 2010 (gmt 0)
I think you should leave robots.txt alone for now :) Using the current political speak, it would be the "nuclear option". In your particular case you'd need to not only remove the links to "coloring" and "habitat" where there is no content defined, but also find a way in your CMS to return 404 if the "coloring" or "habitat" page is still requested directly, even though there is no longer a link to it.
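Just to illustrate the CMS side of it, here's a rough sketch in Python (the route, the SECTIONS lookup and the URLs are all made up, since I have no idea what CMS you're actually running) of a handler that only serves a "coloring"/"habitat" sub-page when there's actually something to show, and returns 404 otherwise:

# Hypothetical sketch only -- your real CMS will look nothing like this,
# but the principle (no content => 404/410, not an empty page) carries over.
from flask import Flask, abort

app = Flask(__name__)

# Stand-in for your database: only sections that actually have text are listed.
SECTIONS = {
    ("zebra", "coloring"): "Black and white stripes...",
    ("zebra", "habitat"): "African savanna...",
    # ("platypus", "coloring") deliberately missing -> should 404
}

@app.route("/<animal>/<section>/")
def animal_section(animal, section):
    text = SECTIONS.get((animal, section))
    if text is None:
        # No content defined, so return 404 even when the URL is hit directly.
        # Swap in 410 if you want to tell Google "gone for good".
        abort(404)
    return text

With that in place, /zebra/coloring/ returns the text, while /platypus/coloring/ returns a proper 404 status instead of a blank page, which is what Gbot needs to see.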
In fact, if you wanted those "empty" pages removed quicker, you would leave the links to them in place so Gbot is more certain to visit them, get the 404, and learn that they don't exist (for the time being). However, to make the whole thing even more convoluted, it looks from another thread here [webmasterworld.com] like having too many internal bad links (to non-existent pages returning a 404 HTTP code) can damage your "quality score" and further lower your rankings, so maybe you just remove the links and hope Google will eventually come back for the non-existent "coloring" pages and learn that they do not, in fact, exist.
Does anyone have a better idea about how to SAFELY "nudge" Googlebot, or "speed up" if you will, the re-discovery of the pages you intend to return 404 (or 410, for that matter) on?
From personal experience, they don't seem to like 404s in the sitemap either...
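If it helps, this is roughly how I sanity-check for that (a Python sketch, assuming the requests library and a standard sitemap.xml; the sitemap URL is just an example): it fetches every <loc> entry in the sitemap and prints any that don't come back 200, so the dead ones can be pulled out before Google crawls them.

# Rough sanity check: flag sitemap entries that no longer return 200.
# The sitemap URL below is only an example.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(status, url)  # candidates to drop from the sitemap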