--never received that robots.txt in full
---it was a bug in robots.txt parsing
-- perhaps they got a bug on their end
-- contact them with your problem
I don't have one.
---give good chaps a break---
I do, but not on a site that clearly states
User-agent: *
When something gets into the /trap, I whack the good and bad guys on the same level, whether it's Google, Yahoo (in fact, within the last 5 minutes), WORIO, or any other yo-yo.
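For anyone curious how such a trap works, here is a minimal sketch. The path name `/trap`, the handler, and the blocklist are illustrative assumptions, not the poster's actual script: robots.txt disallows the trap path, so any client that requests it anyway has ignored robots.txt and gets its IP blocked from then on.

```python
# Minimal sketch of a robots.txt honeypot trap.
# Names (/trap, handle_request, blocked_ips) are illustrative, not the
# poster's real setup.  robots.txt disallows /trap; a client that fetches
# /trap anyway has ignored robots.txt, so its IP goes on the blocklist.

ROBOTS_TXT = """User-agent: *
Disallow: /trap
"""

blocked_ips = set()

def handle_request(path: str, ip: str) -> str:
    """Return a status line, blocking clients that ever hit the trap."""
    if ip in blocked_ips:
        return "403 Forbidden"
    if path == "/robots.txt":
        return ROBOTS_TXT
    if path.startswith("/trap"):
        blocked_ips.add(ip)      # good and bad guys get whacked alike
        return "403 Forbidden"
    return "200 OK"
```

A well-behaved crawler reads robots.txt first and never touches /trap; a scraper that ignores it trips the trap on its first visit, and every later request from that IP is refused.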
The WORIO spider showed typical scraper behaviour according to my script that tracks them, and it got blocked.
BTW, I've seen it doing so on more than one site; I'm just stating the fact.
Don't get me wrong, I love all of them.