dstiles - 7:50 pm on Jun 28, 2013 (gmt 0)
> The meatbots would be all over it
There are ways of detecting (most?) auto-submission agents, in the same way one detects auto-scrapers.
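For what it's worth, a common pair of heuristics for catching auto-submission agents is a honeypot field (a CSS-hidden input that humans never fill in) plus a minimum time-to-submit check. A minimal sketch, assuming a hypothetical hidden field named `website_hp` and a 2-second floor:

```python
import time

def is_auto_submission(form, rendered_at, now=None):
    """Heuristic bot check for form submissions.

    form        -- dict of submitted field names to values
    rendered_at -- epoch time when the form was served
    """
    now = now if now is not None else time.time()
    # Honeypot: the hidden field is invisible to humans, so any
    # value here almost certainly came from an auto-submitter.
    if form.get("website_hp"):
        return True
    # Timing: humans rarely complete a form in under ~2 seconds.
    if now - rendered_at < 2.0:
        return True
    return False
```

Neither signal is conclusive on its own (a too-eager autofill can trip the honeypot), which is why they are usually combined or scored rather than used as a hard block.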
> negative SEO issue
Agreed. Not sure how to avoid that one.
Domain names / DNS - there are indicators in DNS, and certainly some DNS servers are very suspect. I agree it would take a lot of work, but that is the project's aim: to avoid as much spam as possible. Some DNS servers are "obviously" compromised and could be trapped.
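The trapping step itself is simple once you have the NS records in hand: compare them against a list of nameservers already associated with spam registrations. A minimal sketch, with a purely hypothetical blocklist (the real lookup would come from your resolver or a library such as dnspython):

```python
# Hypothetical blocklist of nameservers seen hosting spam domains.
SUSPECT_NS = {
    "ns1.bulk-registrar.example",
    "ns2.bulk-registrar.example",
}

def looks_suspect(nameservers):
    """Flag a domain whose NS records overlap the known-bad list.

    nameservers -- iterable of NS hostnames, trailing dot optional.
    """
    return any(ns.rstrip(".").lower() in SUSPECT_NS
               for ns in nameservers)
```

A scored approach (suspect NS plus domain age plus registrar reputation) would be more robust than a hard yes/no, but the shape is the same.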
I saw some stats that gave the number of criminal domains registered per day and was very surprised.
seoskunk - make sure to set a unique user-agent string with a URL pointing to the bot's page on the "site", even if the real SE doesn't exist yet. For a new bot there should be at least a minimal policy set out - "We do not sell on" etc.
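The convention is a token, a version, and a "+URL" pointing at the bot's info page, e.g. how Googlebot does it. A minimal sketch, with a hypothetical bot name and URL:

```python
import urllib.request

# Hypothetical UA: unique token, version, and +URL to the bot's policy page.
BOT_UA = "ExampleBot/0.1 (+https://example.com/bot.html)"

def fetch(url, timeout=10):
    """Fetch a URL, identifying ourselves with the bot's user-agent."""
    req = urllib.request.Request(url, headers={"User-Agent": BOT_UA})
    return urllib.request.urlopen(req, timeout=timeout)
```

That page is also where the "We do not sell on" policy belongs, so any webmaster who sees the UA in their logs can check what the bot is and how to block it.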