What? You don't want MSNbot to crawl the site? ;-)
You also forgot about all the scraper & exploit bots that will claim to be common web browsers. There are also a few "broken protocol" proxy and caching services/servers.
You want to accomplish a task that has no easy answer. There are a lot of trash bots on the Internet scraping content, harvesting email addresses, scanning for exploits and inflating traffic reports. The best you can do is make a few server adjustments and set aside time to deal with the most offensive sources.
Look at what Brett experienced when he wanted to vent his wrath on the misbehaving bots that were slamming WebmasterWorld. You have to see him tell the story. The exasperation mixed with extreme annoyance slowly creeps into his face and voice as he describes the options they tried and the aggravating results.
You could force cookies, but a lot of anti-spyware & anti-phishing software will automatically delete or refuse to accept a cookie.
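To make the cookie idea concrete, here's a minimal sketch of what a cookie "gate" does: the first request gets a cookie and a redirect back to the same URL, and anything that never returns the cookie (many bots, and the privacy tools mentioned above) just keeps getting redirected. The function and cookie name are hypothetical, purely for illustration:

```python
# Hypothetical cookie gate. "gate=1" is an illustrative cookie name,
# not any standard. Clients that never return the cookie loop on the
# redirect instead of reaching content.

def cookie_gate(headers):
    """Return (status, extra response headers) for an incoming request.

    `headers` is a dict of request headers.
    """
    cookies = headers.get("Cookie", "")
    if "gate=1" in cookies:
        # Cookie came back: serve the page normally.
        return ("200 OK", {})
    # No cookie yet: set one and bounce the client back to the same URL.
    return ("302 Found", {"Set-Cookie": "gate=1; Path=/",
                          "Location": "/"})

print(cookie_gate({})[0])                    # first visit: redirect
print(cookie_gate({"Cookie": "gate=1"})[0])  # cookie returned: page served
```

The catch is exactly the one above: software that silently deletes cookies turns this loop into a dead end for real visitors, not just bots.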
You could use Flash and force users to load the plugin and navigate through Flash menus. But then you have to hope that visitors have Flash installed and that your Flash programming is compatible with the widest range of player versions. You also have to build your Flash objects so Google can "read" the text and navigate the site. A pure Flash site is practically a guarantee of poor search results.
You could also set up a small sandbox script or hidden DIV that identifies rogue bots and redirects them or forces a 403. I've seen samples of scripts that capture the IP addresses of rogue bots and then build a DENY list. The problem is that you will develop an astounding list of subnets in .htaccess that will eventually degrade server performance.
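One way to keep that deny list from ballooning is to collapse trapped IPs into covering network blocks instead of adding one line per address. A hedged sketch of the idea (the /24 rollup and the `Deny from` output format are my assumptions, not anything from a particular script):

```python
import ipaddress

# Sketch of the honeypot follow-up step: IPs that hit the trap URL get
# collected somewhere, then rolled up into CIDR blocks so the deny list
# stays short instead of growing one .htaccess line per IP.
# Rolling up to /24 is an assumption for illustration.

def build_deny_list(trapped_ips):
    """Collapse trapped IPs into the fewest covering /24 networks."""
    nets = {ipaddress.ip_network(ip + "/24", strict=False)
            for ip in trapped_ips}
    collapsed = ipaddress.collapse_addresses(sorted(nets))
    return ["Deny from " + str(net) for net in collapsed]

trapped = ["203.0.113.5", "203.0.113.77", "198.51.100.9"]
for line in build_deny_list(trapped):
    print(line)
# Two trap hits in 203.0.113.x become a single "Deny from 203.0.113.0/24"
```

The tradeoff is the obvious one: blocking a whole /24 for a couple of bad IPs can shut out legitimate neighbors on the same subnet, which is part of why this approach turns into the maintenance headache described above.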