(I think "cloaking" is as good a place as any for this discussion.)
Here are some reasons I can think of:
1 - I posted a script here about a month ago that 'forged' its user_agent to see if other sites were using user_agent-based cloaking. (I wonder if somebody is using it on you! LOL)
2 - If somebody wrote a nasty little email-sucking spider that ignored robots.txt and did all kinds of ugly things, I bet they'd want to pretend to be some big spider so as not to alert you. They don't usually call themselves 'Spambot version 0.8', you know.
3 - How do you know it's not Google? Maybe they have a new block of IPs you don't know about?
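For anyone curious how trivial point 1 is: forging a user_agent is a one-liner in just about any HTTP library. Here's a rough sketch in Python (the URL and the exact user_agent strings are just placeholders, not from my original script):

```python
import urllib.request

# Hypothetical target -- substitute the page you want to test.
URL = "http://example.com/"

def fetch_as(url, user_agent):
    """Fetch a page while claiming to be the given user agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read()

# Compare what the site serves a browser vs. what it serves "Googlebot".
# If the two responses differ, the site is probably cloaking on user_agent.
# browser_page = fetch_as(URL, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
# bot_page = fetch_as(URL, "Googlebot/2.1 (+http://www.google.com/bot.html)")
# print("Cloaked!" if browser_page != bot_page else "Same page for both.")
```

The fetches are commented out so nobody runs it against a live site by accident, but that's really all there is to it.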
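On point 3, the answer isn't an IP list at all: Google's published guidance is to do a reverse DNS lookup on the crawler's IP, check that the name falls under googlebot.com or google.com, then do a forward lookup to confirm the name resolves back to the same IP. A sketch of that double-lookup check (the domain suffixes follow Google's guidance; everything else is my own illustration):

```python
import socket

def is_google_host(hostname):
    """True if the reverse-DNS name is under Google's crawler domains."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def verify_googlebot(ip):
    """Reverse-DNS the IP, check the domain, then forward-confirm it."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)   # reverse lookup
    except socket.herror:
        return False
    if not is_google_host(hostname):
        return False
    # Forward-confirm: the name must resolve back to the same IP.
    # Without this step, anyone who controls their own reverse DNS
    # could simply point it at a googlebot.com name.
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

So even if Google brings up a new block of IPs tomorrow, this check keeps working, while a spambot faking the Googlebot user_agent fails it.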