I'm new here. I found this forum through a Google search that led me to a topic discussing this:
How Do Search Engine Robots Work?
That topic was quite old, so I read all the replies, but here's why I was searching about robots in the first place.
I am developing my own CMS, and I've been trying to figure out how to determine, at runtime, whether a visitor to a website is human or not.
The reason: so I can store the IP, request time, user agent, etc. in my database and provide analytics for the users of my CMS.
Granted, there are programs out there that do this already... but I am stubborn and want to do it myself.
Now, I could write a program to parse the access logs (and yes, I know there are programs that do that too; see the paragraph above :)
I wrote some code to inspect the user agent for keywords from the database like bot, spider, crawl, etc. I also figured that since robots don't accept cookies, I could simply check whether the visitor sent back the PHP session cookie; if they did, I'd be done and could store the visitor info in the DB.
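The user-agent check above could look something like this minimal sketch (in Python rather than PHP, just for illustration; the keyword list and the `looks_like_bot` name are made up, and in the real CMS the keywords would come from the database):

```python
# Hypothetical keyword list; in the CMS this would be loaded from the DB.
BOT_KEYWORDS = ["bot", "spider", "crawl", "slurp", "archiver"]

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the user-agent string contains a known bot keyword."""
    ua = user_agent.lower()
    return any(keyword in ua for keyword in BOT_KEYWORDS)

print(looks_like_bot("Googlebot/2.1 (+http://www.google.com/bot.html)"))  # True
print(looks_like_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))        # False
```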
I think I figured out my own answer: when you start a session in PHP, the cookie isn't set until the page is sent to the client. So you can't tell whether the first request is from a robot based on the missing session cookie, because it will be missing on every first-time visitor's request, human or not!
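In other words, the cookie heuristic can only classify requests after the first one. A rough sketch of that logic, assuming a cookie name of `PHPSESSID` (the PHP default) and with `classify_visitor` as a made-up name:

```python
def classify_visitor(request_cookies: dict,
                     session_cookie_name: str = "PHPSESSID") -> str:
    """Classify a request based on whether the session cookie came back.

    A missing cookie on the first request proves nothing: no client has
    the cookie yet at that point. Only on a later request does its absence
    suggest a client that rejects cookies (likely a robot).
    """
    if session_cookie_name in request_cookies:
        return "human-like"   # client accepted and returned the cookie
    return "inconclusive"     # could be a robot, or just a first visit

print(classify_visitor({}))                        # inconclusive
print(classify_visitor({"PHPSESSID": "abc123"}))   # human-like
```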