a browser and a robot are simply two different types of user agents.
let me be more specific: you can certainly design a robot to accept cookies and store them, just as a typical browser would. you could also design that bot to send the stored cookies back in the Cookie header of subsequent requests, which would make the usual server application behave as if it were talking to a cookie-accepting user agent.
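to make that concrete, here's a minimal sketch of such a cookie-aware fetcher using only the python standard library. the tiny local server is just a stand-in for any site that sets cookies (the cookie name `session=abc123` and the handler are invented for the demo, not from any real site):

```python
# a bot that accepts, stores, and replays cookies like a browser would.
import http.cookiejar
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieEchoHandler(BaseHTTPRequestHandler):
    """stand-in server: sets a cookie, and echoes back whatever Cookie header it receives."""
    def do_GET(self):
        received = self.headers.get("Cookie", "")
        body = ("cookie seen: " + received).encode()
        self.send_response(200)
        self.send_header("Set-Cookie", "session=abc123; Path=/")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # keep the demo quiet
        pass

def make_cookie_aware_opener():
    # the CookieJar stores Set-Cookie values and replays them on later
    # requests -- exactly what a "cookie-accepting" browser does
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    return opener, jar

server = HTTPServer(("127.0.0.1", 0), CookieEchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

opener, jar = make_cookie_aware_opener()
first = opener.open(url).read().decode()   # no cookie stored yet
second = opener.open(url).read().decode()  # jar replays the stored cookie
server.shutdown()

print(first)
print(second)
```

on the second request the server sees `Cookie: session=abc123`, so from its point of view this robot is indistinguishable from a browser that accepts cookies.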
the typical search engine robot will not be looking for cookie-dependent text.
if necessary, the server application can be designed to deny cookie-dependent content to robots that spoof cookie-accepting browsers. and thus begins the arms race...