I'm developing a mobile version of a site that will most likely use responsive design and become the only version of the site. As a UI feature I find lazy loading very convenient. In case that isn't the accepted technical term: you load only part of the page; when you scroll close to the page's end, JS loads another piece of content via AJAX and stitches it onto the end of the page. This can be automatic (hence "lazy") or triggered by pressing a "Load More" button; the Googlebot accessibility implications would be the same, I presume.
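To be concrete about what I mean, here is a minimal sketch of the scroll-triggered variant. The `/content?chunk=N` endpoint and the `#content` container are placeholders for illustration, not my actual markup:

```javascript
// Decide whether to fetch more: trigger when the viewport bottom is
// within `threshold` px of the end of the rendered content.
function shouldLoadMore(scrollTop, viewportHeight, contentHeight, threshold) {
  return scrollTop + viewportHeight >= contentHeight - threshold;
}

// Browser-only wiring (guarded so the helper above stays testable elsewhere).
if (typeof window !== "undefined") {
  let nextChunk = 2;   // chunk 1 is in the initial HTML
  let loading = false; // avoid firing several requests per scroll burst
  window.addEventListener("scroll", async () => {
    if (loading) return;
    if (shouldLoadMore(window.scrollY, window.innerHeight,
                       document.body.scrollHeight, 300)) {
      loading = true;
      // Hypothetical endpoint returning an HTML fragment for the next chunk.
      const html = await fetch(`/content?chunk=${nextChunk}`).then(r => r.text());
      document.getElementById("content").insertAdjacentHTML("beforeend", html);
      nextChunk += 1;
      loading = false;
    }
  });
}
```

The "Load More" button version would just call the same fetch-and-append step from a click handler instead of the scroll listener.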
I have some monstrously long pages, and breaking them into several separately loaded chunks makes them much quicker to load and basically more usable. The issue, of course, is how to make sure Googlebot reads the page in its entirety and not just the small chunk at the beginning.
I'm guessing that Googlebot might try to run the JS and perhaps even fetch the content in small chunks, although I'm not sure about that either. I could make the AJAX requests POST rather than the default GET, since I don't want the chunks to have their own URLs; pieces of a whole page should not be indexable on their own. Besides, without knowing the framework of the page as a whole, how would a crawler know how to stitch the pieces together?
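For illustration, a chunk request sent as POST rather than GET could be built like this; the `/content` endpoint and the JSON body shape are assumptions on my part:

```javascript
// Build a POST request descriptor for fetching one content chunk.
// Using POST means the chunk is not reachable via a plain crawlable URL.
function buildChunkRequest(chunk) {
  return {
    url: "/content", // hypothetical fragment endpoint
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ chunk: chunk }),
    },
  };
}

// Usage in the browser would be roughly:
//   const req = buildChunkRequest(3);
//   fetch(req.url, req.options).then(r => r.text()).then(appendToPage);
```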
Some of the particular issues in my mind are: