I launched a new site in late Feb. It doesn't have a Google sitemap, but it is being crawled to its deepest level.
1. Would a site map really make any difference?
2. The number of pages being crawled nightly has been increasing, but it really jumped last night to over 1000 — and they are all zero-byte-transferred GET requests. Is Googlebot building its own sitemap, so to speak, to come back later for a full crawl?
3. These days I don't have a clue about what to think about Googlebot and its methods. Can anyone enlighten me? Does this sound good or bad?
A transfer of 0 bytes means that Google did a Conditional GET, sending an If-Modified-Since HTTP request header, and your server answered with a 304 Not Modified status and an empty body because the Last-Modified date on the file indicated that Google's cached copy was still current. Therefore, it was unnecessary to transfer the same 'old' content again.
Eliminating unnecessary bandwidth and server resource usage is precisely the purpose of the Conditional GET request. You will also see some clients use HEAD requests for the same purpose, although a Conditional GET is more efficient, since it fetches the new content in the same round trip when the file has changed.
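To make the mechanism concrete, here is a minimal sketch (in Python, my choice — not anything from Google or this thread) of the decision a server makes when it sees an If-Modified-Since header. The function name and the example dates are hypothetical; the header parsing uses the standard library's `email.utils`.

```python
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

def conditional_get_status(if_modified_since, last_modified):
    """Decide what a server sends for a GET on a cached resource.

    if_modified_since -- value of the client's If-Modified-Since header,
                         or None for a plain (unconditional) GET.
    last_modified     -- the resource's Last-Modified time (tz-aware datetime).
    Returns the HTTP status code: 200 (send full body) or 304 (send nothing).
    """
    if if_modified_since is None:
        return 200  # plain GET: always send the full body
    try:
        cached = parsedate_to_datetime(if_modified_since)
    except (TypeError, ValueError):
        return 200  # unparseable header: ignore it and send the body
    if last_modified <= cached:
        return 304  # Not Modified: zero-byte body, client keeps its cache
    return 200      # changed since the cached copy: send the new body

# Hypothetical example: page last changed 1 March, crawler's copy from 2 March
page_mtime = datetime(2006, 3, 1, 12, 0, tzinfo=timezone.utc)
header = format_datetime(datetime(2006, 3, 2, 0, 0, tzinfo=timezone.utc))
print(conditional_get_status(header, page_mtime))  # 304 -> 0 bytes in the log
print(conditional_get_status(None, page_mtime))    # 200 -> full transfer
```

That 304 branch is exactly what shows up in access logs as a zero-byte transfer: the crawler asked "has this changed since my copy?", and the server said no.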
I didn't know what to think. So these are pages they have already fetched, and they're just checking them for changes. Now if only I could get them to crawl the rest of the site; as I said, they go to the deepest level, just not to all of it.