From 64.68.x.y: 2273 pages requested
From 216.239.x.y: 4608 pages requested
They are fairly evenly mixed throughout my logfile. I'm no longer so sure about what I said in a previous message in this thread.
I only have a meagre little PR5 site with about 250 flat HTML pages and a phpBB forum, so that many pages being requested by the fresh crawl alone seems unlikely.
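For reference, here's a minimal sketch of how counts like those can be pulled from a logfile, assuming a standard Apache access log with the client IP as the first field; the path and IP prefixes are just placeholders:

<?php
// Count requests per IP prefix in an Apache access log.
// Assumes the client IP is the first whitespace-separated field
// (common/combined log format). Path and prefixes are placeholders.
$prefixes = array('64.68.' => 0, '216.239.' => 0);

$fh = fopen('/var/log/apache2/access.log', 'r');
if ($fh === false) {
    die("Cannot open logfile\n");
}
while (($line = fgets($fh)) !== false) {
    list($ip) = explode(' ', $line, 2); // client IP is the first field
    foreach (array_keys($prefixes) as $prefix) {
        if (strpos($ip, $prefix) === 0) {
            $prefixes[$prefix]++;
            break;
        }
    }
}
fclose($fh);

foreach ($prefixes as $prefix => $count) {
    echo "From {$prefix}x.y: $count pages requested\n";
}
?>

Run it from the command line with the PHP CLI and it prints one total per prefix.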
Uploading pages during the crawl won't hurt anything. If you change a page after it's been crawled, though, Googlebot likely won't find any new links on it and the cached version after the next update won't reflect the newer content. Other than that, I wouldn't worry. If you've got something new to say, say it!
G.
For my real content pages (the ones I want crawled), I have PHP code that logs the info I want to a MySQL db whenever I get a hit from a user agent belonging to a search-engine spider. Then I have a page on my admin side that does some manipulation of this info. It works really slick.
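A minimal sketch of that kind of logging, assuming a table named spider_log with columns ua, ip, url, and hit_time, and PDO for the connection; the spider list, table name, and credentials are placeholders, not necessarily what anyone else uses:

<?php
// Log hits from search-engine spiders to MySQL.
// Include this at the top of each content page.
$spiders = array('Googlebot', 'Slurp', 'msnbot', 'ia_archiver');

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
foreach ($spiders as $spider) {
    if (stripos($ua, $spider) !== false) {
        // Placeholder credentials; swap in your own.
        $db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
        $stmt = $db->prepare(
            'INSERT INTO spider_log (ua, ip, url, hit_time) VALUES (?, ?, ?, NOW())'
        );
        $stmt->execute(array($ua, $_SERVER['REMOTE_ADDR'], $_SERVER['REQUEST_URI']));
        break; // one row per hit is enough
    }
}
?>

The admin page can then just run SELECTs against spider_log, grouping by ua or url, to see which spider is hitting which pages and when.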