Let me preface this post with a note from the Webmaster Central article that states that "Googlebot may crawl more than the first 100 KB of text." [google.com]
I take this to mean that the amount of text crawled by Googlebot varies from site to site based on a number of factors.
I also take this to mean that for pages whose file size goes beyond 100 KB, some text may not be read by Googlebot.
In Webmaster Tools > Diagnostics > Crawl Stats we are given a few graphs, two of which are:
Pages crawled per day
Kilobytes downloaded per day
Am I correct to assume that (KB downloaded per day) / (pages crawled per day) = average KB downloaded per page? (See the quick sketch below.)
And that all important content and links should be placed before this limit is reached?
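For reference, here's a minimal sketch of that first calculation, using made-up numbers rather than real Crawl Stats figures:

```python
# Hypothetical values read off the two Crawl Stats graphs
# (made-up numbers for illustration, not actual data)
kb_downloaded_per_day = 4500.0  # "Kilobytes downloaded per day"
pages_crawled_per_day = 150     # "Pages crawled per day"

# My assumption: dividing the two gives the average KB
# Googlebot downloads per page
avg_kb_per_page = kb_downloaded_per_day / pages_crawled_per_day
print(f"Average KB downloaded per page: {avg_kb_per_page:.1f}")  # 30.0
```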
Or am I delusional? Thanks!