deadsea - 1:08 pm on May 17, 2012 (gmt 0)
The original 100K recommendation was for text and HTML, not including images. When Google originally made that recommendation, Googlebot would only index the first 100K of a page.
Today Googlebot WILL index far more than 100K of a page. And 100K downloads quickly for most users, even those on cell phones.
I would also suggest paying attention to load time under real-world conditions. There are lots of things you can do to decrease page load time, only some of which involve decreasing the amount of content.
First, make sure your web server enables gzip compression. That can shrink bandwidth bills by 50% or more and really speed up downloads for visitors.
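To see why gzip helps so much, here's a rough sketch in Python: HTML is highly repetitive markup, so it compresses very well. The page content and the exact ratio here are made up for illustration; real savings depend on the page, but 50%+ is typical for text/HTML.

```python
import gzip

# Hypothetical page: repetitive markup, like most real HTML.
html = b"<html><body>" + b"<p>Some repeated markup and text.</p>" * 500 + b"</body></html>"

# This is what the server does on the fly when gzip compression is enabled.
compressed = gzip.compress(html, compresslevel=6)

print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

On the server you don't write any code for this; you just turn on the module (mod_deflate on Apache, gzip on nginx) and the browser decompresses transparently.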
Use Google's Page Speed browser plugin or Yahoo's YSlow to see what is actually slowing your site down. Most of the time the bottleneck isn't raw page size.