>>i have a page that is 268k, of which only 48k is represented by images, and Google spiders it and lists it in results with a cache.
The size limit doesn't include images. If you look at a cached version of a page that is over 100k, you will see that the page is clipped. Google will crawl pages larger than 100k, but it will only store the first 100k of data.
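A quick way to picture the cap described above is plain byte slicing: keep only the first 100k of the HTML and everything after that point is missing from the cached copy. A minimal sketch (the 100k figure comes from this thread; the function and sample page are just illustrations):

```python
CACHE_LIMIT = 100 * 1024  # the ~100 KB cache cap discussed above


def cached_portion(html: str) -> str:
    """Return the slice of the page that would survive the cache cap."""
    data = html.encode("utf-8")
    # Truncate at the byte limit; drop any partial character at the cut.
    return data[:CACHE_LIMIT].decode("utf-8", errors="ignore")


# A 200k+ page: the opening survives, the closing tags do not.
page = "<html><body>" + "x" * 200_000 + "</body></html>"
kept = cached_portion(page)
print(len(kept.encode("utf-8")))  # 102400 bytes kept; the rest is clipped
```

So a searcher viewing the cache of an oversized page sees the top of the document and a hard cut partway down, which matches the clipping described above.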
Play it safe: keep the pages small in HTML (5-15k) and in image weight (about 5k per image, 4 images max). Only about 5% of the world is on broadband, and even that isn't necessarily a fast connection.
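Those rules of thumb are easy to turn into a quick self-check. A hypothetical sketch using the numbers above (the function name and thresholds are just the guideline restated, not anything official):

```python
# Rules of thumb from the post above: 5-15 KB of HTML,
# at most 4 images at roughly 5 KB each.
MAX_HTML_KB = 15
MAX_IMAGES = 4
MAX_IMAGE_KB = 5


def within_budget(html_kb, image_kbs):
    """True if the page fits the dial-up-friendly budget above."""
    return (
        html_kb <= MAX_HTML_KB
        and len(image_kbs) <= MAX_IMAGES
        and all(kb <= MAX_IMAGE_KB for kb in image_kbs)
    )


print(within_budget(12, [4, 5, 3]))        # True: fits the guideline
print(within_budget(40, [4, 5, 3, 6, 2]))  # False: heavy HTML, too many images
```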
Accessibility and usability are as important as any other aspect of a site.
Your site may top the engines, but if it breaks an individual searcher's patience, forget them - what's the point?
I understand the concept of keeping pages small, but in my case there seems to be no good way to do it. I tried cutting the info up into smaller chunks of 10k each, but users like to browse through all the info, so eventually they end up waiting the total time anyway, just in smaller bits. My link to the page has a warning about the size. What do you folks think about these "loading" JavaScripts? Do they make the wait seem shorter? Are they a good idea?
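For anyone weighing the chunking approach the poster tried, the splitting itself is straightforward: break the content at paragraph boundaries so no page exceeds the target size. A rough sketch, assuming you already have the text as a list of paragraphs (names and the sample data here are illustrative):

```python
CHUNK_TARGET = 10 * 1024  # ~10 KB per page, as the poster describes


def split_into_pages(paragraphs, limit=CHUNK_TARGET):
    """Group paragraphs into pages of at most `limit` bytes each."""
    pages, current, size = [], [], 0
    for para in paragraphs:
        para_size = len(para.encode("utf-8"))
        # Start a new page when adding this paragraph would overflow.
        if current and size + para_size > limit:
            pages.append(current)
            current, size = [], 0
        current.append(para)
        size += para_size
    if current:
        pages.append(current)
    return pages


paras = ["word " * 500] * 8  # eight ~2.5 KB paragraphs
pages = split_into_pages(paras)
print(len(pages))  # 2 pages, each under the 10 KB target
```

As the poster notes, this doesn't reduce the total wait for someone who reads everything; it only spreads it out, which is really a usability trade-off rather than a bandwidth saving.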
I think in most cases, smaller is better.
Also keep in mind where in the page you place the links you want to have followed.