millercia - 11:45 am on Apr 28, 2013 (gmt 0)
But any resizing needs to be done server-side; otherwise the site will come across as inexplicably slow to load. This is not only a human-interface issue; search engines measure page access time too. Your server can resize an image in far less time than it takes a human to download the oversized version.
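To make that concrete, here is a minimal server-side resize sketch in Python using Pillow; the file paths and the 800px bound are placeholders, not anything from the site in question:

    # Minimal server-side resize sketch (Pillow assumed installed).
    from PIL import Image

    def make_resized(src_path, dest_path, max_px):
        with Image.open(src_path) as im:
            im.thumbnail((max_px, max_px))  # shrinks in place, keeps aspect ratio
            im.save(dest_path, "JPEG", quality=85, optimize=True)

    make_resized("photos/original.jpg", "photos/original_800.jpg", 800)

The resize itself costs the server a fraction of a second of CPU time, versus seconds of download time for a multi-hundred-KB original.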
The downside of this is that it places an unnecessary load on the server. Take, for example, a page of thumbnails resized by the server from larger images: server resources are consumed for every resize that has to be performed as the page loads.
So if you have 1,000 concurrent users loading one thumbnail page, you can run into resource issues; and how many more thumbnail pages are there for spiders to crawl and explore?
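For what it's worth, the usual way to blunt that cost is to resize each image once and cache the result, so the expensive work isn't repeated on every page load. A rough sketch, with a hypothetical cache directory:

    # Resize-once cache: Pillow runs only on a cache miss; later
    # requests are served straight from disk.
    import os
    from PIL import Image

    CACHE_DIR = "cache"  # hypothetical location

    def cached_thumbnail(src_path, max_px):
        os.makedirs(CACHE_DIR, exist_ok=True)
        cached = os.path.join(CACHE_DIR, "%d_%s" % (max_px, os.path.basename(src_path)))
        if not os.path.exists(cached):  # only the first request pays for the resize
            with Image.open(src_path) as im:
                im.thumbnail((max_px, max_px))
                im.save(cached, "JPEG", quality=85)
        return cached

With something like that in place, 1,000 concurrent users mostly cost you file reads, not image processing.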
Thanks for the replies. Given the one-large-JPEG approach, the best way to go seems to depend on traffic load. With client-side resizing, lots of concurrent users might have a better experience. In a low-traffic situation the server would not be overtaxed, so the user would likely benefit from both faster resizing and, since only the displayed size is downloaded, faster loading. Our site is presently low-traffic, but we certainly hope to increase volume, and that was a big motivation for redesigning it.
I'm questioning the wisdom of using a large source image for resizing at all. I don't see how keeping two additional smaller versions significantly increases complexity, and the storage increase would be under 10%.
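Generating those two versions could even be a one-time batch job rather than per-request work; a sketch, where the size bounds and naming scheme are assumptions for illustration:

    # One-time batch generation of a thumbnail and a medium version
    # for every JPEG in a folder.
    import glob, os
    from PIL import Image

    SIZES = {"thumb": 150, "medium": 600}  # placeholder pixel bounds

    for src in glob.glob("photos/*.jpg"):
        base, ext = os.path.splitext(src)
        for label, max_px in SIZES.items():
            with Image.open(src) as im:
                im.thumbnail((max_px, max_px))
                im.save("%s_%s%s" % (base, label, ext), "JPEG", quality=85)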
There will be up to eight thumbnails on a page. Thumbnails will be used to select a medium-size image, which, when hovered over, will bring up a zoom box with the larger image. For some pages, therefore, up to eight 200-250 KB JPEGs will need to load before the page is functional.
I had not considered spiders' measurement of load times. This could be more "noticeable" to them since, once connected, they would probably fire off a rapid series of page requests.