| 5:18 pm on Mar 9, 2008 (gmt 0)|
We compress at 72dpi, mainly because we are on Macs.
About 5 months ago, we noticed that our download times were much higher than we would like, about 2.5 - 3.0 seconds according to G webmaster tools.
This was because of the small pictures that were uploaded in our news stories. I started to minimize their file size using Photoshop's export to web feature and the loading time is now under 1.0 seconds.
Our site caters to a worldwide audience, many of whom do not have broadband, so this was very important to us.
| 5:38 pm on Mar 9, 2008 (gmt 0)|
In spite of the huge speed increases that broadband gives, the expectations of the visitor have become more demanding. With dialup, the expectation was 6 to 10 seconds, beyond which impatience set in. Now, with 2 gig upward being the norm, expectations are 2 to 3 seconds, and anything above 5 seconds is very much an impatience trigger.
So yes, download speed optimisation is still just as important as it always was; it's just that the goalposts have moved.
| 6:05 am on Mar 10, 2008 (gmt 0)|
I don't go crazy but if I'm making a page with lot of images I'll pay attention more.
|We compress at 72dpi, mainly because we are on Macs. |
DPI has nothing to do with compression or the actual file size of the image unless you're scaling images using inches. If you have a 300x300 pixel image, it is the same image visually and in file size whether you set it to 72 DPI or 1000 DPI; the two will be identical pixel for pixel. DPI only sets the default scale, mostly for printing purposes.
Until recently my local newspaper still insisted that all images submitted to them be at 300 DPI, because "web quality" 72 DPI (such as an 8MP image from a professional Canon DSLR, which defaults to 72 DPI) is supposedly no good, which is laughable. My only guess is that they had a lot of people scanning images at really low resolution; at least I hope that is the case, but that assertion certainly doesn't help the person reading it understand the difference. I often wonder how many people ruined images attempting to set the "right" DPI.
For JPEGs, the file size is determined mainly by three things: the content, the pixel dimensions, and the amount of compression. The "save for web" feature in most programs usually defaults to about 70 to 75% quality and sets the DPI to 72, but the DPI part is really irrelevant. It's the quality (or compression) that matters.
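The point that DPI is just metadata can be checked at the byte level: in a JFIF (JPEG) file the density lives in a few bytes of the APP0 header, and rewriting it touches nothing else. A minimal sketch in Python, using a hand-built header fragment rather than a real photo (the helper name is my own):

```python
import struct

def set_jfif_dpi(jpeg_bytes: bytes, dpi: int) -> bytes:
    """Rewrite the X/Y density fields of a JPEG's JFIF APP0 header.
    Only metadata changes; the compressed image data is untouched."""
    # Layout: FFD8 (SOI), FFE0 (APP0), length, 'JFIF\0', version,
    # units (byte 13), X density (14-15), Y density (16-17), ...
    if jpeg_bytes[:4] != b"\xff\xd8\xff\xe0":
        raise ValueError("not a JFIF file")
    out = bytearray(jpeg_bytes)
    out[13] = 1                                 # units: dots per inch
    struct.pack_into(">HH", out, 14, dpi, dpi)  # X and Y density
    return bytes(out)

# A minimal JFIF header (no scan data) just to demonstrate:
header = (b"\xff\xd8"                  # SOI
          b"\xff\xe0\x00\x10JFIF\x00"  # APP0, length 16
          b"\x01\x01"                  # version 1.1
          b"\x01\x00\x48\x00\x48"      # units=dpi, 72x72
          b"\x00\x00")                 # no embedded thumbnail
hi = set_jfif_dpi(header, 300)
print(len(hi) == len(header))  # True: the "300 DPI" file is byte-for-byte the same size
```

Only four bytes differ between the two versions, which is exactly why a 72 DPI and a 1000 DPI copy of the same pixels render identically in a browser.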
| 7:13 am on Mar 10, 2008 (gmt 0)|
It's not the file size that scares me, as many of my readers are on broadband. It's the bandwidth that uncompressed images suck up that's my sword of Damocles.
Even with my optimised images I have a 3 gig a day BW habit, how others cope I don't know....
| 12:47 pm on Mar 10, 2008 (gmt 0)|
netchicken1: You might want to look at the coral cache... It is free and would halve your bandwidth overnight.
No URL's please, see TOS [webmasterworld.com]
[edited by: limbo at 2:45 pm (utc) on Mar. 10, 2008]
| 1:03 pm on Mar 10, 2008 (gmt 0)|
Count me in for making image filesize as small as possible. Even if the whole world has super-fast broadband (which they don't) everyone loves a fast site. I visit plenty of sites on fast connections that still make me grind my teeth, and images are often the culprit.
| 1:06 pm on Mar 10, 2008 (gmt 0)|
I use 96 DPI and I use Photoshop on Windows. I heard that there are programs that manage the compression better than Photoshop. Some suggestions, anyone?
| 1:28 pm on Mar 10, 2008 (gmt 0)|
I use Photoshop JPEG at quality level 6 (progressive, so a rough image should appear quickly); below that, I've found images can look worse on screen.
I use up to 780 pixels in length; yes, "DPI" is not meaningful for photos on screen: a 17" monitor showing 1200 pixels across would display the image differently than a same-size monitor showing 800.
Importantly, it seems to me, if the text appears first so there's something to look at and read, it's not so appalling if photos take a bit more time to appear as the page completes. I hope my visitors agree!
| 1:33 pm on Mar 10, 2008 (gmt 0)|
Many years ago my image sizes were 200x300 and no more than 20K. Bandwidth cost about $3/gig.
I still optimize but not at the expense of quality. Now my typical images are 450x600 and typically 75K and I really don't worry about bandwidth costs. If you're worried about bandwidth you need to find a new host.
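The bandwidth economics above are easy to put numbers on. A rough sketch (the traffic figure is hypothetical; the $3/GB rate is the old price mentioned in the post):

```python
def monthly_image_cost(image_kb: float, views_per_month: int,
                       dollars_per_gb: float = 3.0) -> float:
    """Rough bandwidth cost of serving one image repeatedly."""
    gigabytes = image_kb * views_per_month / (1024 * 1024)
    return gigabytes * dollars_per_gb

# A 75 KB image served 500,000 times a month, at the old $3/GB rate:
print(round(monthly_image_cost(75, 500_000), 2))
# The same image squeezed down to 20 KB:
print(round(monthly_image_cost(20, 500_000), 2))
```

At modern flat-rate hosting the dollar figure collapses, which is exactly why the poster stopped worrying about it; the transfer volume itself is unchanged.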
| 1:41 pm on Mar 10, 2008 (gmt 0)|
I still do heavy compression on all images... most of the time. At least I do on large projects I care about.
Advanced JPG Compressor by WinSoftMagic does a slightly better job than Pshop, especially cool that it can compress partial areas of an image while leaving other parts crisp.
Advanced GIF Compressor by Creabit
Similar name, no relation. To find it search for "advanced gif compressor". Does batch jobs, lossless compression. Easy interface.
No URL's please, see TOS [webmasterworld.com]
[edited by: limbo at 2:35 pm (utc) on Mar. 10, 2008]
| 2:51 pm on Mar 10, 2008 (gmt 0)|
I optimise everything. Even broadband users have to wait!
Usually it's on a per image basis, particularly for photos. I tend to have my JPG compression set to 75% - this, on the whole, cuts out a large chunk without too much loss - much more than that and the image starts to look too compressed.
That said I'd rather have a slightly larger file size over one that looks shonky - like it's been said, bandwidth is not so much of an issue and for a beautiful design I make 'em wait ;)
| 2:54 pm on Mar 10, 2008 (gmt 0)|
I do heavy compression on largish images that aren't vital to the look of the site. If I'm doing a graphical site header, though, that gets to be a bit larger if necessary.
I do source-order layouts and always assign widths and heights for images, so folks can still use the page while they wait for the images to come in. Also, I've spread my images over a couple of sub-domains, as IE will open parallel connections to each domain. That made a huge difference. The homepage for my main site loads the content within one second, and the images are in within three (on broadband). This is a fairly graphics-intensive page by my normal standards, but it seems to be working fine.
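The reason this helps is that browsers of that era capped parallel connections per hostname (IE6/7 allowed two), so images queue behind each other. A back-of-envelope model in Python; the connection limits and image counts are illustrative assumptions, and real timing also depends on image sizes and latency:

```python
import math

def download_waves(n_images: int, conns_per_host: int, hosts: int) -> int:
    """How many 'waves' of downloads are needed when each hostname
    allows only a fixed number of parallel connections."""
    return math.ceil(n_images / (conns_per_host * hosts))

# 12 images, 2 connections per hostname (the old IE default):
print(download_waves(12, 2, 1))  # one host: 6 waves
print(download_waves(12, 2, 3))  # three subdomains: 2 waves
```

Spreading assets over three subdomains triples the parallelism even though everything sits on the same server and IP, which matches the experience reported in this post.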
| 3:01 pm on Mar 10, 2008 (gmt 0)|
Now that's an interesting point. Does spreading the images across subdomains, on the same server and same IP, speed up the loading of the site?
| 3:10 pm on Mar 10, 2008 (gmt 0)|
It seemed to make a big difference for me.
| 3:19 pm on Mar 10, 2008 (gmt 0)|
|I heard that there are programs that manage the compression better than Photoshop. Some suggestions, anyone? |
I use Fireworks from Adobe (used to be Macromedia). I believe I've been using it since it first came on the market. In the beginning, from my own personal experience, none of the others could compare to Fireworks' image optimization for the web. I could usually shave off another 25-30% from most images that were sent to me that were already "optimized for the web" through Photoshop or some other program.
These days though, most of the programs are up to snuff. You just need to know how far you can go and which settings do what. I believe all Adobe products now have a "Save for Web" option. That's where you can "make or break" your images. With Fireworks, since it is a Web Image editing program to begin with, you are working at the Web level to start.
Just recently I was running a report on a site that was using images hosted on Flickr. Nine images on one page came in at over 500k, which made up about 70% of the page's weight. Those images are not optimized for the web. If you were to open them in Fireworks or Photoshop, you'd see that they had a quality setting of 100.
So, what types of websites do you feel are victim to image bandwidth waste? I say we classify this under the "Green" movement and start cleaning up. :)
On a side note, another culprit is using .jpg vs .gif and vice versa. Or, using .png. Of that 500k referenced above, I was able to trim away almost 350k of it by converting the images to their proper format. These were graphs with solid colors and white backgrounds, very simple. One started at 123k as a .jpg; afterwards it was 35k as a .gif with no visible loss.
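The reason those graphs shrank so much is that GIF and PNG use lossless dictionary/deflate-style compression, which thrives on runs of identical pixels, while JPEG's transform coding is built for photographic gradients. A rough illustration with raw pixel buffers and zlib (the deflate library PNG uses); the buffers are synthetic, not real images:

```python
import random
import zlib

# A "graph-like" buffer: 100,000 solid-white RGB pixels
flat = bytes([255, 255, 255]) * 100_000

# A "photo-like" buffer: high-entropy noise of the same length
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(300_000))

print(len(zlib.compress(flat, 9)))   # collapses to a tiny fraction
print(len(zlib.compress(noisy, 9)))  # barely shrinks at all
```

Flat-color artwork is the best case for GIF/PNG and the worst case for JPEG (which also smears hard edges with ringing artifacts), so matching format to content is worth the few seconds it takes.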
| 6:23 pm on Mar 10, 2008 (gmt 0)|
|What's your routine these days? Are you squeezing those file sizes down as far as you can? Or, have you joined the movement that feels "all" of the world is on broadband? |
I'm smack dab in the middle. There's no question that optimizing images is beneficial for all parties involved. I used to be in the camp of those who would shave the file size down to an absolute minimum.
However, I now find myself spending less time shaving off a few k of file size because the savings don't justify the time it takes to find the perfect level of optimization.
| 7:05 pm on Mar 10, 2008 (gmt 0)|
Photoshop CS3 is pretty good at optimizing images. No matter what software you use, it's a short learning curve, and after that it only takes a second or two every time you do it.
From a webmaster perspective:
Optimized images = less storage, lower bandwidth, quicker page delivery, lower server load, lower cost of page delivery. The cost benefit is insignificant on low traffic sites, but increases as traffic goes up. You especially can't ignore seek time/server load, which can make a huge difference in page delivery times. Again, not an issue with a low traffic site, but once traffic starts to climb, especially if you get big "spikes", it starts making a huge difference - I'd rather have the RAID array serving up pages than grinding away buffering images.
From a user perspective:
Un-optimized images = slower page loads (even with broadband, you can tell the difference), give me the impression the webmaster is lazy or uninformed, and result in me hitting the back button by reflex.
Also, if you leave the image large, then just "size" it in the html, it never renders as well as it would if you used a commercial editor to size/optimize an image. The better the monitor, the more visible the difference, and the monitors coming out in the last few years are very unforgiving of artifacts and poor optimization.
That's my 2 bits. Now I'm off to creatively destroy a couple dozen thin clients. Great way to finish a Monday.
| 7:12 pm on Mar 10, 2008 (gmt 0)|
I definitely take image compression seriously. We use Corel PhotoPaint X4, 96dpi and are careful when selecting the optimal format (JPG / GIF / PNG) per image.
Load times (particularly when considering mobile devices); bandwidth consumption; server / network load; drive consumption; and efficiency of the site in general are all key reasons to optimize the hell out of each and every image bar none.
| 7:16 pm on Mar 10, 2008 (gmt 0)|
I also use Fireworks, and I was amazed at how abysmal the extra-optimized, smaller .gif ("web 216") files were. They looked great to me with my uber-screen; however, once I went to an average enduser resolution (1024x768 or 1280x960), it was amazing how pixelated and juvenile everything looked.
I switched to exporting .gif's in Fireworks "web adaptive" which is a little larger, but absolutely beautiful at all screen resolutions. It puts a little bit of a "Glamour Shots" haze over some graphics, but looks great otherwise.
| 7:31 pm on Mar 10, 2008 (gmt 0)|
Photoshop's (and Illustrator's) "publish for web" feature is amazing; after using other applications (graphic converter, image ready) I use nothing else today. Not only can I create a sharply-rendered image that takes minimal bandwidth, those applications have transformed my habits of uploading a thumbnail image doubled by a larger version (for a "zoom" feature) into using a single image suitable for both.
| 8:45 pm on Mar 10, 2008 (gmt 0)|
|The cost benefit is insignificant on low traffic sites, but increases as traffic goes up. |
Good point. This is one of the reasons I optimize everything, images, HTML, code, etc...
Larger images also mean that the server itself has to spend more time both reading from the drive and staying connected to the client, meaning that large images can have an effect on future hardware costs. Efficiency is hugely important where computers are concerned.
Since you can generally reduce the file size of any image without losing visual quality, why wouldn't you? It really doesn't matter how fast your audience is connected.
| 9:49 pm on Mar 10, 2008 (gmt 0)|
I'm mostly interested specifically in photos, which you pretty much have to optimize.
DPI doesn't matter, but screen resolution might
Where the DPI issue comes into effect is not what setting you have in Photoshop (as coalman says, that only matters when you print), but with the resolution on the user's monitor (as p5gal5 found). An imperfect image (whether due to jpeg artifacts or native imperfections like bad focus) can look okay at a relatively high resolution but terrible at a relatively low resolution (by "relatively" I mean relative to the physical size of the screen, not some particular number of pixels). On occasion, I'll bump the resolution of my monitor back down and see how things look when they are, relatively speaking, larger than I would normally view them.
So the brief version of the foregoing is that if you have your monitor maxed out for resolution, you might take a look at your most important images with the resolution backed down a couple of steps. Sometimes they still look great and sometimes not.
basic photo optimization for the lazy
If I have a photoset that has similar characteristics and/or I don't care if they look perfect, I batch process with Irfanview. Results are not perfect, but in most cases, pretty similar to an image-by-image Photoshop optimization.
I can batch run
- lossless auto-rotate based on EXIF info
- resize/resample with a set jpeg compression
- make slight adjustments to exposure
- sharpen (0-100)
Sharpening should always always always be the last manipulation you do to a jpeg.
I could probably batch process much better in Photoshop, but Irfanview would have finished its batch job about the time that the Photoshop splash screen would be crediting Seetharam Narayan for his work....
But if you want serious lifting (HDR, shadow/highlight adjustment and reasonably good sharpening), Photoshop is the bomb.
| 9:55 pm on Mar 10, 2008 (gmt 0)|
When possible, I use CSS sprites to combine two or more images into one file. Two big advantages: total file size is generally less, and it saves the browser a trip to the server.
| 10:35 pm on Mar 10, 2008 (gmt 0)|
I think it is still important. Even an additional 50K can cause a few seconds' delay (it depends on connection speed, of course).
One more thing is compressing page content. Nowadays almost all modern browsers support compression, and it usually cuts page size by a factor of 2-3. A lot of traffic saved :)
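A quick way to see the effect of that page-level compression, using Python's stdlib gzip on some repetitive (hence highly compressible) sample markup; the 2-3x figure in the post is typical for real pages, and boilerplate-heavy HTML often does even better:

```python
import gzip

html = ("<html><body>"
        + "<p class='story'>Markup repeats a lot, so it deflates well.</p>" * 200
        + "</body></html>").encode()

compressed = gzip.compress(html, compresslevel=6)
print(len(html), len(compressed))  # the gzipped copy is a small fraction of the original
```

This is exactly what mod_gzip/mod_deflate do on the wire when the browser sends `Accept-Encoding: gzip`, so the saving costs nothing on the authoring side.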
| 10:36 pm on Mar 10, 2008 (gmt 0)|
Agreed, you can gain a lot with CSS sprites. Basically, the technique is to combine the individual images you need for your layout (not so much your content) into a single file.
You can arrange them in a grid with some large "whitespace" between them, then set that one image as a background and show the graphic you need by adjusting the background position.
It's basically one image (large in dimensions) that has all the little bits combined in it. The whitespace will compress nicely, so no need to worry there.
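The positioning rules that crop each icon out of the sheet can be generated mechanically. A sketch, where the icon names, coordinates and the sprites.png filename are all hypothetical:

```python
# Hypothetical sprite sheet: each icon's (x, y, width, height) in sprites.png
icons = {"home": (0, 0, 16, 16), "search": (32, 0, 16, 16), "cart": (64, 0, 16, 16)}

def sprite_rule(selector: str, x: int, y: int, w: int, h: int,
                sheet: str = "sprites.png") -> str:
    """Emit a CSS rule that shows one icon from the combined image:
    the element is sized to the icon and the background is shifted
    by negative offsets so only that region is visible."""
    return (f"{selector} {{ width: {w}px; height: {h}px; "
            f"background: url({sheet}) -{x}px -{y}px no-repeat; }}")

for name, (x, y, w, h) in icons.items():
    print(sprite_rule(f".icon-{name}", x, y, w, h))
```

One HTTP request then serves every icon, which is the main win; the negative background offsets do the cropping in the browser.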
Ask is using it on their homepage quite beautifully.
Also agree that with all current browsers the DPI setting on the image has no effect whatsoever. But as screen resolutions might increase in the next (many) years, perhaps at some point there will be a sense in setting it to something reasonable.
| 1:36 am on Mar 11, 2008 (gmt 0)|
Again, surely it's not DPI that counts, it's the actual size in pixels.
Without other instructions, surely a 780x600 image (around what I often use) is the same to a browser whether you saved it at 72 DPI or 300 DPI. It will fill a fair amount of an 800-pixel-wide monitor, less of a 1200-pixel-wide one.
Yes, I've wondered about screen resolutions becoming significantly higher; I guess that if so, browsers will have to do something, or many photos on older pages could look horribly tiny.
As to optimising: I'd only really thought about individual pages and bandwidth - good points about the whole site slowing if many pages with larger-than-necessary images are accessed at once.
- the "necessary" size will depend partly on whether images are rather throwaway things, or whether good-looking images are important, as on photography sites.
| 7:33 am on Mar 11, 2008 (gmt 0)|
Loading one image instead of two does save bandwidth and disk use, but there is one caveat I must mention to using one web-optimised image as both zoom and thumbnail: browser rendering. I noticed that the edges of some 'downsized' images (made smaller through code than their original size), especially highly contrasted ones, get 'choppy' at low monitor resolutions in certain browsers (namely IE). Thanks for bringing that up!
[edited by: Josefu at 7:33 am (utc) on Mar. 11, 2008]
| 12:14 pm on Mar 11, 2008 (gmt 0)|
|Until recently my local newspaper still insisted that all images submitted to them be at 300 DPI, because "web quality" 72 DPI (such as an 8MP image from a professional Canon DSLR, which defaults to 72 DPI) is supposedly no good, which is laughable. |
It is kind of laughable in the digital paradigm and yes, your newspaper is technically incorrect - digital bitmap image quality can only really be measured in absolute pixel dimensions. But in the print medium DPI is a factor. DPI (at the final printed size) is a legitimate format in which to require an image, as the output resolution of the printed image is relative to the halftone line screen (and this also has a bearing when importing an image into a DTP package - images marked as 72dpi can display and print incorrectly). And yes, people sending in low-resolution images are the cause of these kinds of requests. The words 'I copied this from my web site and attached it in an email' are words to freeze the blood of most print professionals.
To combat this, the 300dpi figure has become a standard, as 150 lines per inch is a standard colour magazine halftone screen. Even for a newspaper, which probably only prints at between 55LPI to about 75LPI (that's 'L' PI for lines per inch), the resolution of the image only needs to be 1.5 to 2.5 x the line screen - so 150dpi (at the final printed size) should do the trick.
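That rule of thumb is simple arithmetic; a small sketch, where the 1.5-2.5x factor is the prepress convention described above:

```python
def required_dpi(line_screen_lpi: float, factor: float = 2.0) -> float:
    """Prepress rule of thumb: image resolution at the final printed
    size should be roughly 1.5-2.5x the halftone line screen."""
    return line_screen_lpi * factor

print(required_dpi(150))      # colour magazine at 150 LPI -> 300 dpi
print(required_dpi(75))       # newspaper at 75 LPI -> 150 dpi
print(required_dpi(55, 1.5))  # coarse newsprint with a conservative factor
```

The 300 DPI figure is just the magazine case rounded into a blanket standard, which is why it gets demanded even where a newspaper's coarser screen needs far less.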
So really, all parties are correct. It's just a question of which paradigm we are referring to and at what stage of digital image processing we are at.
| 8:49 pm on Mar 11, 2008 (gmt 0)|
Bouncybunny, my point is that the general population has no idea what DPI is or what it does; they do understand pixel sizes, as pixels are used generally everywhere. Newspapers are going to have to scale and resample the images anyway, correct? I would think it would be better for them to request original images that have not been resampled, no smaller than nnnn x nnnn pixels, and set the DPI themselves.
[edited by: encyclo at 8:41 pm (utc) on Mar. 23, 2008]
[edit reason] fixed typo [/edit]