Forum Moderators: not2easy
I've used Fireworks now for years when it comes to image editing for the web. I also use Photoshop and Illustrator when applicable. I'll even fire up Image Ready if I'm feeling frisky. Fireworks and the others offer me a wide range of image optimization options. I still take each and every image and optimize it to the nth degree. It takes all of a few seconds to do so and I'm probably shaving off a few seconds of load time for my visitors.
What's your routine these days? Are you squeezing those file sizes down as far as you can? Or, have you joined the movement that feels "all" of the world is on broadband? Are you compressing using 72 or 96 dpi?
About 5 months ago, we noticed that our download times were much higher than we would like - about 2.5 to 3.0 seconds according to Google Webmaster Tools.
This was because of the small pictures that were uploaded in our news stories. I started to minimize their file size using Photoshop's Save for Web feature, and the loading time is now under 1.0 seconds.
Our site caters to a worldwide audience, many of which do not have broadband, so this was very important to us.
So yes, download speed optimisation is still just as important as it always was - it's just that the goalposts have moved.
We compress at 72dpi, mainly because we are on Macs.
DPI has nothing to do with compression or the actual size of the image unless you're scaling images using inches. If you have a 300x300 pixel image, whether you set it to 72 DPI or 1000 DPI it is still the same image, visually and in file size. They will be identical pixel for pixel. DPI only sets the default scale, mostly for printing purposes.
Until recently my local newspaper still insisted that all images submitted to them be at 300 DPI, because "web quality" 72 DPI - such as an 8MP image from a professional Canon DSLR, which defaults to 72 DPI - is supposedly no good, which is laughable. My only guess is that they had a lot of people scanning images at really low resolution; at least I hope that is the case, but that assertion certainly doesn't help the person reading it understand the differences. I often wonder how many people ruined images attempting to set the "right" DPI.
For JPEGs, the file size of the image is determined mainly by three things: the content, the pixel dimensions and the amount of compression. The "Save for Web" feature in most programs usually defaults to about 70 to 75% quality (compression) and sets the DPI to 72, but the DPI setting is really irrelevant. It's the quality (or compression) that matters.
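A minimal sketch of the point above, with illustrative numbers: DPI is just metadata that sets a print scale, so changing it alters the implied print size, never the pixel data or the file size.

```python
# DPI only determines how large the image would be rendered on paper;
# the pixel grid itself is untouched by the setting.
def print_size_inches(pixels_wide, pixels_high, dpi):
    """Physical size a printer would render this image at the given DPI."""
    return (pixels_wide / dpi, pixels_high / dpi)

# The same 300x300-pixel image, tagged with two different DPI values:
print(print_size_inches(300, 300, 72))    # roughly 4.17 x 4.17 inches on paper
print(print_size_inches(300, 300, 1000))  # 0.3 x 0.3 inches on paper
# On screen, both are still exactly 300x300 pixels, byte for byte.
```

Either way, the browser ignores the DPI tag entirely and draws the same 300x300 pixels.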
No URL's please, see TOS [webmasterworld.com]
[edited by: limbo at 2:45 pm (utc) on Mar. 10, 2008]
Importantly, it seems to me, if you get the text appearing first so there's something to look at and read, it's not so appalling if the photos take a bit more time to appear as the page completes. Hope my visitors agree!
I still optimize but not at the expense of quality. Now my typical images are 450x600 and typically 75K and I really don't worry about bandwidth costs. If you're worried about bandwidth you need to find a new host.
Advanced JPG Compressor by WinSoftMagic does a slightly better job than Photoshop; it's especially cool that it can compress partial areas of an image while leaving other parts crisp.
Advanced GIF Compressor by Creabit
Similar name, no relation. To find it search for "advanced gif compressor". Does batch jobs, lossless compression. Easy interface.
No URL's please, see TOS [webmasterworld.com]
[edited by: limbo at 2:35 pm (utc) on Mar. 10, 2008]
Usually it's on a per image basis, particularly for photos. I tend to have my JPG compression set to 75% - this, on the whole, cuts out a large chunk without too much loss - much more than that and the image starts to look too compressed.
That said I'd rather have a slightly larger file size over one that looks shonky - like it's been said, bandwidth is not so much of an issue and for a beautiful design I make 'em wait ;)
I do source-order layouts and always assign widths and heights for images, so folks can still use the page while they wait for the images to come in. Also, I've spread my images over a couple of sub-domains, since browsers (IE included) will open parallel connections to multiple domains. That made a huge difference. The homepage for my main site loads the content within one second, and the images are in within three (on broadband). This is a fairly graphics-intensive page by my normal standards, but it seems to be working fine.
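A rough sketch of the two techniques in the post above (the filenames and sub-domains here are placeholders, not real URLs):

```html
<!-- Explicit width/height lets the browser reserve the image's box and lay
     out the text before the file arrives, so the page is usable while images
     trickle in. Spreading files over two sub-domains lets the browser fetch
     them in parallel. -->
<img src="http://img1.example.com/header.jpg" width="450" height="600" alt="Header photo">
<img src="http://img2.example.com/chart.gif" width="300" height="200" alt="Traffic chart">
```

Without the dimensions, the browser has to reflow the page each time an image finishes downloading.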
I heard that there are programs that manage compression better than Photoshop. Any suggestions?
I use Fireworks from Adobe (used to be Macromedia). I believe I've been using it since it first came on the market. In the beginning, from my own personal experience, none of the others could compare to Fireworks' image optimization for the web. I could usually shave off another 25-30% from most images that were sent to me that were already "optimized for the web" through Photoshop or some other program.
These days though, most of the programs are up to snuff. You just need to know how far you can go and which settings do what. I believe all Adobe products now have a "Save for Web" option. That's where you can "make or break" your images. With Fireworks, since it is a Web Image editing program to begin with, you are working at the Web level to start.
Just recently I was running a report on a site that was using images hosted on Flickr. There were 9 images on one page that came in at over 500k. That made up about 70% of the page's weight. Those images are not optimized for the web. If you were to open them in Fireworks or Photoshop, you'd see that they had a quality setting of 100.
So, what types of websites do you feel are victim to image bandwidth waste? I say we classify this under the "Green" movement and start cleaning up. :)
On a side note, another culprit is using .jpg where .gif would do, and vice versa. Or using .png. Of that 500k referenced above, I was able to trim away almost 350k of it by converting the images to their proper format. These were graphs with solid colors and white backgrounds - very simple. One was 123k to start as a .jpg. Afterwards it was 35k as a .gif with no visible loss.
What's your routine these days? Are you squeezing those file sizes down as far as you can? Or, have you joined the movement that feels "all" of the world is on broadband?
I'm smack dab in the middle. There's no question that optimizing images is beneficial for all parties involved. I used to be in the camp of those who would shave the file size down to an absolute minimum.
However, I now find myself spending less time shaving off a few k of file size because the savings don't justify the time it takes to find the perfect level of optimization.
From a webmaster perspective:
Optimized images = less storage, lower bandwidth, quicker page delivery, lower server load, lower cost of page delivery. The cost benefit is insignificant on low traffic sites, but increases as traffic goes up. You especially can't ignore seek time/server load, which can make a huge difference in page delivery times. Again, not an issue with a low traffic site, but once traffic starts to climb, especially if you get big "spikes", it starts making a huge difference - I'd rather have the RAID array serving up pages than grinding away buffering images.
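A back-of-envelope sketch of how per-image savings compound with traffic - all of the numbers here are hypothetical, just to show the scaling:

```python
# How much transfer a given average image weight costs per month.
def monthly_transfer_gb(page_views, images_per_page, avg_image_kb):
    """Total image transfer per month, in gigabytes."""
    total_kb = page_views * images_per_page * avg_image_kb
    return total_kb / (1024 * 1024)

# At ~1,000 views/day (30,000/month), trimming images from 75 KB to 40 KB
# saves only a few GB a month...
small = monthly_transfer_gb(30_000, 9, 75) - monthly_transfer_gb(30_000, 9, 40)
# ...but at 1,000,000 views/day the identical optimization saves 1000x as much.
large = monthly_transfer_gb(30_000_000, 9, 75) - monthly_transfer_gb(30_000_000, 9, 40)
print(round(small, 1), round(large, 1))
```

The per-page work is the same in both cases; only the payoff changes, which is the poster's point about spikes and high-traffic sites.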
From a user perspective:
Un-optimized image = slower page loads (even with broadband, you can tell the difference), gives me the impression the webmaster is lazy/uninformed, results in me hitting the back button by reflex.
Also, if you leave the image large, then just "size" it in the html, it never renders as well as it would if you used a commercial editor to size/optimize an image. The better the monitor, the more visible the difference, and the monitors coming out in the last few years are very unforgiving of artifacts and poor optimization.
That's my 2 bits. Now I'm off to creatively destroy a couple dozen thin clients. Great way to finish a Monday.
Load times (particularly when considering mobile devices); bandwidth consumption; server / network load; drive consumption; and efficiency of the site in general are all key reasons to optimize the hell out of each and every image bar none.
I switched to exporting .gif's in Fireworks "web adaptive" which is a little larger, but absolutely beautiful at all screen resolutions. It puts a little bit of a "Glamour Shots" haze over some graphics, but looks great otherwise.
The cost benefit is insignificant on low traffic sites, but increases as traffic goes up.
Good point. This is one of the reasons I optimize everything, images, HTML, code, etc...
Larger images also mean that the server itself has to spend more time both reading from the drive and staying connected to the client, meaning that large images can have an effect on future hardware costs. Efficiency is hugely important where computers are concerned.
Since you can generally reduce the file size of any image without losing visual quality, why wouldn't you? It really doesn't matter how fast your audience is connected.
DPI doesn't matter, but screen resolution might
Where the DPI issue comes into effect is not what setting you have in Photoshop (as coalman says, that only matters when you print), but with the resolution on the user's monitor (as p5gal5 found). An imperfect image (whether due to jpeg artifacts or native imperfections like bad focus) can look okay at a relatively high resolution but terrible at a relatively low resolution (by "relatively" I mean relative to the physical size of the screen, not some particular number of pixels). On occasion, I'll bump the resolution of my monitor back down and see how things look when they are, relatively speaking, larger than I would normally view them.
So the brief version of the foregoing is that if you have your monitor maxed out for resolution, you might take a look at your most important images with the resolution backed down a couple of steps. Sometimes they still look great and sometimes not.
basic photo optimization for the lazy
If I have a photoset that has similar characteristics and/or I don't care if they look perfect, I batch process with Irfanview. Results are not perfect, but in most cases, pretty similar to an image-by-image Photoshop optimization.
I can batch run
- lossless auto-rotate based on EXIF info
- resize/resample with a set jpeg compression
- make slight adjustments to exposure
- sharpen (0-100)
Sharpening should always always always be the last manipulation you do to a jpeg.
I could probably batch process much better in Photoshop, but Irfanview would have finished its batch job about the time that the Photoshop splash screen would be crediting Seetharam Narayan for his work....
But if you want serious lifting (HDR, shadow/highlight adjustment and reasonably good sharpening), Photoshop is the bomb.
You can arrange them in a grid with some generous "whitespace" between them, then use that image as a background, showing the graphic you need by positioning the background.
It's basically one image (large in dimensions) that has all the little bits combined in it. The whitespace will compress nicely, so no need to worry there.
Ask is using this on their homepage quite beautifully.
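A minimal sketch of the sprite technique described above (the filename `sprite.png` and the icon offsets are made-up examples):

```css
/* One combined image holds several small graphics; background-position
   shifts the visible "window" to the icon you want. One HTTP request
   instead of one per icon. */
.icon        { width: 16px; height: 16px; background: url(sprite.png) no-repeat; }
.icon-home   { background-position: 0 0; }        /* first icon in the grid */
.icon-search { background-position: -32px 0; }    /* second icon, 32px to the right */
```

The negative offsets slide the big background image left/up so that the wanted graphic sits inside the element's box.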
Also agreed that with all current browsers, DPI on the image has no effect whatsoever. But as screen resolutions may increase over the next (many) years, perhaps at some point there is sense in setting it to something reasonable.
Yes, I've wondered about screen resolutions becoming significantly higher; I guess that if so, browsers will have to do something, or many photos on older pages could look horribly tiny.
As to optimising: I'd only really thought about individual pages and bandwidth - good points about the whole site slowing down if many pages with larger-than-necessary images are accessed at once.
- The "necessary" size will depend partly on whether the images are rather throwaway things, or whether good-looking images are important, as on photography sites.
[edited by: Josefu at 7:33 am (utc) on Mar. 11, 2008]
Until recently my local newspaper still insisted that all images submitted to them be at 300 DPI, because "web quality" 72 DPI - such as an 8MP image from a professional Canon DSLR, which defaults to 72 DPI - is supposedly no good, which is laughable.
It is kind of laughable in the digital paradigm, and yes, your newspaper is technically incorrect - digital bitmap image quality can only really be measured in absolute pixel dimensions. But in the print medium, DPI is a factor. DPI (at the final printed size) is a legitimate requirement for an image, as the output resolution of the printed image is relative to the halftone line screen (and this also has a bearing when importing an image into a DTP package - images marked as 72dpi can display and print incorrectly). And yes, people sending in low-resolution images are the cause of these kinds of requests. The words "I copied this from my web site and attached it in an email" are enough to freeze the blood of most print professionals.
To combat this, the 300dpi figure has become a standard, as 150 lines per inch is a standard colour magazine halftone screen. Even for a newspaper, which probably only prints at between 55 LPI and about 75 LPI (that's 'L'PI, for lines per inch), the resolution of the image only needs to be 1.5 to 2.5x the line screen - so 150dpi (at the final printed size) should do the trick.
So really, all parties are correct. It's just a question of which paradigm we are referring to and at what stage of digital image processing we are at.
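The rule of thumb from the print discussion above can be sketched as a tiny calculation (the factor of 2.0 is one common choice within the stated 1.5-2.5x range):

```python
# Image resolution needed at final printed size, given the halftone
# line screen of the press.
def required_dpi(lpi, factor=2.0):
    """DPI needed at final printed size for a given line screen (LPI)."""
    return lpi * factor

print(required_dpi(150))   # 300.0 - the standard colour magazine figure
print(required_dpi(75))    # 150.0 - plenty for a coarse newspaper screen
```

This is why "300 DPI" became the blanket request: it covers the most demanding common case (a 150 LPI magazine screen) even though a newspaper needs far less.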
[edited by: encyclo at 8:41 pm (utc) on Mar. 23, 2008]
[edit reason] fixed typo [/edit]