
background color seems to dither even in 16-bit

What's going on?


zollerwagner

8:59 pm on Apr 6, 2004 (gmt 0)

10+ Year Member



I have a site using pure css for layout. The background of one div is set to #eef. The display is great on most systems I've checked it on. But in some systems the color gets moved toward purple and there is obvious dithering (measles). On one system (running win98, ie5.5sp2, colors set at 16-bit) the colors of the dithers are #f0f0ff and #e8e8ff.

I had thought that websafe color was irrelevant with a 16-bit display. Is that wrong?

What causes this and is there a solution?

tedster

11:57 pm on Apr 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The so-called "websafe" colors may not be the source of this problem -- or the solution.

1. Older browsers especially can get strange with background hex colors. The technical reasons are a minefield, but the short version is that background colors are not rendered by the same code as foreground image colors.

2. 16-bit color is NOT a subset of 24-bit color, it is its own thing completely. In fact, there is just a small overlap between the two color depths - about 12 colors that actually are a precise match.

3. Workaround: try defining a small gif in the solid color of your choice, and then using that bg.gif as a background-image, tiled to fill the div. In most cases where I've had 16-bit color problems, this approach fixed it.
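The workaround in point 3 might look something like this as CSS (a minimal sketch -- the filename bg.gif comes from the post above, but the selector name is illustrative):

```css
/* Tile a small solid-color GIF instead of relying on a hex
   background-color, so the background and any matching images go
   through the same color-reduction path on the user's machine. */
div.panel {
  background-image: url("bg.gif");  /* GIF saved as solid #EEF */
  background-repeat: repeat;        /* tile to fill the div */
}
```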

zollerwagner

12:43 am on Apr 7, 2004 (gmt 0)

10+ Year Member



Wow, that's enlightening, especially that the colors available barely overlap.

I suppose this means we also have to test our color choices at different color depth settings.

Thanks, Tedster!

tedster

6:46 am on Apr 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I just learned that there is NO overlap at all (except for black and white, I assume.)

There are no shared colors between high color (15- or 16-bit) and true color (24-bit) depths. 24-bit is the full palette, and this is the palette we use with design programs such as Photoshop. 8-bit is a subset of that 24-bit palette. The old 216-websafe palette is a subset of the 8-bit palette, identified for browser and operating system compatibility. But the 15-bit and 16-bit palettes are not subsets of the 24-bit palette; they are entirely distinct palettes.

Reference: Webmonkey Article [hotwired.lycos.com]

I've been alerting members here for a while that color-depth testing is an under-recognized but important need in website creation. This is especially true because designers usually work in 24-bit color (or 32-bit, which is the exact same number of colors plus some other info). They miss the fact that many machines ship with 16-bit as the default setting, and that never gets reset even when the graphics card could handle 24-bit.

You can get some ugly surprises, especially if you are trying to match background colors with colors in an image -- you know, trying to get that non-boxy "floating" illusion. Instead, nasty visible boxes can crop up when you just use hex colors. But not when you tile a gif instead. Then the same color shift will apply to all the images.
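For the curious, the color shift can be sketched numerically (Python, assuming the common RGB565 layout -- 5 bits red, 6 green, 5 blue -- and simple rounding; real drivers differ in how they round and re-expand, so your exact dither pair may vary):

```python
# Why #EEF can't survive a 16-bit display intact: quantize each 8-bit
# channel down to 5 or 6 bits, then re-expand to see what is shown.

def to_565(r, g, b):
    """Quantize 8-bit channels to 5/6/5 bits (nearest level)."""
    return round(r * 31 / 255), round(g * 63 / 255), round(b * 31 / 255)

def expand(q, bits):
    """Re-expand a quantized channel to 8 bits (rounded)."""
    return round(q * 255 / (2 ** bits - 1))

# The thread's color: #EEF == #EEEEFF == (238, 238, 255)
r5, g6, b5 = to_565(0xEE, 0xEE, 0xFF)
shown = (expand(r5, 5), expand(g6, 6), expand(b5, 5))
print(shown)   # (239, 239, 255) -- the nearest displayable color

# 0xFF maps back to 0xFF exactly, but 0xEE does not: it falls between
# two 5-bit levels, so a dithering driver alternates between neighbors.
lo, hi = expand(238 * 31 // 255, 5), expand(238 * 31 // 255 + 1, 5)
print(lo, hi)  # 230 239 -- two candidate "measles" colors; the exact
               # pair depends on the driver's expansion scheme
```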

DrDoc

8:20 pm on Apr 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



See, now, that doesn't make sense...

They state that 24-bit holds "all possible colors"...
So, if there are no overlaps between 15/16-bit and 24-bit, then 15/16-bit obviously offers colors not offered by 24-bit... which means that there are more colors available than those in the 24-bit palette.

Extremely good and informative article though!

tedster

8:37 pm on Apr 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it means no identical rendering of hex values between the two color depths.

DrDoc

8:59 pm on Apr 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think that's what they meant...

Moreover, the smallest step on the 24-bit scale (0.39216 percent) does not divide evenly into any of the values on the 15- and 16-bit scales -- well, there are two exceptions: black (0 percent) and white (100 percent). Other than those two, no colors are shared by the 8-/24-bit scale on the one side and the 15- and 16-bit scales on the other.

Because, whether or not the rendering is identical, the colors should match up with those on the 24-bit scale. They may not be evenly spaced, but they should be there... else, there are possible colors in addition to those on the 24-bit scale.

While that is true in real life, it is not possible in the digital world... So, either they match up, or they don't.

The computed value may end up somewhere in between the possible values which, technically speaking, means that the 15/16-bit palettes only contain two possible digital colors -- black and white. The other colors are "virtual", but not physically possible. So, for each of the two, there are two palettes -- the 15/16-bit computed scale (where all colors differ from those in the 24-bit palette) and the 15/16-bit actual scale (where the colors match up with those in the 24-bit palette, but are not evenly spaced internally).

Don't get me wrong... I found the article very informative... But there are not 22 "really safe" colors, there are only 8: #000000, #0000FF, #00FF00, #00FFFF, #FF0000, #FF00FF, #FFFF00, #FFFFFF. All other colors will experience the "box" problem when rendered in anything less than 24-bit. You may not notice it, depending on the settings on your monitor (brightness & contrast)... Or, you may be lucky enough to have the browser/OS compute a new value that is identical for both the hex code and the GIF.
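That arithmetic can be checked directly. A short sketch (Python, assuming each channel's displayed intensity is q/(2^b - 1)) confirms that only 0 and 255 survive a 5-bit channel exactly, and that 15-bit (5-5-5) and 16-bit (5-6-5) displays together share just the eight corner colors with the 24-bit palette:

```python
# Which 24-bit channel values are *exactly* displayable at a given
# channel depth -- i.e. where q/(2^b - 1) equals v/255 precisely.

def exact_values(bits):
    levels = 2 ** bits - 1
    return [q * 255 // levels for q in range(levels + 1)
            if (q * 255) % levels == 0]

print(exact_values(5))       # [0, 255] -- only black/full intensity
print(exact_values(6))       # [0, 85, 170, 255] -- 6-bit green adds two
print(len(exact_values(8)))  # 256 -- every 24-bit value, of course

# A 5-bit channel shares only {0, 255} with the 24-bit scale, so the
# colors exact on both 15- and 16-bit displays are 2*2*2 = 8 -- the
# corners of the RGB cube listed above.
```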