The difference between ‘color gamut’ and ‘bit depth’

You’d be surprised at how often these two terms are used interchangeably in describing display performance. In fact, their recent misuse by a top display executive at a major consumer electronics company was one of the chief inspirations for this blog.

Basically, it boils down to this:

  • More bits means more colors can be displayed
  • More gamut means more colors can be displayed

Got it yet?

While factually accurate, the statements above clearly illustrate the source of many a gamut-vs-bit-depth misconception. Let’s dig a little deeper and define these terms.

First, it helps to understand the simplicity with which modern displays create the color picture you are looking at right now. Digital displays create all the colors you see by mixing just three primary colors, much like you did while finger painting in grade school: red plus green makes yellow, red plus blue makes magenta, and all three primaries combined make white.
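The finger-painting idea can be sketched in a few lines of code. This is a minimal illustration, assuming the usual 8-bit RGB convention where each channel runs from 0 (off) to 255 (full intensity); the `mix` helper is hypothetical, not a real display API.

```python
# A minimal sketch of additive color mixing, assuming 8-bit RGB triples
# where each channel runs from 0 (off) to 255 (full intensity).
def mix(*colors):
    """Additively mix RGB colors, clamping each channel at 255."""
    return tuple(min(255, sum(c[i] for c in colors)) for i in range(3))

red = (255, 0, 0)
green = (0, 255, 0)
blue = (0, 0, 255)

print(mix(red, green))        # yellow: (255, 255, 0)
print(mix(red, blue))         # magenta: (255, 0, 255)
print(mix(red, green, blue))  # white: (255, 255, 255)
```

Note that this additive mixing is actually the opposite of finger paint, which mixes subtractively, but the "three primaries make everything" intuition carries over.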

Bit depth

So, what is bit depth? Bit depth refers to the number of bits your computer uses to describe a specific color to your screen. A typical modern display has “8-bit” color depth, which is shorthand for “8 bits of data per primary color.” Since 8 bits can encode 256 distinct values, your computer can call for 256 distinct shades each of red, green and blue.

While 256 does not sound like a lot of colors, the combinations are actually quite impressive. Mixing 256 reds with 256 greens and 256 blues (256 × 256 × 256), in a massive expansion of the finger-painting analogy from above, creates nearly 16.8 million possible colors. Not too bad, right?
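The arithmetic here is worth spelling out, since it is where the famous “16.8 million” figure comes from: with b bits per primary, each channel has 2^b levels, and the channels combine independently.

```python
# The arithmetic behind the "16.8 million colors" figure: with b bits
# per primary, each of the three channels has 2**b distinct levels,
# and the three channels combine independently.
def colors_per_channel(bits):
    return 2 ** bits

def total_colors(bits):
    return colors_per_channel(bits) ** 3

print(colors_per_channel(8))  # 256 levels of red, green, and blue
print(total_colors(8))        # 16777216 -- the "16.8 million" figure
print(total_colors(10))       # 1073741824 -- over a billion for 10-bit
```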

Well, you might ask, how many different colors can I actually see? Turns out a number of studies have been conducted on this, and most researchers agree that humans can detect between 7 and 10 million unique colors (see: http://physics.info/color/). This means that even a measly 8-bit display should have plenty of color accuracy headroom above and beyond the performance of your eyes.

So, a more accurate way to describe bit depth would be: more bits mean that a greater number of distinct colors can be displayed.

Gamut

If bit depth refers to the number of distinct colors that can be displayed, where does that leave gamut? Perhaps the easiest way to think about gamut is as a measure of the range of colors a display can show. Your 8-bit display may be able to show 16.8 million colors, but that doesn’t tell you much about which colors. You see, not all reds, greens or blues are created equal. When your computer calls for a specific, deeply saturated red, for example, what you actually see is limited by the physical capabilities of the systems in your display.

How is gamut measured?

Color gamut performance is measured against a variety of standards, typically defined by groups like the National Television System Committee (NTSC) as a way of maintaining consistent color from capture to broadcast to the end viewer. Creative professionals like graphic designers and Hollywood cinematographers rely on these standards to make sure that their work looks how they intended it to across a variety of screens and print media. The most common gamut standard, developed by the NTSC in 1953 in anticipation of color television and typically referred to simply as “NTSC,” covers about 50% of what your eye can see. 58 years and a couple of generations of iPads later, we must be able to do better than that, right?

Unfortunately, unlike bit depth, the color gamut performance of even high-end, professional displays is still a far cry from the capability of your eye, which can detect wavelengths of light from 380nm to 740nm. Most high-end displays can only reproduce 25-35% of what your eye can see:

The above diagram, called a “CIE diagram” (CIE stands for Commission Internationale de l'Éclairage, or International Commission on Illumination), is an abstract, two-dimensional (missing luminance) mathematical chromaticity model. In layman’s terms, what you are looking at is the full range of colors that humans can perceive, represented by the horseshoe shape, and the subset of colors that HDTVs can display, inside the triangle. Note that the deepest reds, greens and blues are out of reach for HDTV.
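A rough sense of how these gamut comparisons are made: each gamut is a triangle on the CIE xy diagram whose corners are the display’s primaries, and triangle areas can be compared with the shoelace formula. The sketch below uses the published chromaticity coordinates of the 1953 NTSC primaries and the Rec.709 primaries used by HDTVs; note that area ratios on the xy diagram are only a crude comparison (they ignore luminance and perceptual uniformity), but they are the basis for the familiar “% of NTSC” figures.

```python
# A rough sketch of how gamut sizes are compared on the CIE xy diagram:
# each gamut is a triangle whose corners are the display's primaries,
# and its area is computed with the shoelace formula. The chromaticity
# coordinates below are the published values for the 1953 NTSC primaries
# and the Rec.709 primaries used by HDTVs.
def triangle_area(r, g, b):
    (xr, yr), (xg, yg), (xb, yb) = r, g, b
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

ntsc_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]  # R, G, B
rec_709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]    # R, G, B

ratio = triangle_area(*rec_709) / triangle_area(*ntsc_1953)
print(f"HDTV (Rec.709) covers about {ratio:.0%} of the NTSC gamut")
```

Running this gives roughly 71%, which matches the coverage figure commonly quoted for HDTV panels relative to NTSC.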

Putting it all together

Now we can put the two definitions together to clarify the oversimplification from the introduction:

  • More bits means more distinct colors can be displayed, within a range of colors that is defined by the display’s gamut
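The combined definition above can be made concrete with a toy one-channel example. This is purely illustrative, not how any real display pipeline is specified: think of the gamut as fixing the width of the range a channel can cover, and the bit depth as fixing how finely that range is sliced into displayable levels.

```python
# A toy illustration of how the two terms interact on a single channel:
# gamut fixes the width of the range, bit depth fixes how finely that
# range is sliced into distinct displayable levels.
def step_size(range_width, bits):
    """Spacing between adjacent displayable levels."""
    return range_width / (2 ** bits - 1)

# More bits: finer steps within the *same* range (smoother gradients).
print(step_size(1.0, 8))   # ~0.0039
print(step_size(1.0, 10))  # ~0.00098

# More gamut: a wider range sliced into the *same* number of steps
# (deeper saturated colors, but coarser spacing between them).
print(step_size(1.5, 8))   # ~0.0059
```

This is also why wide-gamut displays tend to pair with higher bit depths: stretching the same 256 steps over a wider range makes the spacing between adjacent colors coarser, which can show up as visible banding.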

For a really in-depth look at how color is displayed on your screen, check out Steve Patterson’s very informative post on this topic over at photoshopessentials.com.

This entry was posted in Color science, Terminology by Jeff Yurek.

About Jeff Yurek

I work at an advanced materials company called Nanosys that manufactures Quantum Dots for displays. In my work in the display business, I’ve noticed a lot of confusion around color science and image quality. I’ve started this blog as a way to get to the bottom of some of this stuff and share what I’ve learned. My aim is to synthesize information I’ve accumulated in order to provide a useful way for us to understand why great color is so important and exciting.

6 thoughts on “The difference between ‘color gamut’ and ‘bit depth’”

  1. Oh, terminological confusion is no surprise to me. I am a terminologist and I remember helping the Microsoft Expression team define their terminology in 2006/7. It was a good exercise for all involved, but certainly a bit painful. This is a wonderful example of how concepts must be dissected before they get defined. And one can never focus on one concept, it must always happen in relation to other concepts in the same conceptual area. Thanks for your contribution to clearer communication in your field.

  2. Very late reply, but I only recently found your blog and started reading it from the bottom up, because all of it seems interesting. As a reaction to:

    – More bits means more colors can be displayed
    – More gamut means more colors can be displayed

    I would personally put it like this:

    – More bits means a higher number of colors (within a gamut) can be displayed
    – More gamut means that colors more different from one another can be displayed

    • I like that phrasing as well. This still begs the question – WHICH different colors in the X million to 10 million estimate can humans see?

      Forgive me if I’m not understanding something, but it is simultaneously stated in the article that the CIE 1931 diagram is the range of Human Vision, as well as … Humans can only see a maximum of 10 million colors in the best case scenario…

      How many colors/shades TOTAL are in the CIE 1931 diagram? And can every human just see different colors/shades on that diagram, up to a theoretical maximum of 10 million?

  3. To cover the entire gamut, how much colour depth do we need?
    You said humans can see 7 million colours, but with 8-bit depth we can display 16 million colours.
    Does that mean the other colours are wasted?

    When 8 bits is enough, why do we need 10-bit and 12-bit?

  4. Thank you very much for your explanation.
    Just to clarify my mind: I guess that although the CMYK colour space has more bits per pixel (32) than an RGB image (24), because of its reduced gamut versus an RGB gamut, it has fewer colours to display?
    So, more bits does not always mean a better-quality image. Taking into account that for printing, the extra channel (K, or black) is necessary to accurately represent the images.
