Let's Talk Monitors: What Makes a Good Display

Monitors are a hugely important part of color correction. While we do have scopes (essential in balancing scenes and maintaining legal levels), we need to be able to trust what our eyes see when building looks and making creative decisions. Unfortunately, there is a lot of bad information out there when it comes to monitors for budding colorists. We'll try to sort through what makes a great monitor, what you should be saving for, what you'll be getting instead, and basic calibration over the course of a few articles.

This discussion is aimed at readers new to the world of mid and high end displays, not those of us with a calibrated Dolby, Sony, Cine-tal, Flanders, Hd2line, etc.

What Makes a Good Grading Display?

There are many important considerations to take into account when looking for your next display. Each of these points is extremely important, and sacrificing on any of them will impact the images you are working with and ultimately your final output.

Bitdepth

Bitdepth is one of the most important considerations when choosing a budget display. Fortunately, the rule is simple: never buy anything with less than a 10bit panel.

To explain the difference extremely briefly, 8bit means a video codec or display uses 256 shades per color channel, for a total of roughly 16.7 million colors. That sounds like a lot until you compare it to a 10bit panel (or codec), which uses 1,024 shades per channel for roughly 1.07 billion colors. It's quickly apparent why a 10bit panel is so superior to 8bit. This is especially true when working in codecs that are 10bit (or greater) - an 8bit monitor would be incapable of displaying all the colors!
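If you'd like to check the arithmetic behind those figures yourself, it falls straight out of the bit depth (a quick Python sketch, not from the original article):

```python
# Sanity-check the color counts above: each RGB channel gets 2**bits shades,
# and the total palette is that number cubed.
for bits in (8, 10):
    shades = 2 ** bits   # 256 for 8bit, 1,024 for 10bit
    total = shades ** 3  # ~16.7 million vs ~1.07 billion colors
    print(f"{bits}bit: {shades:,} shades per channel, {total:,} colors")
```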

One result of the reduced color count is the introduction of banding in your images. See the below image for an example:

Compare the 8bit images (top) with their 10bit counterparts (bottom). If you look closely, you can see the tiny bars that appear in the 8bit palettes. These are introduced because there aren't enough colors to accurately display the whole spectrum. Compare this to the smooth gradient the 10bit images display with their larger number of colors. This is a simulated comparison, but fairly accurate as to what you might see when comparing an 8bit and a 10bit display.

There are some monitors which boast about their "10bit processing." Generally, these use clever electronics (typically dithering) to mask banding and other 8bit issues while still using an 8bit panel. This does not overcome the color difference. If a monitor has a true 10bit panel, the manufacturer is going to proudly display that fact.

Equally important to having a display capable of 10bit is ensuring your connection and video card are capable of sending a 10bit signal. If a single part of the chain is only capable of 8bit, then the output will be 8bit.

DVI is (generally) not capable of 10bit (though a few monitors have an implementation); DisplayPort and HDMI are. NVIDIA GeForce video cards, while able to output 10bit in DirectX mode, are effectively 8bit in OpenGL applications like Premiere. NVIDIA Quadro cards offer full 10bit color buffers. AMD is similar, with their consumer Radeon cards only outputting 8bit; their pro-line FirePro and FireGL cards are a must to achieve full 10bit capability.

Gamut

Standards-compliant gamut response is the single greatest argument for purchasing a pro display. These high-end displays offer built-in, calibrated switching between several color standards. This flexibility means you can proof your output in a variety of situations and delivery formats.

There are a few standards important to color grading that you should be familiar with. BT.709 (often referred to as Rec.709) is the standard for color and lightness reproduction in HD video. sRGB shares BT.709's primaries and white point (though its transfer function differs slightly). BT.601 covers the same ground for SD. DCI P3 governs digital cinema projection and is currently not achievable outside of the highest end monitors and projectors.

Notably absent is AdobeRGB. AdobeRGB was designed to reproduce CMYK print colors on an RGB display and greatly expands the number of colors compared to sRGB/BT.709 (roughly 50% of the CIE color space versus 35%). This is ideal for photographers, who can view their images close to how they will appear once printed. In film/video, however, low- and mid-range wide-gamut displays can cause trouble for calibration and previewing of Rec.709 material. Images graded on a wide gamut like AdobeRGB can appear washed out in Rec.709. This does not mean you should avoid wide-gamut displays, merely ensure you are coloring within a properly calibrated gamut for your desired output (probably Rec.709).
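To make "wide gamut" a little more concrete, here are the CIE 1931 xy chromaticity coordinates of each gamut's primaries as published in the respective standards (a reference sketch added here, not taken from the article). The biggest practical difference is how far out the green primary is pushed:

```python
# CIE 1931 xy chromaticities of the primaries for the gamuts discussed above.
# sRGB and Rec.709 share the same primaries and D65 white point.
PRIMARIES = {
    #                 red             green           blue
    "sRGB/Rec.709": ((0.640, 0.330), (0.300, 0.600), (0.150, 0.060)),
    "Adobe RGB":    ((0.640, 0.330), (0.210, 0.710), (0.150, 0.060)),
    "DCI-P3":       ((0.680, 0.320), (0.265, 0.690), (0.150, 0.060)),
}

for name, (r, g, b) in PRIMARIES.items():
    print(f"{name:13s} R{r} G{g} B{b}")
```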

High-end displays offer built-in LUTs and calibration to easily switch between these and other color specs. Low- and mid-range displays often have generically named modes such as "Cinema" or "Sports" which may or may not get close to one of these standards. Many new midrange monitors do sport specific sRGB or Adobe RGB modes, but these should only be considered starting points for calibration (as with any display). The closest you'll find to one-touch Rec.709 in the consumer environment is the Movie Mode in THX-certified displays.

Gamma

Our eyes are tremendously sensitive to differences in luminance (how bright a surface appears). Research has found that we are capable of distinguishing luminance differences of as little as 1%. This sensitivity, however, is not linear and degrades at low luminance levels. Video encoding takes advantage of this by running values through a power function, spending the available data where human vision is most sensitive. Gamma is the exponent of that power function.
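As a rough sketch of that idea (a simplified model that ignores the short linear segment near black which real transfer functions such as BT.709 and sRGB include):

```python
# Simplified gamma model: the encoder raises linear light to the power 1/gamma,
# spending more code values on dark tones where our eyes are most sensitive;
# the display undoes this by raising the signal to the power of gamma.
GAMMA = 2.4  # e.g. a dark grading suite (see the list of standards below)

def encode(linear_light: float, gamma: float = GAMMA) -> float:
    """Scene-linear light (0-1) -> encoded signal (0-1)."""
    return linear_light ** (1.0 / gamma)

def display(signal: float, gamma: float = GAMMA) -> float:
    """Encoded signal (0-1) -> luminance emitted by the display (0-1)."""
    return signal ** gamma

# 18% grey in linear light lands near the middle of the encoded range...
print(round(encode(0.18), 3))           # ~0.489
# ...and round-trips back to its original value on the display.
print(round(display(encode(0.18)), 3))  # 0.18
```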

There are a number of standards in gamma that are extraordinarily important when it comes to selecting and setting up a display.

  • 1.8 is an older gamma setting used by OSX until 10.6 (a throwback to working with print content). It can be safely ignored for nearly all modern use cases.
  • 2.2 is the most widely seen gamma as it is the standard of sRGB inherited from NTSC specs. Windows and OSX (since 10.6) default to 2.2 as do most monitors. Many consumer displays are also set up with a gamma close to 2.2 though there is variance depending on manufacturers and picture styles.
  • 2.35-2.4 are often cited as the preferred viewing gamma for Rec709 (which has no declared standard). There have been efforts to encourage SMPTE to select a standard, but no consensus has been reached.
  • 2.6 is the standard for digital cinema projection.

In practice, you're going to be hard pressed to find a display that offers anything besides 1.8 and 2.2 without stepping up to professional models.

Once you do find a display capable of a wide range of gamma power functions, it can be tough to know which exponent to use, as BT.709 has no set standard. Typically, the deciding factor is the general ambiance of the room: a darker room calls for a higher power (the reason cinema projection in a completely dark theater uses gamma 2.6). A good rule is to use a value around 2.0 for daylight/uncontrolled lighting, 2.2 for dim lighting, 2.35-2.4 for a dark controlled room, and 2.5-2.6 for pitch black.
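Summarized as a lookup (the values are the article's rule of thumb; the helper itself is just an illustrative sketch):

```python
# Rule-of-thumb display gamma by viewing environment (values from the text above).
GAMMA_BY_ENVIRONMENT = {
    "daylight, uncontrolled lighting": 2.0,
    "dim lighting":                    2.2,
    "dark, controlled room":           2.4,  # 2.35-2.4
    "pitch black (cinema)":            2.6,  # 2.5-2.6
}

print(GAMMA_BY_ENVIRONMENT["dark, controlled room"])  # 2.4
```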

Luminance

Setting proper luminance is important for ensuring your gamma accurately displays true black and true white. Too high and your blacks will be washed out; too low and the image will be dark. Old standards called for a luminance level of 35 footlamberts (one footlambert equals roughly 3.426 candela per square meter, or "nit"). This works out to 119.92 cd/m2, though many displays are optimized for lower levels (100 is very common).
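The conversion is a single multiplication, so it's easy to double-check (a quick sketch):

```python
# 1 footlambert = ~3.4263 candela per square meter ("nit")
FOOTLAMBERTS_TO_NITS = 3.4263

old_standard_fl = 35
print(old_standard_fl * FOOTLAMBERTS_TO_NITS)  # ~119.92 cd/m2
```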

Consumer displays generally have far, far too much luminance. For example, a higher end Dell Ultrasharp (advertised as being aimed at graphics professionals) ships with 350 cd/m2 out of the box - nearly three times the recommended luminance. This stems from a marketing ploy to sell monitors. When a row of displays is sitting in a showroom, the brightest, most saturated, and most contrasty models will stand out and likely end up in the consumer's cart. The result is that many people are staring at monitors that are horrendously calibrated out of the box because a marketer realized bad color, contrast, and luminance sell.

True pro displays should come properly calibrated from the factory (but still require periodic recalibration like all monitors).

Color Temperature

Color temperature is the "color" of white (and therefore all) light on a display, lamp, etc. A higher color temperature shifts the color towards blue, resulting in a "cool" look (think overcast light), while a lower temperature shifts towards red, resulting in a "warm" look (think candles or incandescent light bulbs). Your eyes naturally adjust to the temperature after a few moments, but when switching between color temperatures it becomes readily apparent how drastic a difference in overall tone this setting can make.

Color temperature is measured in Kelvins (K). Next time you're at your local home improvement store, check out the lightbulb section. Many packages will list 2800K (very warm), 3200K (often called white), or 5600K (generally called daylight and about what you find outside on a sunny day). Displays are set with similar numbers. The standard for video in most markets is 6500K (D65).

Below, you can see these white balances charted out on a spectrum. D50 is known as "horizon light" as it closely matches the color temperature in the early morning/evening. D55 is "mid morning/mid-evening" and D65 is "noon daylight."
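For reference, these standard illuminants have published CIE 1931 xy coordinates (a quick reference sketch using the CIE values, not data from the article's chart):

```python
# CIE 1931 xy chromaticity coordinates of the daylight illuminants above.
WHITE_POINTS = {
    "D50 (horizon light)":       (0.3457, 0.3585),
    "D55 (mid morning/evening)": (0.3324, 0.3474),
    "D65 (noon daylight)":       (0.3127, 0.3290),
}

for name, xy in WHITE_POINTS.items():
    print(f"{name:27s} xy = {xy}")
```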

Practically speaking, you will be hard pressed to find a monitor that is not D65 out of the box. Televisions, however, are often set to a higher color temperature as it gives them an artificial "brighter" look that stands out on showroom floors.

Some nations, notably much of Asia, have a bluer preferred white point (Japan, for example, has traditionally used 9300K). This is only relevant when producing content for these markets or when importing displays directly.

Resolution

4K displays are finally trickling down to consumer availability. These monitors allow you to view your UHD content with little or no scaling - an enormous benefit when checking for resolution artifacts on 4K+ outputs. With that said, the additional cost and the sacrifices these displays make may mean the extra resolution isn't worthwhile for many users.

We have an equation to determine visible pixel size based on human vision and distance. Pixel structure is visible where adjacent image rows (scan lines) subtend an angle of one minute of arc (1/60°) or more - the angular discrimination limit for normal vision. The distance at which the row pitch subtends exactly one minute of arc is about 3400 times the distance between image rows (or, equivalently, 3400 divided by the pixel density in pixels per inch); sit closer than that and pixel structure starts to become visible (Digital Video and HD: Algorithms and Interfaces by Charles Poynton).

On 24" displays, one needs to sit approximately 18" away or closer to begin to resolve pixels at consumer 4K (3840x2160 or 183.58 ppi) resolution (considering 20/20 vision). In contrast, on a 1080p 24" monitor (1920x1080 or 91.79 ppi) pixels begin to be discernible at 37 inches or closer. A properly configured environment should have the monitor around 3 feet away (and certainly no closer) making 1080 the ideal balance in resolution cost vs. performance. At that distance, the 4K+ is wasted.

The practical benefits of 4K previews are limited by the spatial resolution of our eyes. 4K displays also require additional processing power, as they push four times as many pixels as the 1080p displays we've grown accustomed to. This means workstations need more GPU power or will suffer from less-than-realtime playback.

The colorist's display is ultimately about providing accurate gamut and gamma for making creative and stylistic decisions. In most scenarios, resolution plays little into these choices as color is the critical component. 1080p monitoring is more than sufficient in nearly all use cases at present. As 4K displays become common in consumer environments, this may change.

At the current time, I strongly recommend spending money on a display with better gamut/gamma and more controls over increased resolution. The exception to this is your GUI display which can benefit from more real estate to better see nodes, preview clips, and move about your timeline, but this upgrade should come after your primary.

Sizes

The proper size for a monitor is based on a combination of resolution, how close it sits to you, and, as always, budgetary concerns.

For desktop monitoring, 24" panels have become the standard for color critical displays. The size is an excellent balance between screen real estate and how much of a desk the display occupies. It's large enough that a client can take a peek if your suite lacks a dedicated client monitor, while allowing you to easily switch between your GUI and your previews. The size is common in both consumer and professional displays, making it easy to find a model that suits your needs.

17" displays are popular as an entry level size to the world of pro monitors. The size limits their use as a hybrid workflow/client panel (the extra real estate of the 24" gives you both some breathing room), but makes the monitors an excellent candidate for on set work.

30-32" panels have been dropping in price in both the consumer and pro areas finally making them a viable alternative. The larger size means placement must be carefully considered (typically above your GUI for your preview or wherever is comfortable if used as a client monitor) and desk use is mostly out of the question. Clients love the larger size, but you'll pay a premium for a solid panel.

42"+ panels are pretty much exclusively used as client previews. These are typically televisions from the popular Panasonic professional plasma line. These pro TV's can be upgraded with SDI cards and calibrated to achieve a fairly accurate display. Consumer televisions in this space should mostly be avoided due to the difficulty to calibrate, input options, and gamut/gamma tweaks manufacturers add.

Display Tech

The multitude of panel technologies is far beyond the scope of this already lengthy article. Any mid-range consumer display and up should be using a technology that's more than adequate for color critical work. These technologies include the various LCD/LED panel types (IPS, etc.), plasma, and increasingly OLED.

It should be noted that if you mix panel technologies (for example, a plasma client monitor and an OLED color critical panel), the screens can display different images even after being (nearly) perfectly calibrated. The differences in technologies make exact matches impossible, so it's important to define one monitor as "correct" (and maybe keep the client from having a chance to compare the two).

In Review

This should stand as a basic introduction on some of the points to consider when evaluating a color critical monitor. Future articles will discuss the basics of consumer options, professional options, and calibration.

UPDATE: Part two discussing "what you should be saving for" is now online.