Why HDR Matters

If you read the reviews here at Cineluxe with any frequency, you've probably noticed how often we refer to HDR (high dynamic range) video. By now, it's a term you're almost certainly familiar with. But if you're not really sure what it means, you can be forgiven, because most of the standard marketing materials are confusing and misleading.

 

Here's a perfect example, typical of what most TV manufacturers use to convey the advantages of HDR. Look at that dull, washed-out image on the left. Marvel at how it pales in comparison to the vibrant image on the right side of the screen. See how much better HDR is?

[Image: a typical SDR-vs.-HDR marketing comparison]

There's just one problem with this. This entire picture is rendered in standard dynamic range (SDR). That vibrant, lifelike image on the right? Your old, non-HDR display could almost certainly render it with no problem. The image on the left? It's artificially toned down and muted. This analogy isn't really helpful. And mind you, I'm not knocking the graphic artist who made this particular example. The entire electronics industry seems content to rely on some variation of it in every piece of marketing material promoting the advantages of HDR. I'm simply saying that if this is the only sort of comparison you've seen, you're right to be skeptical.

 

So, how is one to understand the actual differences between SDR and HDR video? One easy way is to visit your local tech expert, be it a custom integrator or an electronics store you trust, and ask for a demo.

 

But you can also understand it with just a little math.

 

In short, the SDR video we've grown accustomed to for the past few decades, through DVD, HDTV, Blu-ray, and even non-HDR 4K, uses 8 bits of data to represent each primary color: red, green, and blue. What this means is that you can have 256 different shades of each of those colors, which are then combined to create the entire visual spectrum. 256 shades of red, 256 shades of green, and 256 shades of blue combine to create nearly 17 million total shades that can be displayed on an SDR screen, or captured in a video format like Blu-ray.

 

HDR, by contrast, relies on 10-bit (or even 12-bit) color. To grasp what a monumental increase that is, consider that 10-bit color allows for 1,024 different shades each of red, green, and blue, which when combined result in over a billion different shades onscreen.
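If you'd like to check that math for yourself, a few lines of Python will do it. (This is just my own back-of-the-envelope illustration, not anything pulled from a video spec.)

    # Shades per channel at each bit depth, and the total number of colors
    # you get by mixing one red, one green, and one blue value per pixel.
    for bits in (8, 10):
        shades_per_channel = 2 ** bits             # 256 for SDR, 1,024 for HDR
        total_colors = shades_per_channel ** 3     # red x green x blue combinations
        print(f"{bits}-bit: {shades_per_channel:,} shades per channel, "
              f"{total_colors:,} possible colors")

Run it and you get 16,777,216 colors at 8 bits (the "nearly 17 million" figure) and 1,073,741,824 at 10 bits (the "over a billion").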

 

Here’s a visualization of the difference between 10-bit and 8-bit, when limited to the blue channel alone:

[Image: 10-bit vs. 8-bit gradient, blue channel only]

And grayscale, which represents every step along the way from pure black to pure white:

[Image: 10-bit vs. 8-bit grayscale gradient, from pure black to pure white]

Again, you're seeing these images presented in SDR, but hopefully they convey the point that 10-bit video, and hence HDR, allows for more subtle variation in color and grayscale, which means you see more detail in the shadows of darker images (or darker areas of a complex scene), and more variation in the highlights of brighter images (or brighter areas of a complex scene).
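If you're curious why those extra shades matter so much for smooth gradients, here's a quick sketch along the same lines (again, my own illustration): it takes a perfectly smooth black-to-white ramp stretched across a 4K-wide frame, quantizes it to 8 bits and then to 10 bits, and reports how wide each band of identical pixels ends up.

    # One sample per pixel across a 4K-wide frame
    width = 3840

    for bits in (8, 10):
        levels = 2 ** bits
        # Quantize each pixel of the smooth ramp to the nearest representable shade
        shades_used = len({round(i / (width - 1) * (levels - 1)) for i in range(width)})
        band_width = width / shades_used           # pixels stuck at the same shade
        print(f"{bits}-bit: {shades_used:,} shades, "
              f"bands about {band_width:.1f} pixels wide")

At 8 bits, each shade has to cover a band roughly 15 pixels wide, which is the kind of stair-stepping you can sometimes spot in skies and other smooth gradients. At 10 bits, those bands shrink to under four pixels, and the steps become much harder to see.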

 

But that's not all. HDR also allows for greater image brightness, and more control over which areas of the image are dark and bright. Your old HDTV might be capable of delivering 300 nits (a standard unit of measurement for brightness), whereas many of today's better HDR-capable displays can easily deliver 1,000 nits or more. That doesn't necessarily mean the entire image is brighter, mind you, as if you just took your old HDTV and cranked the brightness control. Turn up the brightness on an old TV, and the blacks get washed out and turn gray. Turn up the contrast to compensate, and what you end up with is an image with stark blacks, bright whites, and not much in between.

 

A good HDR TV, on the other hand, can make a small area of the screen (a flashlight beam, for example) shine with all the intensity of the real thing, while keeping the shadows wonderfully and naturally dark, without robbing you of those all-important mid-tones in between.

If you'll allow me my own dubious analogy, think of it like this: imagine a piano with only 22 keys. The key on the far left is still low A, and the key on the far right is still high C, but there are only 20 keys in between, and they can only be played with the soft pedal depressed. Compare that imaginary hobbled instrument to the rich sonic output of an 88-key Steinway Model D concert grand played at full volume, and you can start to really wrap your brain around the differences between SDR and HDR.

 

The bottom line is that good HDR displays do a much better job of matching our eyes' (and our brains') ability to differentiate subtle differences in color and contrast, as well as the natural variations in brightness we experience out in the real world.

 

There is one other confusing aspect to all of this, though: the fact that there are competing HDR standards, which you may have seen referred to as HDR10, HDR10+, Dolby Vision, and Hybrid Log-Gamma. You don't really need to understand the differences between them to understand what HDR is and how it works, but we'll dig into those competing standards in a future post and explain what sets them apart.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he's a Pomeranian.
