High dynamic range (HDR) can elevate your monitor’s image to a level that makes it a joy to look at. But not all HDR is created equal.
Let’s discover the advantages and disadvantages of HDR and what to look for when picking out an HDR-capable display.
HDR vs. Non-HDR: Side By Side Comparison
| Feature | HDR | Non-HDR |
|---|---|---|
| Brightness | Max 1,000 nits | Max 100 nits |
| Color Gamut | DCI-P3 and Rec.2020 | Rec.709 |
| Color Depth | 8-bit, 10-bit, and 12-bit | 8-bit and 10-bit (very few use 10-bit) |
What’s The Difference?
Some of the technical jargon in the table above might not mean much to you right now, but by the time you finish this article, it will make a lot more sense.
To understand the advantages and disadvantages of HDR and non-HDR, we must first understand what both mean.
What is HDR?
You might have heard of high dynamic range (HDR) before. For a while, HDR was primarily a photography technique. On modern displays, though, the term mostly comes up in connection with metadata.
You might be asking yourself what metadata even is. Metadata is additional information sent along with the video that tells the display how to present the content correctly. There are three different standards when it comes to HDR metadata: HDR10, HDR10+, and Dolby Vision.
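To make "metadata" concrete, here is a simplified sketch of the kind of static metadata an HDR10 stream carries (HDR10 combines SMPTE ST 2086 mastering-display values with MaxCLL and MaxFALL light levels). The Python class below is purely illustrative; real metadata is embedded in the video bitstream, not a Python object.

```python
from dataclasses import dataclass

# Hypothetical sketch of HDR10-style static metadata. Field names and the
# example values below are illustrative, not taken from a real stream.
@dataclass
class HDR10StaticMetadata:
    max_mastering_luminance: float  # peak of the mastering display, in nits
    min_mastering_luminance: float  # black level of the mastering display, in nits
    max_cll: int   # Maximum Content Light Level (brightest pixel), in nits
    max_fall: int  # Maximum Frame-Average Light Level, in nits

meta = HDR10StaticMetadata(
    max_mastering_luminance=1000.0,
    min_mastering_luminance=0.0001,
    max_cll=1000,
    max_fall=400,
)
print(f"Tone-map toward {meta.max_cll} nits peak, {meta.max_fall} nits frame average")
```

The TV reads values like these and tone-maps the picture to fit its own panel's capabilities, which is why the same HDR stream can look different on different screens.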
The "dynamic range" part comes into play once the TV or monitor reads that metadata and displays the content accordingly. For example, HDR lets you see detail in both the dark shadows and the bright highlights of a single scene. With standard dynamic range (SDR), one or the other is usually blown out or crushed, so you cannot see the detail in it.
What is Non-HDR?
Non-HDR, also called standard dynamic range or SDR, is probably what you are used to watching your content with. While not as fancy as HDR, SDR has been the conventional dynamic range for quite some time, and for good reason.
As we discussed above, there are three different HDR metadata standards for which content can be mastered, and not every display is compatible with every standard. So, if you have a TV that supports Dolby Vision but not HDR10+, the colors might not look correct when you watch HDR10+-mastered content on that TV.
Wide Color Gamut
A TV with a wide color gamut is capable of displaying more saturated colors than a standard TV can. HDR is not strictly required for this, but it definitely helps. Note that a wider gamut does not mean the TV invents entirely new colors; it means the TV can reach more saturated versions of the colors within that gamut.
Like we said before, not all HDR is created equal, and this is especially true of color gamut. On lower-end HDR TVs, you will probably not notice any difference in gamut compared to a non-HDR screen.
A higher-end HDR TV can typically cover about 88.08% of the DCI-P3 color space, whereas an SDR screen covers less than 80% of DCI-P3. Lower-end HDR TVs cover around 75.30% of DCI-P3; that is not a bad score, but it is less than some higher-end SDR screens.
Color Depth
Color depth and color gamut are easy to confuse: gamut describes the range of colors a TV can reach, while depth describes how many distinct shades it can display within that range.
For example, a limited color gamut would stop a TV from accurately showing an apple’s red. In contrast, limited color depth would make the red gradients on that apple look uneven, with visible steps.
Non-HDR displays use 8-bit color, which gives them 256 shades each of red, green, and blue, or about 16.7 million possible colors. That number might seem significant until you look at 10-bit, which offers a whopping 1.07 billion colors for the display to use.
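The arithmetic behind those figures is simple: each pixel mixes one shade of red, green, and blue, so the total color count is the per-channel shade count cubed. A quick sketch:

```python
# Total displayable colors = (shades per channel) ** 3,
# where shades per channel = 2 ** bits_per_channel.
def total_colors(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel
    return shades ** 3

print(f"8-bit:  {total_colors(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10-bit: {total_colors(10):,} colors")  # 1,073,741,824 (~1.07 billion)
print(f"12-bit: {total_colors(12):,} colors")  # 68,719,476,736 (~68.7 billion)
```

Each extra bit per channel doubles the shades of red, green, and blue, so the total color count grows eightfold.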
Where you notice color depth the most is in gradients. TVs with a lower bit depth will show noticeable steps in a gradient, whereas higher bit depths smooth those steps out. Color depth is one area where higher-end and lower-end displays are roughly equal, since even budget-friendly HDR displays offer 10-bit color depth now.
Dynamic Range
Finally, we can take a look at dynamic range. This is where HDR TVs show the most significant differences. HDR content uses the display’s higher brightness capabilities to show lifelike highlights. Non-HDR screens struggle with this, blowing out the highlights or crushing the shadows. A TV with a high dynamic range can display dark shadows and bright highlights simultaneously without losing detail.
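Dynamic range is often quantified in photographic "stops," where each stop is a doubling of brightness. Using the peak brightness figures from the comparison table, and assuming an illustrative black level of 0.05 nits for both panels (an assumption, not a figure from the table), a rough sketch:

```python
import math

# Contrast range in stops = log2(peak / black level).
# The 0.05-nit black level is an illustrative assumption.
def stops(peak_nits: float, black_nits: float) -> float:
    return math.log2(peak_nits / black_nits)

print(f"HDR (1,000 nits peak): {stops(1000, 0.05):.1f} stops")  # 14.3 stops
print(f"SDR (100 nits peak):   {stops(100, 0.05):.1f} stops")   # 11.0 stops
```

Those extra stops are where the simultaneous deep shadows and bright highlights come from; in practice the gap is even wider because HDR panels often achieve deeper black levels too.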
This is a category in which HDR seems like a no-brainer winner. But, as always, it is not that simple. Some argue that camera technology has not yet reached the point where this much dynamic range is needed to see the full depth of the image. In our opinion, though, camera technology is advancing by leaps and bounds, so a year from now an HDR TV might be essential to enjoy content fully.
HDR vs. Non-HDR: 4 Must-Know Facts
- HDR displays can show greater color depth and a wider dynamic range than non-HDR displays.
- Not all HDR displays are created equal.
- Some lower-end HDR TVs show a smaller color gamut than higher-end SDR TVs.
- SDR TVs run at 8-bit color depth, while HDR TVs typically run at 10-bit or higher.
HDR vs. Non-HDR: Conclusion
HDR is one of the most exciting advancements in display technology over the last couple of years, but we know it is not for everyone.
For example, if you typically watch broadcast TV and the occasional movie, an HDR TV might be an unnecessary luxury. Most broadcast content is not being mastered for HDR right now, so it won’t make you the envy of the neighborhood this football season.
But, if you are a movie buff and love to have the cinema-at-home feeling, an HDR display might be in your near future. More and more movies on streaming platforms offer HDR support, and even some YouTube channels are starting to master their videos with HDR in mind.