HDR10 vs HDR10+: How Do They Compare?



HDR10 vs HDR10+: 5 Must-Know Facts

  1. HDR makes no changes to the intrinsic properties of a display. It merely alters the signal to make images and videos appear brighter, starker, and more vibrant.
  2. HDR10 is the most widely implemented form of HDR.
  3. Some other competing HDR formats include Dolby Vision, PQ10, and HLG.
  4. Disney’s Tomorrowland (2015) was the first movie available for the public to watch in 4K HDR.
  5. 20th Century Fox — now 20th Century Studios — was the first production studio to prioritize HDR filmmaking studio-wide.

Just when you thought you finally understood HDR, HDR10 and HDR10+ had to go and throw a wrench in it. What are these HDR formats? Also, what sets them apart from one another? Honestly, there’s no short answer here. HDR10 and HDR10+ both have their pros and cons, and they both have their respective fans. When comparing HDR10 vs HDR10+, is it possible to say which is best overall? As a matter of fact, it most certainly is.

HDR10 vs HDR10+: Side-by-Side Comparison

                      HDR10                              HDR10+
Debuted               August 2015                        April 2017
Developer             Consumer Technology Association    Samsung and Amazon Video
Bit Depth             10-bit                             10-bit
Maximum Brightness    10,000 nits                        10,000 nits
Color Primaries       Rec. 2100                          Rec. 2020
Metadata Type         Static                             Dynamic

HDR10 vs HDR10+: What’s the Difference?

Below, we’ve outlined three of the key differences between HDR10 and HDR10+. While their names vary only slightly, it’s clear there’s more that sets these two apart than the simple addition of a plus symbol at the end. From color primaries and metadata types to developers, these are the key differences between HDR10 and HDR10+.

Color Primaries

Firstly, HDR10 and HDR10+ reference two different sets of color primaries. These primaries are defined in broadcast standards (ITU-R Recommendations) that also specify the other visual characteristics of an ultra-high-definition television display, including resolution, frame rate, bit depth, and more advanced specs that go above and beyond the average consumer’s understanding of color.

HDR10+ is specified against the Rec. 2020 color primaries, while HDR10 is often described in terms of Rec. 2100. The gap is smaller than it sounds: Rec. 2100 inherits the same wide-gamut primaries from Rec. 2020 and layers HDR transfer functions on top. Rec. 2020 is effectively the predecessor to Rec. 2100, which is interesting to note, as HDR10+ is largely considered the successor to HDR10.

Metadata Type

HDR10+ is a high-dynamic range video technology that adds dynamic metadata to HDR10 source files.

©Grzegorz Czapski/Shutterstock.com

Secondly, there’s a notable difference in metadata types between HDR10 and HDR10+. HDR10’s metadata is static, meaning it remains the same for the duration of the work. Whether it’s a movie, a TV show, or the latest console game, HDR10’s static metadata does not change from the opening frame through to the very end.

Contrarily, HDR10+ relies on dynamic metadata. Dynamic tone mapping differs significantly from static tone mapping. Instead of using one lone tone curve for the entirety of the film, television show, or video game, dynamic tone mapping applies unique tone curves on a scene-by-scene basis.

This seemingly gives HDR10+ a clear advantage over HDR10. However, some still prefer HDR10’s static metadata over HDR10+’s dynamic one. They claim HDR10+’s ever-shifting tone mapping can be distracting when applied poorly.
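The practical effect of static versus dynamic metadata can be shown with a toy tone-mapping sketch in Python (a deliberate simplification for intuition only; neither format specifies this exact math, and the nit values here are made up for illustration):

```python
# Toy illustration of static vs. dynamic tone mapping.
# Neither HDR10 nor HDR10+ actually uses this linear curve; real tone
# mappers are perceptual. The point is what the metadata tells them.

def tone_map(luminance_nits, scene_peak_nits, display_peak_nits=1000):
    """Scale a pixel's luminance so the signaled peak fits the display peak."""
    scale = min(1.0, display_peak_nits / scene_peak_nits)
    return luminance_nits * scale

# Static metadata (HDR10-style): one peak value for the whole movie.
movie_peak = 4000  # nits, fixed from first frame to last
dim_pixel_static = tone_map(200, movie_peak)        # dim scene gets crushed
bright_pixel_static = tone_map(3500, movie_peak)

# Dynamic metadata (HDR10+-style): each scene signals its own peak.
dim_pixel_dynamic = tone_map(200, scene_peak_nits=250)      # preserved
bright_pixel_dynamic = tone_map(3500, scene_peak_nits=4000)
```

With one movie-wide peak of 4,000 nits, the 200-nit pixel in the dim scene is scaled down to 50 nits on a 1,000-nit display; with a per-scene peak of 250 nits, the same pixel survives untouched at 200 nits.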

Developer

Thirdly, there’s a difference in developers between HDR10 and HDR10+. HDR10 was developed by the Consumer Technology Association in 2015. Abbreviated as CTA, this standard and trade organization encompasses nearly 1,400 different consumer technology companies around the world. Their HDR10 standard remains the most popular of all HDR variants on the market today.

Conversely, HDR10+ hails from a joint venture between Samsung and Amazon Video. First unveiled in 2017 on Amazon Prime Video, HDR10+ has since been championed by the HDR10+ Alliance. Made up of representatives from Samsung, 20th Century Fox, Panasonic, Warner Bros, and numerous others, the HDR10+ Alliance props up this HDR standard over HDR10. Moreover, they have made HDR10+ an open standard, which means it’s royalty-free to use, though there is an annual certification and logo fee.

The History of HDR10

Back in 2015, HDR10 emerged as a natural next step after the debut of HDR-TV in 2014. Seen as the successor to SDR (or standard dynamic range) television, HDR-TV was created to improve the display signal quality without actually changing anything about its most intrinsic properties.

In other words, HDR makes colors look brighter, more distinct, and more vibrant without actually changing the display’s brightness, contrast, or other technical color capabilities. It accomplishes this through the alteration of luminance and colors in the actual signal itself.
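That “alteration of luminance in the signal” happens through a transfer function: both HDR10 and HDR10+ encode absolute luminance with the SMPTE ST 2084 perceptual quantizer (PQ). Here is a sketch of the PQ encoding side (the inverse EOTF), using the constants the standard defines:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> a
# normalized 0..1 code value, later quantized to 10 bits in HDR10/HDR10+.
M1 = 2610 / 16384          # constants defined by ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10_000, 0.0), 1.0)  # PQ tops out at 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# SDR reference white (100 nits) lands around halfway up the PQ scale,
# leaving the upper half of the code range for highlights.
print(round(pq_encode(100), 3))     # ~0.508
print(round(pq_encode(10_000), 3))  # 1.0
```

This is why HDR can look so much punchier without the panel itself changing: the signal devotes far more code values to shadows and highlights than SDR’s gamma curve does.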

Subsequently, HDR10 was born. An improvement upon the basic concept of HDR, HDR10 is an open set of HDR standards and algorithms agreed upon by the Consumer Technology Association. It forgoes backward compatibility with SDR-TV, and its static metadata gives content creators no way to optimize content for an individual consumer’s display capabilities.

HDR10 is an open high-dynamic-range video standard.

©Grzegorz Czapski/Shutterstock.com

With HDR10, you simply get what you get. The specific configuration will remain the same throughout, regardless of the true capabilities of a consumer’s television, desktop, or phone. Today, HDR10 is more popular and more widely accepted than any other HDR format.
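Concretely, the “configuration” an HDR10 stream locks in is a small block of static metadata: the mastering display characteristics (SMPTE ST 2086) plus two content light levels, MaxCLL and MaxFALL. A sketch of those fields (simplified; the names follow the specs’ terminology, not any particular library’s API, and the sample values are illustrative):

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """The once-per-stream values an HDR10 source signals to a display.
    Simplified sketch; real bitstreams carry these in SEI messages."""
    # SMPTE ST 2086 mastering display metadata
    mastering_peak_nits: float   # e.g. 1000 or 4000
    mastering_min_nits: float    # e.g. 0.005
    # Content light levels
    max_cll: int   # Maximum Content Light Level: brightest pixel, in nits
    max_fall: int  # Maximum Frame-Average Light Level, in nits

# One object describes the entire movie -- the display's tone mapper
# sees these same numbers for every scene, bright or dark.
movie = HDR10StaticMetadata(
    mastering_peak_nits=4000, mastering_min_nits=0.005,
    max_cll=3500, max_fall=400,
)
```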

It has support from numerous computer and television manufacturers (including Dell, Samsung, LG, Sharp, Vizio, and Sony), not to mention the most popular next-gen video game consoles and streaming services. Its closest competitors are HDR10+ and Dolby Vision, both of which offer dynamic metadata, and HLG, which delivers a bit of SDR backward compatibility.

The Emergence of HDR10+

As we have seen thus far with HDR10 vs HDR10+, the latter has a number of important improvements. This is even more true in comparison to basic HDR, over which HDR10+’s advantage is greater still. First introduced by Samsung and Amazon Video in April of 2017, HDR10+ immediately turned heads for its distinct improvements over HDR10.

Not only does HDR10+ add in that dynamic metadata we discussed, but it also has the ability to adjust brightness levels as high as 10,000 nits. With more than a billion colors in its palette and support for as much as 8K resolution, HDR10+ is a clear step ahead.

HDR10+ Certification Program

While HDR10+ is a free-to-use open standard, manufacturers and content creators still have to pay an annual fee in order to receive official certification and join the HDR10+ logo program. To earn that certification, one must have their content or device reviewed by an authorized test center, which runs it through the official HDR10+ certification program, at which point it either passes or fails.

The HDR10+ Alliance has allowed this particular HDR format to thrive (even in a niche field crowded with competitors). With backers ranging from Samsung and Amazon Video to Panasonic, Warner Bros, 20th Century Fox, and Universal Pictures, HDR10+ has seen less widespread adoption than HDR10 but nevertheless continues to spread throughout the industry. On paper, it is superior to HDR10 and comparable to Dolby Vision. Plus, it’s royalty-free. What’s not to love about that?

HDR10 vs HDR10+: Pros and Cons

Pros of HDR10:
  1. The most widely accepted HDR format
  2. Supported by the Consumer Technology Association

Cons of HDR10:
  1. No dynamic metadata on a scene-to-scene basis
  2. No backward compatibility with SDR

Pros of HDR10+:
  1. Open-source and royalty-free
  2. Dynamic metadata that preserves content creators’ artistic intent better than static

Cons of HDR10+:
  1. Specified against the older Rec. 2020 rather than Rec. 2100
  2. Less widely accepted than HDR10

HDR10 vs HDR10+: Which Is Better?

So, when all is said and done, is there an obvious winner between HDR10 and HDR10+? Well, let’s think about what we know thus far. We saw how their specs lined up against one another, and we also saw their key differences. Plus, we read how each format came to be and also what their most notable pros and cons are. With all this in mind, who comes out looking better? Truthfully, it has to be HDR10+. Here’s why.

Firstly, HDR10+ has dynamic metadata. This matters because it allows content creators and filmmakers to set new HDR parameters on a scene-to-scene, and even a frame-to-frame, basis. Secondly, HDR10+ makes better use of brightness. While both formats technically support peaks of up to 10,000 nits, HDR10 content is typically mastered at 1,000 to 4,000 nits, and HDR10+’s scene-by-scene metadata helps displays render those bright moments more faithfully.

Lastly, HDR10+ is backed by an alliance full of film and production studios, whereas HDR10 comes primarily from technology manufacturers. This means HDR10+ has drastically more creatives in its corner. Ultimately, these factors are what make HDR10+ the superior HDR format.

HDR10 vs HDR10+: How Do They Compare? FAQs (Frequently Asked Questions) 

Is Dolby Vision the same thing as HDR?

Dolby Vision is a form of HDR, similar to HDR10+, that utilizes dynamic metadata to optimize the visuals of the work. Compare this to HDR10, which uses static metadata.

Is Dolby Atmos the same as HDR?

No. Dolby Atmos is an audio technology; it pertains to sound quality, not visual quality.

What is HDR?

HDR stands for high dynamic range. It describes the process of enhancing the brightness, contrast, and color gamut of a displayed photo or video without physically changing the actual brightness, contrast, or color capabilities of the display itself.

Is 4K the same as HDR?

4K and HDR differ in the sense that 4K describes the number of pixels in a display and HDR describes the process of improving a display’s visuals.

Is HDR10+ an improvement over HDR10?

HDR10+ improves upon HDR10 by adding in dynamic metadata instead of static metadata. This allows content creators to customize the HDR display on a scene-to-scene or frame-to-frame basis.
