SDR vs. HDR: What’s the Difference, and Which is Better?

In recent years, advancements in television image quality have been relentless, and a growing number of competing technologies now promise video and image viewing with enhanced resolution and realism.

In particular, HDR is a next-generation display signal technology that creates a high degree of color clarity and contrast. It’s a big leap forward from the original SDR technology that has been around since the very first color TV.

Let’s break down what SDR and HDR are, and examine the key differences between them so you can evaluate what you need to get the best viewing experience.

What is SDR?

Standard Dynamic Range (SDR) is a display signal technology whose contrast and brightness limits were defined around the capabilities of cathode ray tube (CRT) displays.

SDR's representation of screen light intensity is determined by the color characteristics, contrast, and brightness of CRTs. SDR uses a gamma curve to encode light intensity information so that a video signal can reproduce it accurately within those limits.

The dynamic range is the span between the lightest and darkest parts of a displayed image. This light-intensity information, called luminance, affects the accuracy and quality of the displayed picture. If a single image contains both very dark and very bright areas, it is described as having a high dynamic range. SDR can only represent luminance within predetermined limits. The technical parameters inherited from CRT include:

  • A maximum luminance of 100 candelas per square meter. Luminance measures the amount of light that passes through, is emitted from, or is reflected off an object.
  • A black level around 0.1 candelas per square meter. The black level measures the level of brightness at the darkest part of an image or where no light is emitted.  
  • Use of the sRGB color space that is routinely used on computer screens and monitors. SDR may also use the closely related Rec. 709 color standard developed by the International Telecommunication Union (ITU) for HD television.
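
To make the gamma curve and these CRT-era limits concrete, here is a minimal sketch in Python. It uses a simple 2.2 power law and the nominal 100 cd/m² peak and 0.1 cd/m² black level from the list above; real SDR video uses the slightly more complex Rec. 709 and BT.1886 curves, so treat this as an approximation rather than a broadcast-accurate implementation.

```python
# A minimal sketch of SDR gamma encoding/decoding using a simple 2.2 power law.
# Real SDR video uses the Rec. 709 OETF and BT.1886 EOTF, which differ slightly;
# the 100 cd/m2 peak and 0.1 cd/m2 black level come from the CRT parameters above.

GAMMA = 2.2
SDR_PEAK_NITS = 100.0   # nominal SDR peak luminance (cd/m2)
SDR_BLACK_NITS = 0.1    # nominal SDR black level (cd/m2)

def encode(linear_light: float) -> float:
    """Compress linear scene light (0.0-1.0) into a gamma-encoded signal value."""
    clamped = max(0.0, min(1.0, linear_light))
    return clamped ** (1.0 / GAMMA)

def decode(signal: float) -> float:
    """Expand a gamma-encoded signal (0.0-1.0) back to displayed luminance in cd/m2."""
    clamped = max(0.0, min(1.0, signal))
    return max(SDR_BLACK_NITS, (clamped ** GAMMA) * SDR_PEAK_NITS)

if __name__ == "__main__":
    for code_value in (0.1, 0.5, 0.9, 1.0):
        print(f"signal {code_value:.2f} -> {decode(code_value):6.2f} cd/m2")
```

Because the gamma curve allocates more of the signal range to darker tones, it roughly matches how human vision perceives changes in brightness, which is why the same basic idea survives in HDR transfer functions.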

The History of SDR

The development of SDR follows the development of the cathode ray tube and television sets, going back as far as 1934.

Until HDR technology (discussed below) was introduced, SDR was the primary means of rendering light intensity on screens. SDR's fixed, CRT-derived parameters limit its usefulness on high-definition screens, where image quality becomes unpredictable and difficult to calibrate.

Applications of SDR Technology

SDR is primarily used to represent light in images and videos shown on a CRT display. Some forms of cinematography and photography also use this technology.

What is HDR?

High Dynamic Range (HDR) is a display technology that renders screen light intensity with a wide or high dynamic range.

Unlike SDR, HDR is capable of extremely bright and detailed contrast in image and video, with a color intensity that the older SDR technology cannot achieve. 

HDR technology enables screens and displays to utilize a higher quality image source. It can be used with moving and still images. The effect of HDR is determined by the specifications of the display such as its brightness, contrast, and color properties.

This newer approach to representing light exceeds the limits of SDR and the CRT technology it was built around, both of which have been superseded by significantly more advanced modern displays. HDR facilitates higher brightness and a wider color range, in line with the intent of the image or video creator.

High dynamic range (HDR) is capable of color range and depth that is simply unmatched by SDR.

How Does HDR Work?

HDR technology can reproduce light intensity with remarkable accuracy by quantifying the light used in an image and preserving key details that aid the reproduction of realistic-looking images. 

HDR uses an Electro-Optical Transfer Function (EOTF), a mathematical curve that converts the electronic signal values in the image or video input into the brightness your TV or monitor should display, so every level of light intensity is reproduced correctly. HDR formats also carry metadata describing how the content was mastered, which the display uses to map that brightness onto its own capabilities.
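
For a concrete example, HDR10 and Dolby Vision both build on the Perceptual Quantizer (PQ) curve standardized as SMPTE ST 2084. The sketch below implements the published PQ EOTF formula to show how a signal value between 0 and 1 maps to an absolute luminance of up to 10,000 cd/m²; it is a simplified illustration, not a full HDR decoder.

```python
# Sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10 and Dolby Vision.
# It maps a non-linear signal value in [0, 1] to an absolute luminance in cd/m2.

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875
PQ_PEAK_NITS = 10_000.0  # PQ is defined up to 10,000 cd/m2

def pq_eotf(signal: float) -> float:
    """Convert a PQ-encoded signal value (0.0-1.0) to displayed luminance in cd/m2."""
    signal = max(0.0, min(1.0, signal))
    p = signal ** (1.0 / M2)
    numerator = max(p - C1, 0.0)
    denominator = C2 - C3 * p
    return PQ_PEAK_NITS * (numerator / denominator) ** (1.0 / M1)

if __name__ == "__main__":
    for s in (0.25, 0.5, 0.75, 1.0):
        print(f"PQ signal {s:.2f} -> {pq_eotf(s):8.1f} cd/m2")
```

A signal value of 1.0 comes out at exactly 10,000 cd/m², the curve's defined maximum, even though few consumer displays reach that level today.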

The History of HDR

HDR imaging technology was first introduced commercially in 2014 as HDR-TV.

Dolby developed HDR as part of Dolby Vision, a set of proprietary imaging technologies spanning content creation, distribution, and playback.

The Consumer Technology Association has since defined the open HDR10 standard, consumer electronics companies like Samsung have devised HDR10+, and broadcasters have developed standards such as HLG to facilitate backward compatibility with SDR; related HDR techniques are also used in photography.

Applications of HDR

HDR is used in television, computing, cinematography, photography, and even smartphones.

What’s the Difference Between SDR and HDR?

SDR is the legacy standard for image display but is still widely used in televisions and monitors. HDR standards are newer and designed for televisions whose brightness, contrast, and color capabilities can take advantage of them.

SDR has only a fraction of the dynamic range of HDR. HDR can represent a greater range and level of detail of light intensity in images, preserving details that SDR cannot render. The visual effect of this is that SDR clips colors beyond its range; for example, dark grey tones may be crushed to black, while bright highlights may be clipped to white.
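
As a rough illustration of that clipping, the sketch below clamps scene luminance to an assumed SDR display (100 cd/m² peak, 0.1 cd/m² black level, matching the nominal values earlier in this article) and an assumed HDR display (1,000 cd/m² peak, 0.005 cd/m² black level). The HDR numbers are illustrative assumptions, not values from any particular standard.

```python
# Illustrative clipping comparison. The SDR limits (100 cd/m2 peak, 0.1 cd/m2 black)
# match the nominal values cited earlier; the HDR limits (1000 cd/m2 peak,
# 0.005 cd/m2 black) are assumptions chosen only for this example.

def display(scene_nits: float, peak: float, black: float) -> float:
    """Clamp scene luminance to what a display can actually reproduce."""
    if scene_nits < black:
        return black              # everything darker than the black level looks the same
    return min(scene_nits, peak)  # highlights above peak white are clipped

scene = [0.02, 0.5, 80.0, 150.0, 600.0, 4000.0]  # scene luminance values in cd/m2
for nits in scene:
    sdr = display(nits, peak=100.0, black=0.1)
    hdr = display(nits, peak=1000.0, black=0.005)
    print(f"scene {nits:8.2f} -> SDR {sdr:8.2f} | HDR {hdr:8.2f}")
```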

Photography measures dynamic range in stops, where each stop represents a doubling of light. HDR is often quoted at around 17.6 stops of dynamic range, almost three times the roughly 6 stops of SDR.

SDR uses a gamma curve to encode data about light intensity, while HDR formats such as Dolby Vision and HDR 10 use newer transfer functions together with HDR metadata.
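
To give a sense of what that HDR metadata contains, HDR10 carries static metadata describing the mastering display's color primaries and luminance range (SMPTE ST 2086) along with content light levels known as MaxCLL and MaxFALL. The structure below is only an illustrative sketch of those fields, not a real decoder API, and the example values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative sketch of HDR10 static metadata fields; not a real decoder API."""
    # SMPTE ST 2086 mastering display metadata
    red_primary: tuple[float, float]     # CIE 1931 x, y chromaticity
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_mastering_luminance: float       # cd/m2
    min_mastering_luminance: float       # cd/m2
    # Content light level metadata
    max_cll: int                         # MaxCLL: brightest single pixel in the content (cd/m2)
    max_fall: int                        # MaxFALL: highest frame-average light level (cd/m2)

# Hypothetical values resembling a Rec. 2020, 1,000-nit master (for illustration only)
example = HDR10StaticMetadata(
    red_primary=(0.708, 0.292),
    green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046),
    white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0,
    min_mastering_luminance=0.005,
    max_cll=900,
    max_fall=350,
)
```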

SDR vs. HDR: A Side-by-Side Comparison

  • What is it? SDR: a display signal technology. HDR: a display signal technology.
  • Primary use: SDR: displaying standard dynamic range images and video. HDR: displaying high dynamic range images and video.
  • Initial release: SDR: 1934. HDR: the 1980s for HDR imaging; 2014 for HDR-TV.
  • Influential developers: SDR: Julius Plücker and Johann Wilhelm Hittorf (cathode ray tube). HDR: Dolby, Samsung, and the Consumer Technology Association.
  • Technologies influenced: SDR: color television. HDR: Ultra HD, high dynamic range imaging, and image formatting.

Similarities and Differences

While SDR and HDR share some similarities, the technologies have plenty of differences to set them apart.

Similarities

  • Both SDR and HDR are display signal technologies.
  • Both SDR and HDR determine how the light intensity of images is displayed on a screen or monitor.
  • SDR and HDR are both currently in use in contemporary televisions. 

Differences

  • SDR was designed around the capabilities of cathode ray tubes.
  • SDR has a narrower dynamic range than HDR.
  • HDR can represent a wider range of light intensities, preserving highlight and shadow detail that SDR clips.
  • HDR has several open-source and commercial standards.

What is SDR Used For?

SDR is a retronym introduced around 2014 to distinguish the legacy display signal technology, which was still in use, from the newer HDR. Modern displays still use SDR video, but this technology limits the dynamic range they can represent compared to HDR.

To display SDR content as intended, a contemporary display has to emulate the visual parameters of a cathode ray tube display. Otherwise, more sophisticated displays will try to adjust the SDR image to fill a brightness and color range that extends beyond SDR's conventional limits. The results are often suboptimal and produce an impaired viewing experience when compared with HDR.
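
That adjustment is a form of inverse tone mapping: the display stretches the SDR signal toward its own wider brightness range using heuristics rather than the creator's intent. The snippet below is a deliberately naive sketch of the idea under assumed peak values; real televisions use far more sophisticated, proprietary algorithms.

```python
# Deliberately naive inverse tone mapping sketch: stretch SDR luminance toward an
# HDR panel's peak. The peak values are assumptions; real TVs use proprietary,
# content-adaptive algorithms, which is why results vary so much in practice.

SDR_PEAK = 100.0    # nominal SDR peak (cd/m2)
PANEL_PEAK = 600.0  # assumed HDR panel peak (cd/m2)

def expand_sdr(nits: float, boost_exponent: float = 1.2) -> float:
    """Map SDR luminance onto the panel's range, boosting bright areas more than dark ones."""
    normalized = max(0.0, min(1.0, nits / SDR_PEAK))
    return (normalized ** boost_exponent) * PANEL_PEAK

for nits in (1.0, 25.0, 50.0, 100.0):
    print(f"SDR {nits:6.1f} cd/m2 -> expanded {expand_sdr(nits):6.1f} cd/m2")
```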

Does SDR Need to Be Upgraded?

Upgrading from SDR to HDR can certainly improve your viewing experience, but it's important to recognize that SDR is still very much a mainstream technology, and HDR requires HDR-compatible screens and hardware, such as streaming boxes, that can send and receive the HDR signal carrying the metadata needed to render images properly.

Broadcast television and popular video formats like standard Blu-ray still use SDR, so viewing them on an HDR-compatible screen will not, by itself, make any difference. However, advancements in LED backlighting now deliver the range of color and light intensity that makes HDR a future-proof investment.

SDR vs. HDR: 6 Must-Know Facts

  • The Consumer Technology Association developed HDR 10 in response to the prohibitive cost of Dolby Vision for manufacturers.
  • The hybrid log-gamma (HLG) format is a version of HDR that works without metadata and is backward compatible with SDR, in particular SDR UHDTV screens that use the Rec. 2020 color standard.
  • The SDR color range spans roughly 16.7 million individual colors with 8-bit color depth.
  • The HDR color range spans over 1 billion individual colors with 10-bit color depth.
  • SDR uses brightness, color, and contrast to set the parameters for light intensity.
  • Human vision can perceive a significantly wider color gamut and range of luminance than SDR can display.

Frequently Asked Questions

What is wide color gamut?

The color gamut of a television is the range of colors it can display. A wide color gamut is necessary for displaying HDR content as intended, so HDR content looks best on an HDR TV with a wide color gamut.

How do you know if you're getting HDR on your TV?

Using HDR settings on a compatible TV should produce a marked difference in picture clarity and quality. Depending on your television manufacturer, there are several ways to determine if you are seeing HDR TV:

  • Check the manufacturer’s specifications for your television.
  • On LG and Vizio TVs, HDR 10 or Dolby Vision playback will display an icon in the upper right-hand corner of your screen.
  • On Samsung screens, you can bring up the information bar on your screen, which should say ‘HDR’ if it is being used.

For other brands, try accessing the settings via your remote and look for ‘Preferences,’ ‘Picture,’ and ‘Picture Mode’ to see if you are viewing in HDR.

Does an HDR-capable TV always use HDR?

Though an HDR TV is essential for watching HDR content, the TV has to receive a signal that includes the HDR metadata to render HDR images correctly. This means that the TV requires adequate bandwidth for streaming HDR content, an HDMI port that supports HDCP 2.2, and an HDR-compatible streaming device with a subscription that includes HDR content streaming. You also need to select HDR-compatible content to finally enjoy HDR viewing.

How many HDR formats are there?

There are currently five different HDR standards that are commercially used. They are:

  1. Dolby Vision
  2. HDR 10
  3. HDR 10+
  4. HLG
  5. Advanced HDR by Technicolor

They are not inter-compatible and require the correct hardware and compatible content to deliver HDR viewing. Thankfully, streaming services usually offer HDR viewing in more than one HDR format, so most HDR TVs can display at least one of them.

How is HDR for TV different from HDR for photography?

HDR imaging in photography pre-dates the introduction of HDR for television and has some marked differences in its purpose and operation.

Like HDR television, HDR for photography utilizes a wider dynamic range to create richer and more realistic digital images. However, HDR for TV is a display technology, while HDR for photography is an image capture technique. HDR image capture combines multiple exposures of the same scene to produce a single image with HDR properties, which an HDR-capable screen can then display much as a television displays HDR video.
