
Key Points
- HDR and SDR technology are both forms of display signal technology.
- HDR imaging originated in the 1980s, while SDR's roots go back to 1934.
- HDR has a dynamic range of 17.6 stops, almost three times SDR's range of 6 stops.
In recent years, advancements in the image quality of televisions have been relentless, and there are now many competing technologies offering video and image viewing with enhanced resolution and realism.
In particular, HDR is a next-generation display signal technology that delivers a high degree of color clarity and contrast. It's a big leap forward from the original SDR technology, which has been around since the very first color TVs.
Let’s break down what SDR and HDR are, and examine the key differences between them so you can evaluate what you need to get the best viewing experience.
What is SDR?
Standard Dynamic Range (SDR) is a display signal technology whose contrast and brightness limits were originally defined by cathode ray tube (CRT) displays.
SDR's representation of screen light intensity is determined by the color characteristics, contrast, and brightness of CRTs. SDR uses the gamma curve to encode light intensity information so a video signal can carry it and a display can reproduce it accurately.
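To illustrate what gamma encoding does, the sketch below uses a simple power-law curve with an exponent of 2.4 as a stand-in for the exact piecewise Rec. 709 formula (an assumption made for brevity). The non-linear encoding spends more of the signal's limited code values on the darker tones our eyes are most sensitive to.

```python
def gamma_encode(linear_light: float, gamma: float = 2.4) -> float:
    """Map linear light (0.0-1.0) to a non-linear SDR signal value."""
    return linear_light ** (1.0 / gamma)

def gamma_decode(signal: float, gamma: float = 2.4) -> float:
    """Map an encoded signal value back to linear light (what a display does)."""
    return signal ** gamma

# 18% grey (a photographic mid-tone) encodes to roughly 0.49, so nearly half
# of the available code values are reserved for the darker parts of the image.
print(round(gamma_encode(0.18), 3))  # ~0.49
print(round(gamma_decode(0.49), 3))  # ~0.18
```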
The dynamic range is the span between the lightest and darkest parts of a displayed image. This light information, the luminance, strongly affects the accuracy and quality of the image. If a single image contains both very dark and very bright areas, it is described as having a high dynamic range. SDR can only represent image luminance within predetermined limits. The technical parameters of a CRT display include the following (a short calculation using these figures appears after the list):
- A peak luminance of 100 candelas per square meter (nits). Luminance measures the amount of light emitted from, transmitted through, or reflected off a surface.
- A black level of around 0.1 candelas per square meter. The black level is the brightness of the darkest part of an image, where little or no light is emitted.
- Use of the sRGB color space common on computer monitors, or the closely related Rec. 709 color standard developed by the International Telecommunication Union (ITU) for HD television.
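Taken together, those two luminance figures give SDR a nominal contrast ratio of about 1,000:1. A quick sanity check in Python, using the values listed above (real panels vary):

```python
# Nominal SDR/CRT luminance limits from the list above (cd/m², i.e. nits)
peak_white = 100.0
black_level = 0.1

contrast_ratio = peak_white / black_level
print(f"Nominal SDR contrast ratio: {contrast_ratio:,.0f}:1")  # 1,000:1
```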
The History of SDR
The development of SDR follows the development of the CRT tube and television sets, going back as far as 1934.
Until HDR technology (discussed below) was introduced, SDR was the primary means of rendering light intensity on screens. The fixed parameters inherited from CRTs limit SDR's usefulness on high-definition screens, where image quality becomes unpredictable and difficult to calibrate.
Applications of SDR Technology
SDR is primarily used to represent light in images and videos shown on a CRT display. Some forms of cinematography and photography also use this technology.
What is HDR?
High Dynamic Range (HDR) is a display technology that renders screen light intensity with a wide or high dynamic range.
Unlike SDR, HDR is capable of extremely bright and detailed contrast in image and video, with a color intensity that the older SDR technology cannot achieve.
HDR technology enables screens and displays to utilize a higher quality image source. It can be used with moving and still images. The effect of HDR is determined by the specifications of the display such as its brightness, contrast, and color properties.
This approach to representing light exceeds the limits of SDR and the CRT technology it was built around, both of which have been superseded by significantly more advanced modern displays. HDR facilitates higher brightness and a wider color range, in line with the intent of the image or video creator.

How Does HDR Work?
HDR technology can reproduce light intensity with remarkable accuracy by quantifying the light used in an image and preserving key details that aid the reproduction of realistic-looking images.
HDR uses an Electro-Optical Transfer Function (EOTF), typically together with metadata about the content. The EOTF converts the encoded values in the image or video signal into the absolute brightness levels your TV or monitor should emit, so images are displayed with the correct light intensity.
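To make that concrete, the Perceptual Quantizer (PQ) curve standardized as SMPTE ST 2084 is the EOTF used by HDR10 and Dolby Vision. The sketch below applies its published formula to turn a normalized signal value into an absolute brightness in nits; the constants are the ones defined in the standard, but this is an illustration rather than production color-management code.

```python
# PQ (SMPTE ST 2084) EOTF constants, as defined in the standard
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69
PEAK_NITS = 10_000         # PQ is defined up to 10,000 cd/m²

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal value (0.0-1.0) to luminance in nits."""
    e = signal ** (1.0 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return PEAK_NITS * (num / den) ** (1.0 / M1)

# A signal of about 0.51 corresponds to roughly 100 nits (SDR peak white),
# while values near 1.0 map to thousands of nits.
for s in (0.0, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):8.1f} nits")
```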
The History of HDR
HDR imaging technology was first introduced commercially in 2014 as HDR-TV.
Dolby developed HDR as part of Dolby Vision, a set of proprietary imaging technologies spanning content creation, distribution, and playback.
Consumer electronics companies like Samsung and industry bodies such as the Consumer Technology Association have since devised open versions of HDR, notably HDR10, and other standards facilitate backward compatibility with SDR and use in photography.
Applications of HDR
HDR is used in television, computing, cinematography, photography, and even smartphones.
What’s the Difference Between SDR and HDR?
SDR is the legacy standard for image display but is still widely used in televisions and monitors. HDR standards are newer and designed for televisions whose panels can deliver the brightness, contrast, and color those standards demand.
SDR has only a fraction of the dynamic range of HDR. HDR can represent a greater range of light intensities and more detail within that range, preserving details that SDR cannot render. The visual effect is that SDR clips tones beyond its range: dark grey tones may be crushed to black, and very bright tones blown out to white.
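As a simplified sketch of that clipping behavior (hypothetical values, expressed in nits rather than any real signal encoding), anything outside the representable range simply gets clamped:

```python
def clip_to_sdr(luminance_nits: float,
                black_level: float = 0.1,
                peak_white: float = 100.0) -> float:
    """Clamp a scene luminance to the SDR display range.

    Detail below the black level is crushed to black, and detail
    above peak white is blown out to white.
    """
    return min(max(luminance_nits, black_level), peak_white)

# A sunlit highlight at 1,000 nits and a deep shadow at 0.01 nits
# both lose their detail on an SDR display.
print(clip_to_sdr(1000.0))  # 100.0 (clipped to peak white)
print(clip_to_sdr(0.01))    # 0.1   (clipped to black)
```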
Photography measures dynamic range in stops, where each stop represents a doubling of the light. HDR's dynamic range of 17.6 stops is almost three times SDR's 6 stops.
SDR uses the gamma curve to encode data about light intensity, while HDR uses newer transfer functions together with HDR metadata defined by standards such as Dolby Vision or HDR10.
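For a sense of what that metadata carries, HDR10's static metadata includes the mastering display's luminance limits (per SMPTE ST 2086) plus content light-level figures such as MaxCLL and MaxFALL. The sketch below is a rough model of those fields; the class and field names are illustrative, not any library's actual API.

```python
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    """Illustrative model of HDR10 static metadata fields."""
    max_mastering_luminance_nits: float   # peak of the mastering display (ST 2086)
    min_mastering_luminance_nits: float   # black level of the mastering display
    max_cll_nits: int                     # MaxCLL: brightest single pixel in the content
    max_fall_nits: int                    # MaxFALL: brightest average frame in the content

# Hypothetical values for a title mastered on a 1,000-nit display.
example = Hdr10StaticMetadata(
    max_mastering_luminance_nits=1000.0,
    min_mastering_luminance_nits=0.005,
    max_cll_nits=900,
    max_fall_nits=350,
)
print(example)
```

Dolby Vision builds on this idea with dynamic metadata that can change scene by scene, which is part of why it is licensed rather than open.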
SDR vs. HDR: A Side-by-Side Comparison
| | SDR | HDR |
|---|---|---|
| What is it? | Display signal technology | Display signal technology |
| Primary Use | Representing light intensity in images and video on CRT and standard displays | Representing light intensity in images and video with a wide dynamic range |
| Initial Release | 1934 | The 1980s |
| Influential Developers | Julius Plücker and Johann Wilhelm Hittorf (cathode ray tube) | Dolby, Samsung, the Consumer Technology Association |
| Technologies Influenced | Color television | Ultra HD, high dynamic range imaging, image formatting |
Similarities and Differences
While SDR and HDR share some similarities, the technologies have plenty of differences to set them apart.
Similarities
- Both SDR and HDR are display signal technologies.
- Both SDR and HDR are used to determine the display of the light intensity of images on a screen or monitor.
- SDR and HDR are both currently in use in contemporary televisions.
Differences
- SDR works with cathode ray tubes.
- SDR has a narrower dynamic range than HDR.
- HDR can represent a wider range of light intensities, preserving highlight and shadow detail that SDR cannot render.
- HDR has several competing standards, both open and proprietary.
What is SDR Used For?
The term SDR was introduced in 2014 to distinguish the legacy display signal technology, which was still in widespread use, from the newly arrived HDR. Modern displays still accept SDR video, but the format limits the dynamic range they can represent compared to HDR.
To display SDR content optimally, a contemporary screen has to mimic the visual parameters of a cathode ray tube display. When it doesn't, more sophisticated displays will try to stretch the SDR image across a brightness and color range that extends beyond SDR's conventional limits. The results are often suboptimal, producing an impaired viewing experience compared with true HDR.
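As a heavily simplified, hypothetical sketch of why that stretching tends to disappoint, the naive approach below just scales every SDR luminance value up to the panel's peak brightness; the function name and numbers are illustrative, not any real display's algorithm.

```python
def naive_sdr_stretch(sdr_nits: float,
                      sdr_peak: float = 100.0,
                      display_peak: float = 600.0) -> float:
    """Linearly scale SDR luminance onto a brighter panel (a crude approach)."""
    return sdr_nits * (display_peak / sdr_peak)

# Mid-tones get boosted just as much as highlights, so the image looks
# uniformly brighter rather than genuinely higher in dynamic range.
print(naive_sdr_stretch(20.0))   # 120.0 nits for what was a mid-tone
print(naive_sdr_stretch(100.0))  # 600.0 nits for peak white
```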
Does SDR Need to Be Upgraded?
Upgrading from SDR to HDR can certainly improve your viewing experience, but it's important to recognize that SDR is still very much a mainstream technology, and HDR requires HDR-compatible screens and hardware, like streaming boxes, that can send and receive the HDR signal carrying the metadata needed to render images properly.
Much broadcast television and many standard Blu-ray discs still use SDR, so viewing that content on an HDR-compatible screen will not by itself improve the picture. However, advancements in LED backlighting now deliver the range of color and light intensity that makes HDR a future-proof investment.
SDR vs. HDR: 6 Must-Know Facts
- The Consumer Technology Association developed HDR10 in response to the prohibitive licensing cost of Dolby Vision for manufacturers.
- The hybrid log-gamma (HLG10) format is a metadata-free version of HDR that is backward compatible with SDR, in particular SDR UHDTV screens that use the Rec. 2020 color standard.
- The SDR color range spans roughly 16.7 million individual colors.
- The HDR color range spans over 1 billion individual colors (about 1.07 billion at 10-bit color depth); see the quick calculation after this list.
- SDR uses brightness, color, and contrast to set the parameters for light intensity.
- Human vision can perceive a significantly wider color gamut and luminance range than SDR can display.
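Those color counts follow directly from bit depth: SDR video typically uses 8 bits per color channel, while HDR formats use at least 10 bits. The arithmetic below shows how that yields roughly 16.7 million versus just over a billion colors.

```python
def color_count(bits_per_channel: int, channels: int = 3) -> int:
    """Distinct colors representable with the given bit depth per RGB channel."""
    return (2 ** bits_per_channel) ** channels

print(f"8-bit (SDR):  {color_count(8):,}")   # 16,777,216  (~16.7 million)
print(f"10-bit (HDR): {color_count(10):,}")  # 1,073,741,824 (~1.07 billion)
```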
Up Next…
Interested in more technological comparisons? Read the articles below:
- Monitors vs Televisions (TVs): What’s the Difference? Which of them has a higher resolution and refresh rate? And which of them comes with built-in streaming? Find out the answer here in addition to other key differences between these devices.
- Oculus Quest 2 vs HTC Vive Pro 2: Which is Better? One is a standalone device and the other, a tethered appliance. They both require different hardware too. Find out which option is the best for you.
- S22+ vs. S22 Ultra: Which One is Better? They both feature awesome power, combined with the ability to withstand wear and tear. But which one is the better option?