What Is HDR? | High Dynamic Range Thoroughly Explained

HDR (High Dynamic Range) is a technology that makes photos and video look more lifelike by expanding the range of luminance, color, and contrast an image can carry. It is common in modern televisions: an HDR signal carries more information about color and brightness than older video signals, so the picture appears more vibrant. An HDR video preserves more detail at the extremes of the image, meaning bright and dark areas on the same screen can be shown distinctly, with smooth, detailed transitions between them. To enjoy the full HDR advantage, you need to play HDR video on an HDR TV; on a non-HDR-compatible TV, HDR content looks no different from standard content.

The main feature of HDR is its wider color and contrast range: bright sections appear brighter and dark parts darker, bringing out depth in the picture. The result is a more vibrant display that is more appealing to the human eye. The main types of HDR are Dolby Vision, HDR10, HDR10+, and Hybrid Log-Gamma (HLG).

Which Devices Can HDR be Used on?

You can use HDR on the devices listed below.

  • Smart TVs

  • Ultra-HD Blu-ray players

  • 4K streaming devices such as Fire TV 4K, Chromecast Ultra, Roku Streaming Stick 4K, Roku Ultra, and Apple TV 4K. 

  • Android and iOS devices, including the Apple iPhone 8, 8 Plus, and X, and the Samsung Galaxy Note 8, Galaxy Book, and Galaxy Tab S3.

  • Streaming services like YouTube and Netflix

  • Computer monitors, including Gigabyte AORUS FV43U, LG OLED48C1, and Gigabyte AORUS FO48U OLED.

  • Gaming consoles like PlayStation 5 and Xbox Series X.

You can check whether your device supports HDR by playing an HDR video on it. If the HDR standard is supported, you'll see an HDR label next to the video resolution; select it to play the video in HDR. Keep in mind that there are different types of HDR, and your device may support only some of them.


Why is HDR Preferred?

HDR is preferred because it produces more vibrant colors and a higher-quality picture. An HDR display has more depth than a standard non-HDR display, with more pronounced and appealing brightness levels, contrast ratios, and color depth.

Dynamic range is the difference between the brightest white and the deepest black a display can produce; the greater that range, the more vibrant the picture.
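To make that concrete, dynamic range is often expressed as a contrast ratio or in "stops," where each stop is a doubling of luminance. A minimal sketch, using hypothetical nit values rather than figures for any particular display:

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range expressed in stops (each stop doubles luminance)."""
    return math.log2(peak_nits / black_nits)

# Hypothetical panels: a 300-nit SDR display vs. a 1,000-nit HDR display.
sdr_stops = dynamic_range_stops(300, 0.30)    # contrast ratio 1,000:1
hdr_stops = dynamic_range_stops(1000, 0.05)   # contrast ratio 20,000:1

print(f"SDR: {sdr_stops:.1f} stops, HDR: {hdr_stops:.1f} stops")
```

The wider the gap in stops, the more gradations the display can show between its darkest and brightest points.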

What is the Relation Between HDR and Color Gamut?

A color gamut is the set of colors within a given color space that a display can reproduce. Gamut coverage determines how many colors a computer monitor can show. Most monitors can display 16.7 million colors, although some produce more hues than that.
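The "16.7 million" figure comes from 8-bit color: each of the three RGB channels can take 2^8 = 256 values. A quick check of that arithmetic, alongside the 10-bit depth typical of HDR panels:

```python
def displayable_colors(bits_per_channel: int) -> int:
    """Total colors an RGB display can address at a given bit depth per channel."""
    return (2 ** bits_per_channel) ** 3

print(f"{displayable_colors(8):,}")   # 16,777,216 -> the "16.7 million" figure
print(f"{displayable_colors(10):,}")  # 1,073,741,824 -> about 1.07 billion
```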

HDR and color gamut are two different things, but they work together in building the picture, and High Definition, Ultra-High Definition, and 4K devices use both. HDR widens the luminance range of a display, making both bright and dark parts more pronounced, while a wide color gamut provides a broader range of colors, making the picture more detailed and composed.

What are the Types of HDR?


The types of HDR are listed below.


  • Dolby Vision

  • HDR10

  • HDR10+

  • Hybrid Log-Gamma (HLG)

1. Dolby Vision

Dolby created this HDR version. The technology uses dynamic metadata rather than static metadata. Unlike static metadata, which sets one fixed brightness level for an entire piece of content, dynamic metadata varies brightness levels per scene and per frame. This allows the display to render more detail in both bright and dark sections: darker scenes keep their detail in dim conditions while lighter scenes keep their composure in bright surroundings. Because the display optimizes each frame on its own, rather than applying one compromise setting to the whole video, the result is a more detailed picture.
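The static-versus-dynamic distinction can be sketched as data: one brightness target for the whole stream versus a target per frame. This is a simplified illustration only, not the actual Dolby Vision metadata format; all field names and nit values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    """One brightness target for the whole stream (HDR10-style)."""
    max_luminance_nits: float

@dataclass
class FrameMetadata:
    """A per-frame brightness target (dynamic-metadata style)."""
    frame_index: int
    max_luminance_nits: float

def tone_map_target(static_md, dynamic_md, frame_index):
    """With dynamic metadata, the display tone-maps each frame to its own
    peak; with only static metadata, every frame shares one peak."""
    per_frame = {m.frame_index: m.max_luminance_nits for m in dynamic_md}
    return per_frame.get(frame_index, static_md.max_luminance_nits)

# A dark scene (frame 0) and a bright scene (frame 1) get different targets;
# a frame with no dynamic entry falls back to the static value.
static = StaticMetadata(max_luminance_nits=1000)
dynamic = [FrameMetadata(0, 200), FrameMetadata(1, 900)]
print(tone_map_target(static, dynamic, 0))  # 200 -> dark scene keeps shadow detail
```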


The metadata used in Dolby Vision can be adjusted to meet the specifications of your display, removing limits based purely on how the video was mastered. Dolby Vision tells your display which light and color levels to use for a specific video, negotiated between your Dolby Vision-compatible device and the Dolby Vision content. Depending on the need, this can let your device show more colors or higher brightness levels than HDR10, provided the way the content was mastered allows such adjustments.


Dolby Vision is a licensed, proprietary HDR format, unlike open types such as HDR10. A Dolby Vision logo on a video means the technology was used in mastering that video.


The advantage of Dolby Vision is that, unlike most other HDR types, it allows a video's brightness levels and displayed colors to be adapted to the display, thanks to dynamic metadata that varies brightness with the needs of each piece of content. Because the format is licensed, certified devices are held to consistent standards, and TVs compatible with Dolby Vision support HDR10 by default. In general, Dolby Vision produces a higher-quality picture than any other HDR version.


The disadvantage of Dolby Vision is that both content and screens need certification for compatibility, which adds cost compared with an open format like HDR10.

2. HDR10

HDR10 is an HDR standard promoted by the UHD Alliance. This HDR technology defines a specific, fixed color and brightness range for a whole piece of content. Unlike Dolby Vision, which uses dynamic metadata, HDR10 uses static metadata, so the brightness mastering is uniform across all content supporting HDR10. Playing an HDR10 video on different screens produces the same effect, regardless of the panel type used. HDR10 is also an open HDR version; anyone can use it without charge.


The advantage of HDR10 is that it is an open HDR version that you can use for free, which is why it is so widely supported across devices. HDR10 is also a more cost-effective HDR version than Dolby Vision, which is a premium, licensed format.


The disadvantage of HDR10 is that it has fixed, preset brightness levels and color ranges for all of its content. You therefore cannot adjust these specifications scene by scene to match your specific display needs.

3. HDR10+

HDR10+ is an HDR technology developed from HDR10, more advanced and improved than its predecessor. Three companies developed it: Samsung, Panasonic, and 20th Century Fox, which explains why HDR10+ support is mainly found on Samsung and Panasonic devices. Like Dolby Vision, HDR10+ uses dynamic metadata, controlling the brightness levels and the color range the TV displays on individual frames. It can enhance your display more than HDR10, and like HDR10 it is an open version that is royalty-free to use.


HDR10+ supports high brightness levels, which can reach 4,000 nits, and a high contrast ratio that ensures color depth across its displays. Its dynamic metadata changes the detail and composition of individual frames to match their content, so each frame gets its own color, brightness, and contrast treatment. HDR10+ leaves little room for a washed-out picture, keeping every detail clear and pronounced.


What's more, its adaptive mode observes your viewing environment and adjusts the brightness levels, contrast, and colors of your display to match it.


The advantage of HDR10+ is its dynamic metadata, which adjusts the brightness, colors, and contrast of individual frames to produce a more realistic and detailed picture (see also the 'Samsung TV Best Color Settings' post). The technology is also backward compatible, so HDR10+ content still plays on a device that only supports HDR10. Besides, HDR10+ supports brightness levels of up to 4,000 nits for a clear and vivid picture, and its adaptive mode reads the environment and auto-adjusts the display's brightness and contrast to match the surroundings.


The main disadvantage of HDR10+ is that it is not as widely adopted as its companions, so compatible content and devices are harder to find.

4. Hybrid Log-Gamma (HLG)

Hybrid Log-Gamma is an HDR version that combines an SDR-compatible signal and an HDR signal in one stream. The BBC and NHK (a Japanese broadcaster) developed this version in 2014.


Because the SDR and HDR information travel in a single signal, SDR and HDR televisions can use the identical HLG broadcast. An HDR-compatible device renders the full HDR range of the HLG signal, while an SDR device simply displays its SDR-compatible portion.


Hybrid Log-Gamma's popularity and applicability are not as broad as HDR10's or Dolby Vision's; it was developed to transmit HDR and SDR together when broadcasting. One main difference between Hybrid Log-Gamma and other HDR versions is that HLG does not use metadata at all. Instead, it combines the conventional gamma curve SDR TVs use for darker signal levels with a logarithmic curve that lets HDR devices reach high brightness levels.
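Those two curves are standardized in ITU-R BT.2100. A minimal sketch of the HLG opto-electrical transfer function (OETF): dark scene light passes through a square-root (gamma-like) segment, while bright scene light passes through the logarithmic segment.

```python
import math

# HLG OETF constants from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e (0..1) to an HLG signal value (0..1).

    Dark signals use a square-root (SDR-gamma-like) segment; bright
    signals use a logarithmic segment. This split is what lets one HLG
    signal serve both SDR and HDR displays."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(round(hlg_oetf(1 / 12), 3))  # 0.5 -> the two segments meet here
print(round(hlg_oetf(1.0), 3))     # 1.0 -> peak scene light maps to peak signal
```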


The main advantage of HLG is that, because it uses no metadata, it works on both HDR and SDR devices without limitation. The main disadvantage is that it was devised primarily for broadcasting, and despite being purpose-built for that role, it has not been widely adopted by broadcasting stations.


Another issue with HLG is that although it improves on SDR, its picture still falls well short of other HDR versions; there is comparatively little difference between its images and those produced by SDR devices. Lastly, despite its backward compatibility with SDR, it is not compatible with non-HLG HDR devices.

What are the Requirements for Using HDR?

The requirements for using HDR are shown below. 

Built-in Display

You can only play or stream HDR video on a Windows 10 (version 1803 or later) device if it satisfies the requirements listed below.

  • The display should have a screen resolution of at least 1920 x 1080 (Full HD).
  • The display should have at least 300 nits of peak brightness.
  • The integrated graphics card should support PlayReady hardware digital rights management (for protected HDR content), and the device should have the codecs required for 10-bit video decoding, such as HEVC, VP9, or AV1.

You should browse your device manufacturer's website to access the specifications for your device. 


The requirements for using HDR on Windows, version 1709 or later, are shown below.

  • The display interface should have a minimum brightness of 300 nits.
  • The display should allow you to control its backlight.
  • The integrated graphics card should support PlayReady hardware digital rights management (for protected HDR content), and the device should have the codecs required for 10-bit video decoding, such as HEVC, VP9, or AV1.
  • The manufacturer should have enabled HDR on the device.

External Displays

To play or stream HDR video on Windows 10 on an external display, the display and the PC need to be compatible with HDR. 

The list below shows the requirements for using HDR on an external display.

  • The external display must be compatible with HDR 10. 
  • The external display should support HDMI version 2.0 or DisplayPort version 1.4 input connectivity.
  • The external display should have DisplayHDR certification.
  • The integrated graphics card should support PlayReady hardware digital rights management (for protected HDR content), and the device should have the codecs required for 10-bit video decoding, such as HEVC, VP9, or AV1.
  • The graphics (WDDM) drivers on your PC should be the latest version. You can update them through Windows Update in the PC's Settings, or by visiting your PC manufacturer's website.

To check the specific requirements for your external display and PC, visit the manufacturer's website.


Is HDMI 2.1 Required for HDR?

HDMI 2.1 is not strictly required for HDR: static HDR10 already works over HDMI 2.0. However, the HDMI 2.1 standard is built with features that enable dynamic HDR, the approach used by HDR10+ and Dolby Vision with their dynamic metadata. HDMI 2.1 is the most recent version of HDMI and supports higher resolutions, higher refresh rates, and, most importantly, a larger bandwidth range. It can carry 8K resolution at a 60 Hz refresh rate and 4K resolution at 120 Hz, and it can support resolutions up to 10K.

The benefits of using HDMI 2.1 include its support for a higher bandwidth of up to 48 Gbps, allowing faster data transfer, and its support for resolutions up to 10K, enabling clear, high-quality images. The cable also supports higher frame rates and resolutions and is compatible with modern monitors, PCs, the PlayStation 5, the Xbox Series X, and other devices. For an HDMI 2.1 monitor, you'll need an HDMI 2.1 cable to connect it to the PC to enjoy the full benefits of HDR.
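As a rough sanity check on those bandwidth figures, the uncompressed data rate of a video signal can be estimated from resolution, refresh rate, and bit depth. This sketch ignores the blanking intervals and encoding overhead a real HDMI link adds, so the true link rate is somewhat higher:

```python
def video_data_rate_gbps(width: int, height: int, fps: int,
                         bits_per_channel: int) -> float:
    """Rough uncompressed RGB data rate in Gbps, ignoring blanking
    and link-encoding overhead."""
    bits_per_frame = width * height * 3 * bits_per_channel
    return bits_per_frame * fps / 1e9

# 4K at 120 Hz with 10-bit color fits inside HDMI 2.1's 48 Gbps...
print(round(video_data_rate_gbps(3840, 2160, 120, 10), 1))  # ~29.9 Gbps
# ...while raw 8K at 60 Hz with 10-bit color exceeds it, which is why
# HDMI 2.1 pairs such modes with Display Stream Compression (DSC).
print(round(video_data_rate_gbps(7680, 4320, 60, 10), 1))   # ~59.7 Gbps
```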

What Should the Screen Resolution Be for HDR?

The recommended screen resolution for HDR is 3840 x 2160, which is a 4K resolution. That explains why most HDR-compatible TVs are 4K TVs. An HDR display has a higher dynamic range than SDR and no less than 10-bit color depth. You'll enjoy the full HDR advantage if you use a PC or monitor with a 3840 x 2160 resolution, although HDR is also available on lower-resolution monitors and PCs, such as 1080p and 2K displays.

Strictly speaking, HDR and screen resolution describe different things: HDR concerns the range of brightness and color between the brightest and darkest parts of the picture, while screen resolution is the total number of pixels along the display's horizontal and vertical axes. In practice, though, HDR support is most commonly found on higher-resolution panels.
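For reference, the pixel counts behind those resolutions are simple to compute:

```python
def pixel_count(width: int, height: int) -> int:
    """Total pixels on a display of the given resolution."""
    return width * height

uhd = pixel_count(3840, 2160)   # 8,294,400 pixels (4K UHD)
fhd = pixel_count(1920, 1080)   # 2,073,600 pixels (1080p)
print(uhd // fhd)               # 4 -> 4K carries four times the pixels of 1080p
```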


What is HDR Good For?

What HDR is good for is listed below.

  • Gaming
  • Movies
  • Work
  • Design

1. Gaming

High Dynamic Range (HDR) affects your gaming performance and experience in several ways. First, your monitor's color reproduction becomes more stable and accurate, giving you a truer view of the graphics and images in your gameplay. HDR also supports deeper saturation, increases the display's contrast, and presents brighter images clearly while still showing enough detail in dark scenes. All of these features improve the picture your monitor produces while gaming and enhance your overall gaming experience.

While gaming, darker scenes can make it hard to distinguish teammates from enemies. HDR's wider contrast lets you identify your gaming enemies more easily, enhancing your performance in the long run. You won't experience problems in bright conditions either, as HDR's higher brightness keeps dark details visible. HDR improves display quality when gaming without compromising the monitor's performance, but the monitor, the PC, and the game all need to be HDR-ready for you to enjoy the full HDR gaming benefits.

HDR is increasingly becoming crucial and widely used in AAA games such as Horizon Zero Dawn, Gears of War 5, and The Last of Us. To get the best out of such games, you'll need a good gaming monitor that supports HDR.

So, what are the best monitors for gaming? The best monitors for gaming are the monitors that meet the minimum specification requirements for the various games. These monitors have high refresh rates, fast response time, high frame rates, and variable refresh rates. 

2. Movies

HDR also has several benefits for movies, since it improves the quality of the images and graphics on your screen. Watching movies on an HDR display is a richer experience: images appear brighter, more detailed, and more vibrant. However, to enjoy HDR benefits in your films, you need a display that supports HDR.

The benefits of watching movies on HDR devices include brighter images that look realistic and more appealing to the eye, and broader color gamut coverage that gives images color depth. HDR content is widely and commonly reproduced in 4K resolution, which looks clearer, sharper, and more detailed.

So, what are the best monitors for watching movies? The best monitors for watching movies satisfy conditions like a minimum screen resolution of 3840 x 2160 and a color depth of 10 bits. They also feature an ergonomic design that allows for comfort through swivel, tilt, and height adjustments. Moreover, they are compatible with sync technologies like AMD FreeSync and Nvidia G-Sync, which eliminate screen tearing and stuttering. Such monitors also need wide viewing angles, ideally 178 degrees, for an excellent view from all sitting positions.

3. Work

HDR also affects your working experience. HDR widens the color and contrast range of the display: bright parts look brighter and darker sections appear darker without compromising the detail of the picture. Such composure lets you see images and graphics more clearly and accurately, and seeing fine details on your screen reduces fatigue while working. This enables longer working sessions, improving your productivity in the long run. You need an HDR-compatible monitor to enjoy the HDR benefits while working.

So, what are the best monitors for the office? The best monitors for the office have a high screen resolution, a screen size of between 24 inches and 30 inches, HDR compatibility, and an ergonomic design. Such monitors also need eye-care technologies, like a low-blue-light mode, that reduce eye strain.

4. Design

When it comes to design, HDR technology is especially useful. A designer needs to see every detail of the content displayed on the screen, and HDR delivers exactly that: clear, detailed images and graphics. The best monitor for design should be at least 27 inches; when designing, you'll want a larger, wide screen that lets you arrange several windows side by side.

The monitor should also have eye-care technologies to protect the user's wellbeing during long working sessions, as well as an ergonomic design for comfort. The refresh rate should be 60 Hz or higher, and the resolution no less than 1920 x 1080.

So, what are the best monitors for design? The best monitors for design are monitors with HDR support, large display interfaces, eye care technologies, and high display resolutions for quality image and graphics production. Such monitors also need an ergonomic design that allows the user's comfort. Large screens provide enough space for creating multiple windows on the same screen, allowing for multitasking. 


What are the Best Monitors That Support HDR?

The best HDR monitors are monitors that support HDR technology and have high screen resolutions. Such monitors have high contrast, high brightness levels, and detailed bright and dark images. They are suitable for gaming, work, design, and watching movies.

What are the Alternatives to HDR?

In conclusion, there are alternative and related technologies that address picture quality in ways that overlap with HDR. Such technologies include Ultra High Definition (UHD) and Standard Dynamic Range (SDR).

UHD is a technology developed for consumer applications and as a broadcasting standard (see QHD compared to FHD, too). It is more capable than its predecessor, Full HD, with a screen resolution of 3,840 x 2,160. Standard Dynamic Range (SDR) is the conventional standard for cinema and video display; however, it produces a lower-quality picture than HDR, representing only a section of the dynamic range HDR can display.

About Dusan Stanar

I'm the founder of VSS Monitoring. I have been both writing and working in technology in a number of roles for dozens of years and wanted to bring my experience online to make it publicly available. Visit https://www.vssmonitoring.com/about-us/ to read more about myself and the rest of the team.
