Tech Talk: What Does The HDR Label Actually Mean?

HDR is the latest trend to sweep across the imaging industry and, although the underlying idea is fairly straightforward, the number of competing formats can muddy the waters for everyday consumers. Consumers have generally grown accustomed to competing technologies with clear differentiators, often mutually incompatible hardware that sits in plain view, making it easy to tell rivals apart. HDR implementations, by contrast, are mostly software-driven and sold under a single label, so those familiar signposts aren't there. The differences between HDR10, Dolby Vision, HLG, and the more recently introduced HDR10+ are real, but from a consumer perspective they can seem subtle. Making matters worse, "HDR" means something different depending on whether the discussion is about photography, cinematography, or display technology. For displays, HDR describes a screen's ability to reproduce a wider dynamic range, the ratio between the brightest and darkest parts of an image, than a standard dynamic range (SDR) display can. For photography and video capture, it describes how much control a content creator can exert over contrast in each part of an image or video, and from frame to frame.

To really understand what is meant by the term "high dynamic range," it is particularly useful to look at how HDR-enabled cameras work. Many modern smartphones support HDR capture as a feature built into the camera software. At its most basic, HDR photography works by taking multiple shots of the same scene at different exposures in rapid succession and then merging them in software. That allows for a higher level of contrast: the best-exposed portions of each frame are combined, so detail is preserved in both the brightest highlights and the darkest shadows. Each of the HDR standards mentioned above accomplishes the same goal of substantially increasing the range of contrast. However, because the standards depend heavily on software and on how video is encoded and decoded, they are not created equal, and not every playback device supports every kind of HDR.
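
As a rough illustration of that bracket-and-merge approach, here is a minimal sketch using OpenCV's exposure fusion utilities. The three file names are placeholders for bracketed shots of the same scene, and exposure fusion is only one of several merging techniques a phone camera might actually use.

import cv2

# Load an under-, normally, and over-exposed frame of the same scene
# (placeholder file names; any bracketed set works).
frames = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Align the frames to compensate for small hand movement between shots.
cv2.createAlignMTB().process(frames, frames)

# Fuse the exposures: well-exposed regions of each frame are weighted
# most heavily, preserving both highlight and shadow detail.
fused = cv2.createMergeMertens().process(frames)

# The fused result is floating point in roughly [0, 1]; scale to 8-bit.
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))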

The two biggest competitors in the space, as of this writing, are Dolby Vision and HDR10. One difference between the two that isn't immediately obvious is that Dolby Vision supports a higher number of colors. That's because it is a 12-bit technology, compared to HDR10's 10-bit color depth. In principle, that gives Dolby Vision the greater dynamic range on offer, to say nothing of how much more natural colors can appear once displays begin to support 12-bit depth. That should make Dolby Vision the clear winner between the two, except that there currently aren't any consumer-ready displays that support 12-bit color, making the additional depth almost pointless for now. Beyond bit depth, HDR10 uses static metadata: its brightness parameters are set once for an entire piece of content, which limits how the dynamic range can change from scene to scene. Dolby Vision, meanwhile, uses dynamic metadata that allows content creators to adjust the dynamic range from scene to scene or even frame to frame. That feature comes at a cost. HDR10 is an open format, while Dolby enforces a greater level of control over its platform and charges hardware manufacturers licensing fees to ensure compatibility. Content creators do get more control to make media display exactly as intended, but that fee long made HDR10 the more popular format, and it is the one supported by every HDR display. That position has begun to shift, with mobile manufacturers in particular looking to provide consumers in a highly competitive space with the best experience possible. Dolby has also moved from hardware-based decoding to software-based decoding, making Dolby Vision a much more appealing prospect for other electronics. With that change, Dolby Vision has taken off and become more of a de facto standard for HDR content.
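
To put the earlier color-depth point in concrete terms, a quick back-of-the-envelope calculation shows how the number of representable colors grows with per-channel bit depth (this is pure arithmetic, not tied to any particular standard's signal format):

# Number of distinct colors at a given per-channel bit depth:
# each of the three channels (R, G, B) gets 2**bits levels.
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(color_count(10))  # HDR10:        1,073,741,824  (~1.07 billion colors)
print(color_count(12))  # Dolby Vision: 68,719,476,736 (~68.7 billion colors)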

It should be said, for clarification, that HDR is still a technology very much in its infancy, while SDR has been the predominant format for decades. Nowhere does that show through better than in hybrid log gamma (HLG) HDR. First introduced as a concept back in 2015, HLG is a standard for delivering HDR content over traditional broadcast, though the result falls quite a bit short of what modern HDR is really capable of. It is intended for circumstances where transmission rates are relatively low compared to a high-speed data or Wi-Fi connection or an HDMI cable. To that end, it is a hybrid format that can be read by both SDR and HDR hardware, and it offers a much lower dynamic range than the technologies mentioned above. It isn't really a direct competitor in the HDR market, but it could certainly be viewed as a kind of transitional format because of its hybrid nature and backward compatibility.
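
The "hybrid" in the name describes the signal curve itself: the lower half behaves like a conventional gamma curve that SDR displays already understand, while the upper half switches to a logarithmic curve reserved for HDR highlights. Here is a minimal sketch of the HLG transfer function, using the constants published in ITU-R BT.2100:

import math

# HLG OETF (ITU-R BT.2100): maps normalized scene light E in [0, 1]
# to a signal value E' in [0, 1].
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1 / 12:
        # Lower range: square-root (gamma-like) curve, close to SDR,
        # which is why SDR hardware can display HLG content acceptably.
        return math.sqrt(3 * e)
    # Upper range: logarithmic curve carrying the HDR highlights.
    return A * math.log(12 * e - B) + C

Because the lower branch is effectively a standard gamma curve, an SDR display can render an HLG signal passably while an HDR display makes full use of the log-encoded highlights.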

With all of that said, there could soon be another shift in which HDR format manufacturers favor, thanks to HDR10+. The relatively new format was developed by Amazon and Samsung as a more direct competitor to Dolby Vision. It essentially takes a page from Dolby's book in that it allows for dynamic metadata, but it has been kept open by the two companies involved. That gives it a potential edge over Dolby Vision, since it won't require the added fees, but it could also run into the same problem as HDR10: without Dolby's enforced oversight, manufacturers and content creators won't necessarily implement it consistently. Having said that, it could also level the playing field quite a bit. Moreover, there has been some speculation that current HDR-capable televisions may, at some point, be able to support content encoded in the HDR10+ format through software updates. Unfortunately, HDR10+ was only just introduced earlier this year and, as of this writing, it has nowhere near as many backers as its rivals. So whether it can really overtake its competitors probably won't be obvious for a while longer. However, if HDR10+ support is adaptable enough to be easily added via software, that could lend HDR as a whole a sense of universality. In turn, the differences outlined here would become mostly meaningless from a media consumption perspective.
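
The static-versus-dynamic metadata distinction that separates HDR10 from Dolby Vision and HDR10+ can be sketched as simple data structures. The field names below are illustrative only, not the actual encoded structures, which are defined in standards such as SMPTE ST 2086 (static) and SMPTE ST 2094 (dynamic):

from dataclasses import dataclass
from typing import List

@dataclass
class StaticMetadata:
    # HDR10-style: one set of brightness hints for the whole stream,
    # so the display tone-maps every scene with the same assumptions.
    max_content_light_level_nits: int        # brightest pixel anywhere
    max_frame_average_light_level_nits: int  # brightest frame average

@dataclass
class SceneMetadata:
    # Dolby Vision / HDR10+-style: brightness hints per scene (or even
    # per frame), so a dim scene and a bright scene are mapped differently.
    start_frame: int
    max_light_level_nits: int

@dataclass
class DynamicMetadata:
    scenes: List[SceneMetadata]

A display receiving the dynamic variety can re-tune its tone mapping at every scene boundary instead of committing to a single compromise for the entire film, which is exactly the capability HDR10+ adds on top of HDR10.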
