HDR TVs explained
You can think of HDR as the next step after 4K Ultra HD. At least that is how the industry is positioning it. 4K is “more pixels” – four times as many as HD – whereas HDR is “better pixels”. There is obviously much more to it than that.

HDR is short for high dynamic range, which implies that you are currently watching standard dynamic range. It is impossible to show you what it truly looks like – your monitor is not capable of HDR – but consider the simulated photo below (from Dolby), with HDR on the right side.
In essence, HDR is about brighter whites, deeper blacks, and more detail at each end of the range. HDR is about reproducing the world around us on a display. Current displays cannot reproduce the world as it really is because the world is more than just pixels; light is just as important. That might sound confusing, but we will get back to it.
Imagine being able to see bright sunlight reflections on metallic surfaces or all the stars in the sky on a perfectly black canvas, and even having your TV reproduce the colors of the world around you, such as Coca-Cola red (a color your current TV cannot reproduce).
There is quite a bit of confusion around HDR and for good reason. There are several players in the industry that are trying to make HDR happen, and you might already have heard about Dolby Vision. There is also an open HDR standard that has been adopted by Blu-ray and other distribution channels. TV manufacturers have come up with even more names.
For example, Samsung calls its HDR-capable TVs “SUHD” and refers to the system that enables it as “Peak Illuminator”. Panasonic refers to it simply as HDR but calls a panel that supports it a “Super Bright Panel”. Sony also refers to it as HDR, but to be sure you should look for “X-tended Dynamic Range” on the specification sheet. All of this is just marketing. Other players such as Dolby are talking about Dolby Vision, which actually has more elements to it than just HDR.
However, the standards are almost in place and we can now start talking about how the market will approach HDR.
Let’s take a few steps back and look at HDR from a more fundamental level. The next section is quite technical but you don’t have to understand every nuance. We will repeat the important things later.
A completely new foundation - a technical look
The subject of gamma and light is beyond the scope of this article, but to fully comprehend HDR it is important to understand that most of the picture standards for today’s TVs were developed based on CRT (cathode ray tube) displays. We have yet to define fundamental standards for digital displays.

Today’s TVs use an EOTF (electro-optical transfer function) to convert an input signal into visible light (and subsequently an image), and this system still relies on the characteristics of analog CRT displays: the so-called gamma curve. This is why displays use a gamma function (typically 2.2 or 2.4). We often refer to this gamma curve in our reviews.
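To make the gamma curve concrete, here is a minimal sketch in Python of how a display turns a normalized signal value into light with a simple power-law gamma EOTF (the 100-nit peak is an illustrative value, not a fixed rule):

```python
def gamma_eotf(signal, gamma=2.4, peak_nits=100.0):
    """Convert a normalized signal (0.0-1.0) to light output in nits
    using a power-law gamma curve, the legacy of CRT displays.
    peak_nits is illustrative; SDR assumes roughly 80-120 nits."""
    return peak_nits * (signal ** gamma)

# A 50% signal does not give 50% light; the curve devotes more of the
# signal range to dark tones, mirroring how a CRT behaved.
print(gamma_eotf(0.5))  # ~18.9 nits
print(gamma_eotf(1.0))  # 100.0 nits (reference white)
```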
However, today’s LCD and OLED display technologies are capable of more than CRTs, and with HDR it looks like we will finally get display standards based on the characteristics of the human eye instead of the limitations of an old analog display technology.
Before we get to that, consider the following. Movies and TV shows are created and graded based on standards that assume a maximum brightness level (white) of around 80-120 nits (cd/m2) and a minimum (black depth) of around 0.05 cd/m2 for living room TVs (around 48 nits maximum for cinema). Absolute black is zero, and the best consumer displays, such as OLED, can reach it. Modern TVs can also go way beyond 80-120 nits of maximum brightness, which means that most TV manufacturers have tried to “enhance” the picture in numerous ways, not unlike how they try to “enhance” motion. Content creators hate it, but the point is that our displays are capable of more than the standards allow.
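As a quick back-of-the-envelope check on those numbers (taking 100 nits as white, a value within the 80-120 nit range above):

```python
import math

sdr_white = 100.0  # nits, within the assumed 80-120 nit range
sdr_black = 0.05   # nits, the assumed black depth for living room TVs

contrast = sdr_white / sdr_black
stops = math.log2(contrast)  # dynamic range in photographic stops

print(f"{contrast:.0f}:1 contrast")           # 2000:1
print(f"{stops:.1f} stops of dynamic range")  # ~11.0 stops
```

The real world spans far more than those 11 stops, which is exactly the gap HDR is trying to close.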
Unfortunately, most people associate “high brightness” on displays with bad things due to how TV manufacturers have approached it in the past. We often hear questions like “is HDR kind of like the dynamic mode on my TV?” Forget those thoughts for now and consider that a typical day with thin clouds equals something like 4,000-7,000 nits, and a sunny day has an ambient light level of over 30,000 nits. Direct sunlight is even more extreme. We obviously don’t want to have to wear sunglasses in front of the TV, but if we want to recreate the real world on a display there is no other way: we need higher brightness. Also, remember that the human eye dynamically adapts to the light in our environment by closing and opening the pupil. That is how human vision adjusts to daytime and nighttime.
So how much brightness do we need? That is a subject of debate. Dolby believes that we need a dynamic range of 0 to 10,000 nits, even though its Dolby Vision format usually has a lower maximum. The Blu-ray Disc Association recommends that “over 1000 nits should be limited to specular highlights”. Below you see the results of Dolby’s research.
The challenge is that our current gamma-based EOTF cannot accommodate that. We need a new EOTF, a new way of converting a digital signal into visible light, one that takes the dynamic nature of human vision into account. This method has been dubbed “Perceptual Quantizer”, or PQ, by Dolby. It is a completely new way of defining light in a digital display, and it allows us to finally leave the analog legacy behind. It is necessary for high dynamic range, which in turn is necessary if we want to improve picture quality.
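For the curious, the PQ curve is published as a standard and fits in a few lines. Here is a sketch of its EOTF, mapping a normalized 0-1 signal to absolute luminance up to 10,000 nits, using the constants from the standard:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: map a normalized signal (0.0-1.0) to
    absolute luminance in nits, up to 10,000. Constants per the standard."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    e = signal ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

print(pq_eotf(0.0))  # 0 nits: absolute black
print(pq_eotf(0.5))  # ~92 nits: half the signal range still covers "SDR" light
print(pq_eotf(1.0))  # 10000 nits: the peak of the PQ container
```

Note how half of the signal range is spent below roughly 100 nits; that is the “perceptual” part, allocating precision where the eye is most sensitive.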
These principles have been adopted in the “SMPTE ST 2084” standard, a format sometimes referred to as the HDR EOTF or PQ EOTF. Another standard, “SMPTE ST 2086”, describes the mastering display metadata that travels with an HDR signal, so that both HDR and non-HDR TVs can handle it. It obviously follows that an HDR TV needs to support the PQ format. The content/distribution system needs to support it, too.
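As a rough illustration of what that metadata contains, here is a sketch of the kind of information SMPTE ST 2086 describes (the field names are ours, not the standard’s exact syntax):

```python
from dataclasses import dataclass

@dataclass
class MasteringDisplayMetadata:
    """Sketch of the information SMPTE ST 2086 carries with an HDR signal:
    a description of the display the content was graded on, so a TV can
    map the content to its own capabilities. Field names are illustrative."""
    red_primary: tuple     # CIE xy chromaticity of the red primary
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_luminance: float   # peak of the mastering display, in nits
    min_luminance: float   # black level of the mastering display, in nits

# Hypothetical example: content graded on a 1,000-nit monitor with DCI-P3 primaries
meta = MasteringDisplayMetadata(
    red_primary=(0.680, 0.320),
    green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060),
    white_point=(0.3127, 0.3290),
    max_luminance=1000.0,
    min_luminance=0.005,
)
```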
To make it possible we also need more options for defining “steps”, or grey tones. Dolby says that we need 12 bits per channel (36 bits in total), whereas the rest of the industry seems to think that 10 bits per channel (30 bits) is enough for now. As you probably know, today’s TVs typically use 8 bits per channel. If we really, really wanted to keep the old gamma EOTF, developed for analog CRTs, we would need something like 14 or even 16 bits for HDR. That is impractical. The PQ EOTF makes better use of the available bits.
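The difference in available steps is easy to quantify:

```python
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} steps per channel")

# 8-bit:  256 steps  -> fine for SDR's 0-100 nit range
# 10-bit: 1024 steps -> the industry's baseline for HDR
# 12-bit: 4096 steps -> what Dolby argues for
```

Spreading 256 steps across 0-10,000 nits would produce visible banding; the PQ EOTF instead spends its steps where the eye notices differences, which is why 10 or 12 bits can be enough.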
If you want to learn more, see SMPTE’s video on YouTube.
HDR is helping to lay a new foundation for digital video systems. That alone should be enough to love it, right?
There is more to “HDR”
As discussed in the previous sections, HDR requires higher brightness levels, higher bit depths for color, and the new PQ format. An interesting thing to observe is that the industry intends to do more. With higher brightness come better colors; or to be precise, the possibility of a wider color gamut. A wider color gamut is not an element of “HDR” per se, but when most people in the industry say “HDR” they typically mean better colors, too.

The same can be said for 4K Ultra HD resolution. HDR works with HD resolution, sure, but no one seems interested in making that happen, so when you hear “HDR” it will usually imply HDR in 4K resolution.
As said, these things are not actually part of “HDR”, but the industry seems to be taking the step first to DCI-P3 and later to BT.2020 with the introduction of HDR. And that is amazing! Full BT.2020 coverage will likely take some years to achieve; as the name suggests, it is actually a recommendation for the year 2020. But the industry is moving forward, and it looks like we could see the first high-end TVs with full support quite soon.
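To give a sense of scale, here is a back-of-the-envelope comparison of the triangles the three sets of primaries span in CIE xy space (the primaries are the published ones; triangle area is a crude proxy for gamut size, not a rigorous metric):

```python
def triangle_area(p1, p2, p3):
    """Area of the triangle spanned by three CIE xy points: a crude
    proxy for gamut size, good enough for a rough comparison."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Red, green, blue primaries as published in each standard
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

base = triangle_area(*rec709)
print(f"DCI-P3 vs Rec.709:  {triangle_area(*dci_p3) / base:.2f}x")   # ~1.36x
print(f"BT.2020 vs Rec.709: {triangle_area(*bt2020) / base:.2f}x")   # ~1.89x
```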
Read our backgrounder on Ultra HD & color spaces
Below you see a "normal HD Blu-ray" (top, with pause logo) vs. "HDR + wider color gamut (DCI)" (bottom, without pause logo) on Samsung's "SUHD" JS9500 TV. Your current display cannot reproduce DCI or HDR, so the examples are not fully representative, but they should give you an idea. Click on the photos to zoom.
So that was the technical side of things. Let us get back to the real world again.
HDR in the real world - you need a new TV
First of all, HDR is not a gimmick. It is a real improvement, so the industry has a responsibility. You will surely see some manufacturers try to take advantage of the hype and “upscale” normal content to HDR. Heck, it is even happening today to some degree with the “Dynamic” or “Vivid” modes on TVs.

In order to get real HDR, the industry needs to implement it in every link of the chain. The camera needs to capture HDR, studio grading needs to be done in HDR, the distribution channels need to support HDR, and your TV needs to support HDR (not just imitate it). We will focus on the last two links.
TVs obviously need to be able to output a higher brightness level but also a very low black level. Plasma TVs were not able to do that, but LCDs and OLEDs are. On an LCD you will need a “local dimming” system to control brightness locally in zones; the more zones, the better. We have already seen edge-lit LED LCD TVs claim HDR support, but in our opinion that is stretching it. Ideally you would want to be able to control light output from 0 nits up to a maximum brightness of 800-1000 nits (or much higher for Dolby Vision, typically 4,000-10,000) in every single pixel. Does that sound familiar? Yes, that is how OLED displays work.
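To illustrate why the zone count matters, here is a toy model (ours, not any manufacturer’s algorithm) of a single local dimming zone. Every pixel in the zone shares one backlight level, and an LCD panel leaks a little light even when a pixel is “closed”:

```python
def render_zone(backlight_nits, pixel_values, leakage=0.001):
    """Toy model of one local dimming zone: pixels modulate a shared
    backlight, and the panel leaks a little light even at 'black'.
    All numbers are illustrative, not measured from any real TV."""
    return [backlight_nits * max(v, leakage) for v in pixel_values]

# A zone with a specular highlight (1.0) next to black pixels (0.0):
# the backlight must stay high for the highlight, so the blacks glow.
print(render_zone(1000.0, [0.0, 0.0, 1.0]))  # [1.0, 1.0, 1000.0]

# An all-dark zone can dim its backlight and get close to true black.
print(render_zone(5.0, [0.0, 0.0, 0.0]))     # [0.005, 0.005, 0.005]

# A self-emissive display such as OLED is, in effect, one zone per pixel.
```

The fewer the zones, the larger the area that has to compromise between highlights and blacks; with edge LED there are very few zones, which is why we consider such claims a stretch.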
You will not be able to experience HDR on your current TV. You need a TV that is able to output a higher brightness level. The TV also needs to support the new PQ format, as discussed above.
TVs that claim HDR support:
These are all high-end TVs, and 2015 is the first year of HDR. As always, you can expect high-end features to appear in mid-range TVs after a year or two. There are still many issues to overcome going forward, and companies such as Technicolor are trying to bring HDR to set-top boxes and TV broadcasters. As said, there are several approaches to HDR right now; besides Dolby Vision and the “open” version that is mandatory on Ultra HD Blu-ray (formats such as Dolby Vision are optional for Ultra HD Blu-ray player manufacturers), Philips and Sony have been looking into it, too.
The UHD Alliance (link 1, 2) is currently trying to bring order to the industry so everyone moves in the same direction. That does not necessarily mean a single HDR specification but the UHD Alliance wants to unite efforts. Great initiative by the way.
You might be wondering what will happen if you try to play an HDR movie on a non-HDR TV. Well, the industry has actually come up with a great solution. In that case you will simply get the regular SDR (standard dynamic range) picture. The HDR information is layered on top of the signal as an extra package and signaled to the TV with metadata (this requires HDMI 2.0a). If the TV does not support HDR, it will simply ignore the extra package.
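In simplified terms, the behavior described above looks something like this (a sketch of the logic only, not a real HDMI or player API):

```python
def present(signal, tv_supports_hdr):
    """Sketch of the fallback described above: the HDR layer and its
    metadata ride along with the SDR base, and a TV that does not
    understand them simply ignores them. Not a real API."""
    base = signal["sdr_base"]
    if tv_supports_hdr and "hdr_layer" in signal:
        # HDR TV: combine base + extra package, guided by the metadata
        return f"HDR picture ({base} + {signal['hdr_layer']})"
    # Non-HDR TV: the extra package is ignored; plain SDR is shown
    return f"SDR picture ({base})"

movie = {"sdr_base": "SDR video", "hdr_layer": "HDR enhancement layer"}
print(present(movie, tv_supports_hdr=True))
print(present(movie, tv_supports_hdr=False))
```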
Hollywood studios have started preparing several titles for release in 4K HDR, and the new Ultra HD Blu-ray format will obviously support it.
How to watch HDR content:
Learn more in our 4K / HDR section
HDR is a more important development than most people realize, so we suggest that you consider it for your next TV purchase. So, there you have it. Excited yet?