In part one of this series, I discussed some common misconceptions about gamma and high dynamic range content. In this part, I look at another area of confusion, brightness, and then discuss emerging HDR standards and offer tips for adjusting HDR content using a waveform monitor.
Sometimes high dynamic range is confused with high brightness. The two are not the same. For example, a dark movie environment can have high dynamic range: in a good theater, peak brightness may be only 48 nits, but the minimum can be well below 0.03 nits. That is a wide dynamic range. Conversely, a display can be very bright, hundreds or even thousands of nits, yet deliver a relatively poor displayed dynamic range. Turn off the TV: if the screen looks light gray because of a bright viewing environment, that gray is the bottom of your dynamic range, since a TV cannot produce negative light. That said, bright screens are genuinely beneficial in bright viewing environments.
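To make the distinction concrete, here is a minimal sketch that expresses displayed dynamic range in photographic stops (doublings of light) for the two situations above. The 1-nit ambient-lifted black level for the bright TV is an assumed illustrative figure, not a measured one.

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """Displayed dynamic range in photographic stops (doublings)."""
    return math.log2(peak_nits / black_nits)

# Dark cinema: a modest 48-nit peak but a very deep black floor.
cinema = dynamic_range_stops(48, 0.03)      # ~10.6 stops

# Bright living-room TV: 1000-nit peak, but ambient light lifts the
# effective black of the screen to an assumed ~1 nit.
bright_tv = dynamic_range_stops(1000, 1.0)  # ~10.0 stops

print(f"cinema: {cinema:.1f} stops, bright TV: {bright_tv:.1f} stops")
```

Despite a peak more than 20 times higher, the bright display in this sketch delivers slightly *less* dynamic range than the dim theater.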
The idea that making displays brighter always gives higher dynamic range is misleading for another reason. You cannot increase the dynamic range of audio simply by turning up the volume, because the noise comes up with the signal. The same is true for video, where the "noise" is quantization noise: the steps between quantization levels become visible, typically as banding or contouring. To achieve HDR, the available bit-depth must be used more efficiently while keeping average brightness about the same as an SDR image. That means allocating more code values to low light levels, where the eye is more sensitive, and fewer to high-brightness areas, where the eye cannot see the contouring. In other words, we need a Perceptual Quantizer, or PQ, that does a better job than the implicit perceptual quantization of the current BT.709 gamma.
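A quick back-of-the-envelope sketch shows why naive linear quantization fails across a 10,000-nit range. The 1-2% visibility threshold used in the comments is an assumed round figure for the eye's contrast sensitivity, not a formal Barten computation.

```python
# Relative (Weber) step size of a linear 10-bit encoding of 0-10,000 nits.
# Banding tends to become visible when one code-value step exceeds roughly
# 1-2% of the local luminance (an assumed round figure for illustration).
PEAK_NITS = 10_000
STEP = PEAK_NITS / 1023          # ~9.78 nits per linear code value

for level in (10, 100, 1000, 10_000):
    print(f"{level:>6} nits: step is {100 * STEP / level:.2f}% of the signal")
```

In the shadows at 10 nits a single linear step is almost 100% of the signal, guaranteeing visible contouring, while near peak it is a wastefully fine 0.1%. A perceptual quantizer redistributes exactly this imbalance.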
Evolving HDR Standards
SMPTE has standardized ST 2084, the PQ curve developed by Dolby and used in Dolby Vision, for mastering reference displays. Its PQ is based on the Barten model of contrast sensitivity, with an EOTF that is the inverse of the OETF. Although the standard covers 0.001 to 10,000 nits in 10 bits, the best current HDR displays peak at about 4,000 nits.
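For reference, the ST 2084 transfer function can be sketched directly from the constants published in the standard. This is a minimal illustration of the encode/decode pair, not a production implementation (it ignores narrow-range coding and per-channel handling).

```python
# SMPTE ST 2084 (PQ) encode/decode, normalized to 10,000 nits.
# Constants are the published ST 2084 values.
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_encode(nits):
    """Absolute luminance (nits) -> non-linear PQ signal in [0, 1]."""
    y = (nits / 10_000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_decode(signal):
    """Non-linear PQ signal in [0, 1] -> absolute luminance (nits)."""
    p = signal ** (1 / M2)
    return 10_000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_encode(100))   # ~0.508: 100-nit SDR peak sits near mid-scale
print(pq_decode(1.0))   # 10000.0: full code maps to the 10,000-nit ceiling
```

Note how roughly half the code range is spent below 100 nits, which is exactly the bit allocation toward dark levels that the previous section argued for.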
Another approach, standardized as ARIB STD-B67 and developed by the BBC and NHK, uses a hybrid log-gamma (HLG) curve. It extends log processing into the high-brightness peaks to mitigate blown-out or clipped whites, while seamlessly retaining power-law processing in the blacks as in the BT.709 and BT.2020 standards, but without the linear segment. The standard allows the display's EOTF to adjust system gamma to correct for surround illumination in the range of 10 to 500 nits.
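The hybrid construction can be seen in a short sketch of the ARIB STD-B67 OETF: a square-root (power-law) segment in the lower range joins a logarithmic segment for the highlights, with the constants chosen so the two halves meet smoothly at the knee.

```python
import math

# ARIB STD-B67 Hybrid Log-Gamma OETF: square-root below the knee,
# logarithmic above it. E is normalized scene linear light in [0, 1].
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # power-law (gamma-like) toe
    return A * math.log(12 * e - B) + C   # log compression of highlights

print(hlg_oetf(1 / 12))   # 0.5: the knee sits at half signal
print(hlg_oetf(1.0))      # ~1.0: peak scene light maps to full signal
```

Because the lower half behaves like a conventional gamma curve, an HLG signal remains broadly watchable on an SDR display, which is the compatibility argument behind the BBC/NHK design.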
Additional proposed standards from Philips and Technicolor are in the process of being combined. The Philips proposal is parameter-based: it embeds low-bit-rate HDR and SDR conversion parameters in the metadata, which are extracted during decoding and used to tune the display for its peak luma. The Technicolor video mastering and distribution workflow allows both an HDR and an SDR master to be graded, which is vital for maintaining the artistic intent of the image.
The Academy Color Encoding System (ACES), while not an HDR format per se, supports wide dynamic range and wide color gamut throughout the color workflow, using 16-bit floating-point encoding that spans more than 30 stops, along with a 10-bit proxy output.
At this time, these standards and other proposals continue to evolve to help define the HDR workflow from the camera to the home.
Adjusting HDR Gamma
Modern cameras are capable of capturing a wide dynamic range, but SDR displays will either clip or blow out the highlights in those images. This has led to non-linear processing such as S-Log2, ST 2084 PQ, and HLG, which use the available bits more efficiently during capture and, in turn, allow HDR displays to use them more effectively. When capturing an image, it is important to set the camera's white point and 18% grey level on a waveform monitor to ensure correct processing of the signal through the chain.
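As a point of reference for where those line-up levels sit on a waveform, the sketch below evaluates the familiar BT.709 OETF at 18% grey and 90% reference white. These are the classic BT.709 line-up points; log and PQ curves such as S-Log2 or HLG place the same references at different signal levels, which is precisely why the waveform check matters.

```python
def bt709_oetf(l):
    """BT.709 OETF: scene linear light in [0, 1] -> signal in [0, 1]."""
    if l < 0.018:
        return 4.5 * l                   # linear segment near black
    return 1.099 * l ** 0.45 - 0.099     # power-law body

# Where the two line-up references land on a 0-100% waveform scale:
print(f"18% grey:  {100 * bt709_oetf(0.18):.0f}%")   # ~41%
print(f"90% white: {100 * bt709_oetf(0.90):.0f}%")   # ~95%
```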
A waveform monitor was used to assist in capturing the camera RAW image for a 100-nit SDR display, as seen below. The image on the left was processed to approximate what the camera sees, except that the highlights in the sky are lost. The image on the right shows how a standard BT.709 gamma camera signal would appear on a calibrated BT.1886 SDR reference monitor. Note that in the right image the tree and bicycle handle appear darker, with more contrast than the actual scene at the camera, because the blacks have been stretched, even though the sky is still blown out and limited to 100 nits.
Figure 2. Comparison of a scene on an HDR monitor and on a reference BT.1886 monitor. Note that these images are simulated to show the differences.
In order to see the scene's highlights in the sky above 100 nits, it would be necessary to deliver the content in HDR while maintaining roughly the same average picture level as the SDR image on a BT.1886 monitor. However, should the HDR grey-scale below 100 nits match the camera scene, or should the blacks be stretched to look like the SDR BT.1886 display for compatibility?
Once the decision between a camera-side scene match and a compatible SDR BT.1886 display match is made, the next step is to determine the dynamic range of the target HDR monitor, as well as the level of reference white for the SDR image. These factors all need to be considered when calculating a table of conversion values or creating an HDR conversion LUT. A deeper look at this process is the topic for another article.
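To give a flavor of what such a conversion table involves, here is a deliberately simplified sketch of a 1D SDR-to-PQ LUT. It assumes an idealized BT.1886 display (a pure 2.4 gamma, 100-nit peak, zero black) and places SDR reference white at 100 nits; real conversions must also handle black lift, narrow-range coding, and the tone-mapping decisions discussed above.

```python
# Sketch: a 1D LUT mapping 10-bit SDR code values to 10-bit PQ codes,
# assuming a simplified BT.1886 display (pure 2.4 gamma, 100-nit peak,
# zero black) with SDR reference white placed at 100 nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance (nits) -> non-linear PQ signal in [0, 1]."""
    y = (nits / 10_000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def sdr_to_pq_lut(sdr_peak_nits=100, size=1024):
    lut = []
    for code in range(size):
        nits = sdr_peak_nits * (code / (size - 1)) ** 2.4  # display light
        lut.append(round(pq_encode(nits) * (size - 1)))    # re-quantize to PQ
    return lut

lut = sdr_to_pq_lut()
print(lut[0], lut[-1])   # 0 and 520: SDR white lands near mid-scale in PQ
```

Even this toy table makes the headroom visible: full-scale SDR white occupies only about half the PQ code range, leaving the upper half free for highlights above 100 nits.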
All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it’s the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content. During the transition period from SDR to HDR, a waveform monitor will play an important role to correctly capture camera RAW footage and to correctly balance images for both SDR and HDR.
Educated in England where he received an Honors degree in Communications Engineering from the University of Kent, Mike Waidson started his career with a consumer television manufacturer as a research engineer in the digital video department, before moving into the broadcast industry. Mike has over 30 years of experience within the broadcast industry working for various video manufacturers. At Tektronix as an application engineer within the Video Business Division, Mike provides technical support on video measurement products.