When Dolby conducted a study of viewer preferences for the luminance of specular highlights and emissive sources, to determine the dynamic range needed to display HDR content, it found that the maximum highlight luminance satisfying 50% of viewers was ~2,500 cd/m², rising to just over 20,000 cd/m² when catering to 90%. Fewer than 20% of participants in the study were happy with 1,000 cd/m².
The extended brightness of HDR is not meant to increase the overall light level of the entire picture but to allow headroom for specular highlights. The APL (average picture level) of HDR movies is usually the same as, or often even lower than, that of SDR content, which is why we don't need to jump for the remote to adjust brightness when switching between SDR and HDR content on Netflix or YouTube. A picture that is too bright causes eye strain, especially since HDR content is intended to be watched in a dark viewing environment. Making the entire image brighter would be perceived as poor quality and would also risk triggering the ABL (automatic brightness limiter) on OLED displays, which can only sustain peak brightness while the average picture level stays below roughly 20%. Unlike SDR, PQ HDR is an absolute standard: each signal value maps to a fixed luminance, so it's not possible to simply raise brightness to accommodate ambient lighting.
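Because PQ is an absolute mapping, defined by SMPTE ST 2084, the relationship between signal level and on-screen luminance can be computed directly. The following is a minimal Python sketch of the PQ EOTF and its inverse using the published ST 2084 constants (the function names are mine, not from any particular library):

```python
import math

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal in [0, 1] to absolute luminance in cd/m² (nits)."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Map absolute luminance in cd/m² back to a normalized PQ signal."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Diffuse white at the recommended 203 nits sits at roughly 58% of the
# PQ signal range; everything above that is headroom for highlights.
print(pq_inverse_eotf(203.0))
```

Note that the curve tops out at an absolute 10,000 cd/m² regardless of the display, which is exactly why the signal cannot be "turned up" for a bright room the way SDR gamma can.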
In addition to the technical limitations, there is also an aesthetic element: as you raise the APL, there is less and less headroom for specular highlights, and they lose their impact. This is why you'll often see diffuse whites in dramas graded lower than the recommended 203 nits. Exceptions to the rule are when filmmakers use the extended brightness of HDR as a character goes from a dimly lit interior to the sunny outdoors, a technique that is not used nearly often enough, or to intentionally create discomfort in the viewer (e.g. flashing strobe lights).