What becomes of skin tones when a clip that is mastered to 1,000 nits is viewed on a 4,000-nit display?

a) Luminance levels will all remain unchanged

b) Luminance levels will all increase exponentially

c) Skin tones will remain unchanged, highlights and shadows will expand and fur will become more saturated

d) The picture will fry the viewer’s eyeballs

e) None of the above

According to the Ultra HD Forum, skin tones should be rendered at the same absolute luminance:

“The PQ signal is “display-referred”, meaning that the pixel-encoded values represent specific values of luminance for displayed pixels. The intent is that only the luminance values near the minimum or maximum luminance capability of a display are necessarily adjusted to utilize the available dynamic range of the display. Some implementations may apply a “knee” at the compensation points in order to provide a smoother transition from the coded values to the display capabilities; e.g., to avoid “clipping”.

When default display settings are engaged, PQ enables pixel values in the mid-range, including skin tones, to be rendered on a display at the same (absolute) luminance level that was determined at production.

For example, if a scene was graded on a 1000 cd/m2 grading monitor and then displayed on a 4000 cd/m2 display, the skin tones can be rendered at the same luminance values on the 4000-nit display as on the 1000-nit monitor per the grader’s intent, while the speculars and darker tones can be smoothly extended to take full advantage of the 4000-nit display.”
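That passage can be sketched in code. Below is a minimal Python rendition of the ST 2084 (PQ) EOTF and its inverse, using the constants published in the standard. The point is that a pixel graded at, say, 100 nits encodes to the same signal value, and decodes back to the same absolute luminance, on any compliant display, regardless of that display's peak.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF and its inverse.
# The constants are the published ST 2084 values.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32     # 18.8515625
c3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(e):
    """Non-linear signal e in [0, 1] -> absolute luminance in cd/m2 (nits)."""
    ep = e ** (1 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

def pq_inverse_eotf(nits):
    """Absolute luminance in cd/m2 -> non-linear signal in [0, 1]."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# A skin-tone-level pixel graded at ~100 nits encodes to a signal of ~0.508
# (10-bit code ~520) and decodes back to ~100 nits on any compliant PQ
# display; the display's peak only matters above its own capability.
signal = pq_inverse_eotf(100.0)
```

Note that the mapping is fixed by the curve itself, which is exactly why mid-range values such as skin tones survive a move from a 1,000-nit monitor to a 4,000-nit display untouched.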

Regrettably, we don’t have a 4,000-nit Dolby Pulsar lying around, but we did manage to compare the brightness levels of a few videos on our MacBook Pro and iPhone 12 Pro Max. First, with True Tone and Auto Brightness disabled, we toggled between the display presets Apple XDR Display (P3-1600 nits) and HDR Video (P3-ST 2084) on the MacBook Pro (2021) while viewing YouTube HDR10 and Netflix Dolby Vision content. Skin tones and highlights both became brighter and blacks became crushed when switching from P3-ST 2084 to P3-1600 nits. Comparing the MacBook Pro in reference mode P3-ST 2084 to the iPhone 12 Pro Max, highlights and skin tones on the 1,200-nit 6.7-inch OLED display were marginally brighter.

It’s possible that in every case highlights are being stretched more than skin tones, but that’s hard to quantify, especially as it’s still daylight here – so don’t be surprised if we revise everything when we run the same tests again this evening! he he That’s one of the reasons we tend to grade on the dark side. It’s also a safe bet that many viewers would consider the brighter, tone-mapped version on their Android or Apple phone more impactful.

Further on, the document reads,

“Note that future displays may become available with higher peak brightness capability compared with those available today. Content that is expected to be of interest to consumers for many years to come may benefit from retaining an archive copy coded in “absolute values” of light (e.g., PQ) or original camera capture format (e.g., RAW, log) so that future grades of higher luminance range can be produced and delivered to viewers.”

Meanwhile, many devices ignore the metadata we insert to preserve the creator’s intent altogether, so admittedly, it’s all a little confusing!
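For the curious, here is one way the “knee” mentioned in the quote above might be sketched. This is a hypothetical Reinhard-style roll-off, not anything from a standard: the knee position and curve shape are illustrative choices. Mid-range values pass through untouched, and only the headroom above the knee is compressed toward the display’s peak.

```python
def knee_tonemap(nits, display_peak=1000.0, knee_fraction=0.75):
    """Identity below the knee; smooth roll-off above it toward display_peak.

    knee_fraction and the rational roll-off are illustrative choices only.
    """
    knee = knee_fraction * display_peak
    if nits <= knee:
        return nits            # mid-range values (skin tones etc.) pass through
    headroom = display_peak - knee
    x = nits - knee
    # Reinhard-style segment: slope 1 at the knee, asymptote at display_peak,
    # so there is no visible "clipping" point.
    return knee + headroom * x / (x + headroom)
```

With the defaults above, a 100-nit skin tone is returned unchanged, while a 4,000-nit specular lands just below the 1,000-nit peak instead of clipping flat.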

6 thoughts on “What becomes of skin tones when a clip that is mastered to 1,000 nits is viewed on a 4,000-nit display?”


  1. This is an advanced topic, and really the only good HDR conversation on the internet.
    Several observations here:

    [quote]“The PQ signal is “display-referred”, meaning that the pixel-encoded values represent specific values of luminance for displayed pixels. When default display settings are engaged, PQ enables pixel values in the mid-range, including skin tones, to be rendered on a display at the same (absolute) luminance level that was determined at production.[/quote]
    ^^^100%

    [quote]Some implementations may apply a “knee” at the compensation points in order to provide a smoother transition from the coded values to the display capabilities; e.g., to avoid “clipping”.[/quote]

    This, and any time a “knee” is under discussion, sounds more like gamma, a scene-referred OETF, where the scene is expressed as E’, the IRE voltage. That is not how the PQ (ST 2084) EOTF works. This, together with Apple XDR Display (P3 1600 nits mode), would be, to give it a name, “high-brightness SDR.” Do you agree? It shouldn’t be true PQ, because tone mapping is not a reverse process like 1000 -> 4000; tone mapping compresses a larger space (4000) into a smaller space (1000). A 4,000-nit PQ display should display 1000 as 1000, not expand it to 4000, if it’s ST 2084 PQ. But high-brightness SDR (and again, that’s just a name I gave it) *could* expand to 1,600 nits (Apple XDR 1600 mode) or 4,000 nits (Pulsar), because it isn’t bound by the PQ curve, which hard-maps 10-bit code values 0–1023 to 0–10,000 nits. Again, the difference is in how gamma works versus how the EOTF works.

    There is a capability in DaVinci YRGB Color Managed mode that allows Resolve to master on a 1000-nit monitor, internally grade at a nominal 1000 nits with peak extensions to 4000 nits, and render the output at 2000 nits. Here is a YouTube demonstration video with all the settings: https://youtu.be/ep6JDT6zU-4 Note: YouTube only outputs HDR at 1000 nits, so I am also including a download link to the actual 2000-nit video: https://drive.google.com/file/d/1em4aezgDIbuncg-wCEHjLfKu3yG6DFCn/view?usp=sharing.
    With that video, I think your Apple XDR could display brightness equal to the 1600 mode in the ST2084 mode.

    1. Thanks so much, Tom. Maybe the document is referring to PQ without any metadata? Can’t wait to get home, watch the video and check out the download!

      1. It seems entirely possible. That may be an even better name for it, “PQ without any metadata.”

  2. What a mess of standards HDR is still, and as brighter screens are introduced it just seems to get more convoluted.

    Not to mention the marketing tricks. Why is a screen even allowed to claim they are 96% P3 – that basically equates to same as Rec 709 by my math.

    If most of the HDR signal is supposed to sit close to SDR, the brighter the peaks are being displayed the more ‘distracting’ they tend to become. (As an extreme example, I recall a fade-to-white scene in the cinema that completely pulled me out of the narrative/story.)
    I don’t see evidence of many colourists wanting to push beyond 800 nits for creative reasons, so what is the point of these brighter screens, unless it’s for outdoor viewing?

    1. “What a mess of standards HDR is still…”
      I read this all the time, but for the most part you’ve got HLG, Dolby Vision and HDR10. The viewer at home doesn’t have to lift a finger; the television switches between them automatically. Regardless of brightness, all displays are supposed to adhere to the same standards.

      “Why is a screen even allowed to claim they are 96% P3 – that basically equates to same as Rec 709 by my math.”
      If manufacturers really said that their television sets covered 96% of P3, then they should be commended for their honesty!

      “If most of the HDR signal is supposed to sit close to SDR, the brighter the peaks are being displayed the more ‘distracting’ they tend to become.”
      If specular highlights are distracting, that would be the fault of the cinematographer/colorist, not the PQ curve.
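The gamma-versus-PQ distinction Tom draws in the thread above can be sketched in a few lines of Python: with a relative gamma EOTF, the same signal value gets brighter as the display’s peak rises, while PQ names an absolute luminance that a compliant display reproduces unchanged, capping only at its own peak. Gamma 2.4 here is a loose stand-in for SDR behaviour; the scaling behaviour is the point, not the exact numbers.

```python
# Sketch: relative (gamma) versus absolute (PQ) decoding of the same signal.
# ST 2084 constants:
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def gamma_nits(signal, display_peak):
    """Relative: output luminance scales with the display's peak."""
    return display_peak * signal ** 2.4

def pq_nits(signal, display_peak):
    """Absolute: the signal names a luminance; the peak only caps it."""
    ep = signal ** (1 / m2)
    nits = 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return min(nits, display_peak)

s = 0.508  # roughly 10-bit code 520, graded at ~100 nits
# gamma: 16x brighter on a 1600-nit panel than on a 100-nit one;
# PQ: ~100 nits on a 1000-nit and a 4000-nit display alike.
```

Which is exactly why a display mode that brightens mid-tones when its peak rises is behaving like “high-brightness SDR” rather than like a PQ display.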
