Panel tech is rapidly improving. My LG OLED was a revelation, not only for the perfect black levels, but for the incredible color accuracy as well. The increased local contrast also gives the impression of much greater sharpness – I can see details that aren’t visible on my 5K iMac. Black levels on the iMac are a weak 0.5 nits, just a dull grey, whereas OLED blacks are below 0.0005 nits.
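To put those black levels in perspective, here’s a rough back-of-the-envelope comparison of full-on/full-off contrast ratios. The 500-nit peak white is an assumed figure for illustration, not a measurement of either display:

```python
# Rough contrast-ratio comparison. The 500-nit peak white is an assumed
# figure for illustration; the black levels are the ones quoted above.
def contrast_ratio(peak_white_nits: float, black_level_nits: float) -> float:
    """Simple full-on/full-off contrast ratio."""
    return peak_white_nits / black_level_nits

imac_cr = contrast_ratio(500, 0.5)      # LCD with ~0.5 nit blacks
oled_cr = contrast_ratio(500, 0.0005)   # OLED with ~0.0005 nit blacks

print(f"iMac LCD: {imac_cr:,.0f}:1")    # ~1,000:1
print(f"OLED:     {oled_cr:,.0f}:1")    # ~1,000,000:1
```

Even if the exact numbers are off, the point stands: the OLED’s blacks are roughly a thousand times deeper, which is a big part of why it reads as sharper.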
I ordered the Asus ProArt 32UCX as an HDR grading monitor – a 55” television set is too large for me to work with comfortably in a small room, though high-end studios use them, and more and more enthusiasts do, too.
The Asus has 1,152 local dimming zones and boasts 1,400 nits of brightness: that’s 700 nits more than my television set! By way of comparison, your average movie theater screen is somewhere in the neighborhood of 50 nits. On an SDR display, sunlight glaring off the hood of a waxed automobile is reproduced at exactly the same brightness as a white sheet of paper. When 10,000-nit televisions are commonplace, I imagine viewers will squint at bright flashes of light in a darkened room.
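Here’s a toy sketch of why SDR flattens those specular highlights: anything in the scene brighter than the display’s peak simply clips to the same white. The peak values (100 nits for SDR, 1,400 nits for HDR) and scene luminances are illustrative assumptions, and real pipelines tone-map more gently than a hard clip:

```python
# Toy illustration: scene luminance above the display's peak clips to peak.
# Peak values (100 nits SDR, 1,400 nits HDR) are illustrative assumptions.
def displayed_nits(scene_nits: float, display_peak_nits: float) -> float:
    """Naive clip to the display's peak; real pipelines tone-map more gently."""
    return min(scene_nits, display_peak_nits)

sheet_of_paper = 80        # diffuse white, roughly
sun_glint_on_hood = 5000   # specular highlight, roughly

for peak in (100, 1400):
    paper = displayed_nits(sheet_of_paper, peak)
    glint = displayed_nits(sun_glint_on_hood, peak)
    print(f"{peak:>5} nit display: paper={paper} nits, glint={glint} nits")
```

On the 100-nit display the glint ends up barely brighter than the paper; on the 1,400-nit display it’s over seventeen times brighter, which is the whole appeal of HDR highlights.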
A problem with these high-brightness FALD LED displays is that they draw a ton of power: I believe mine uses as much as 220 watts. They require efficient cooling, which I’m assuming accounts for the noise. It isn’t just that the monitor itself is expensive: it also requires a $1,000 I/O device to work with a Mac. Still, it was only a couple of years ago that Mystery Box was recommending the $3,700 SmallHD P3X, a 17” HD monitor, as a ‘budget’ solution for HDR grading.
A couple of weeks ago, it was announced that Netflix now requires all productions to be shot in HDR. Since they already dictate log or RAW capture – which are in fact HDR – I’m guessing this pertains to production design and lighting (?), but I’m not really sure, and I don’t understand how that could be regulated. The original Chef’s Table is just one example of outstanding cinematography combined with HDR.