Just kidding! Before I delve further into the current monitor situation for HDR, let me first rattle off some of the specs of the 27″ Dell UP2718Q, talk a bit about why it’s so freakin’ awesome, then why you might not want to pull the trigger just yet, and lastly suggest some options that won’t require you to sell your kidney to pay the bills.
Dell proudly boasts that the UP2718Q is the first certified Ultra HD Premium monitor. What exactly does that mean? It means it meets criteria regarding resolution, screen brightness, bit depth, color gamut and so on, all of which were developed with television sets in mind, and some of which may not be necessary or even desirable for a reference monitor that you'll be seated three feet away from in a smoke-filled, dimly lit room decorated with heavy metal posters.
Now for the specs: the UP2718Q has an impressive 3840×2160 screen; a ridiculous 384-zone full-array local dimming backlight (which makes the monitor a little chunky compared to its svelte competitors, and probably accounts for a fair share of the hefty price tag); 1,000-nit peak brightness; hardware calibration; 10-bit color; numerous connections, including DisplayPort 1.4, which enables HDR support with high-end (read: expensive) graphics cards; an IPS panel; and pretty nifty color accuracy: 100% of sRGB, Adobe RGB and Rec. 709, 97.7% of DCI-P3 and 76.9% of Rec. 2020. Before you go ballistic over that last figure, no monitor can currently display 100% of Rec. 2020, and anything in the neighborhood of 77% is supposedly an altogether respectable result. The monitor also happens to be rather well-built.

While the unit itself is excellent, it's got downsides that have nothing at all to do with Dell (as far as I know): (1) it's not compatible with Apple hardware; (2) some of its behavior under Windows will be a little aggravating (if memory serves, having to do with how it handles non-HDR content and screen brightness); (3) PC graphics cards that support it will be terribly expensive; (4) HDR movie playback from streaming services such as Netflix, Amazon Prime and YouTube is not available for PC; and (5) HDR PC games are in short supply. Oh, and did I mention that the monitor retails for $1,500?
I'm writing this after having sat through hours and hours of podcasts and webinars, some of which lasted as long as an hour and a half, which went on and on about the benefits of HDR, introduced dozens of new terms and acronyms I'd never heard before, and didn't get to the 'what about the monitor?' part until the last ten minutes, when you'd learn that the industry standard, the one used by practically all the studios, is the Sony BVM-X300, which costs $45,000. What the heck!?
At this point, you might be wondering, why the push for HDR? And the short answer would be money – after all, these manufacturers have to sell their television sets! Seriously though, cinema cameras and even consumer cameras have long been able to shoot high dynamic range images (10-16 bit, 10+ stops of DR), but it’s only recently that display technology has followed suit. And now, reference monitors are playing catch up with premium TV sets. It couldn’t have been the other way around, because the displays used in monitors all roll off the same assembly lines as those used for consumer televisions, mobile phones, tablets and watches.
To make a long story short, I listened to a webinar given by Alister Chapman, someone with a fair amount of experience both behind the camera and as a colorist, who claims to have graded a project or two with the Ninja Flame, a 7″ external recorder/monitor that runs $800. When he took the finished work to Sony's Pinewood Studios and viewed it on the Sony BVM-X300, he says the grade was nearly spot-on. Not perfect, mind you, but very good. So while it's obviously not an ideal situation (I personally dislike editing on anything smaller than a 27″ monitor), anybody who can get hold of an Atomos or other HDR external monitor should be able to begin grading HDR right away.
While he hadn't done so himself as of the webinar, Chapman cautioned against using televisions as reference monitors because each one applies its own curves (if I'm remembering that correctly!) and auto-enhancement features to make its picture stand out. And while I've heard several colorists talk about picking up an OLED TV, I've yet to read about anyone's experience grading on one; if I'm not mistaken, they're mostly used to demo work to clients…
So there you have it: at least one monitor (the Dell) that I'd consider buying myself if I had a powerful PC; a number of budget options (an Atomos or SmallHD monitor); or the option of waiting it out until next year or beyond, when several other affordable monitors hit retailers' shelves. I've already put a deposit down on a Ninja Inferno, which is due to arrive in three weeks.