You’ve probably seen a lot of “4K” and “UHD” being thrown around this year, which makes sense: they’ve been touted as the next frontier of video after 1080p and 1440p. Most current TVs support 4K UHD video, as do streaming devices (Roku 4K, Nvidia Shield, Chromecast Ultra, etc.), services like Netflix and Amazon, Blu-ray players, and even some laptops. Pretty soon, our smartphones and tablets will support 4K too. As the world inches closer to 4K, it’s time we understood what these terms really mean.
There are some common misconceptions about terms like 4K and UHD (Ultra High Definition), the biggest being that the two are synonymous. Not the case. 4K resolution is actually just one facet of UHD. What else is part of UHD? Wide color gamut (WCG), high dynamic range (HDR), 10-bit color, chroma subsampling schemes like 4:2:2, and more. If you’re just learning about these, here’s our rundown of the parts that make up UHD.
Ultra High Definition
The term UHD is derived from full HD, the standard we currently use for our video playback technologies. UHD, by itself, doesn’t define a single video resolution or color standard; it’s an umbrella for a combination of technologies. The terms 4K and UHD have been used pretty interchangeably, and we can thank the TV industry for the confusion. But think of UHD like a film: it’s the combination of cameras, lighting, talent, and script that makes the film what it is.
Simply put, consumer 4K is a resolution of 3840x2160, where the “4K” loosely refers to the roughly 4,000 pixels of width. This is four times the number of pixels of the previous 1920x1080, which means a significant jump in the nuance and detail of videos. You’ll also see it mentioned as 2160p (following the convention of 1080p), but 4K rolls off the tongue so well that it’s pretty much here to stay.
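The arithmetic behind that “four times the pixels” claim is easy to check:

```python
# Pixel counts for UHD 4K vs. full HD.
uhd_w, uhd_h = 3840, 2160
fhd_w, fhd_h = 1920, 1080

uhd_pixels = uhd_w * uhd_h   # 8,294,400 pixels
fhd_pixels = fhd_w * fhd_h   # 2,073,600 pixels

print(f"UHD:     {uhd_pixels:,} pixels")
print(f"Full HD: {fhd_pixels:,} pixels")
print(f"Ratio:   {uhd_pixels / fhd_pixels:.0f}x")  # 4x
```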
You might be wondering: is the image quality going to be THAT much better in 4K than in 1080? The short answer is yes, but it depends on your screen size and viewing distance. Obviously, the larger your screen, the more noticeable the difference. Play a 1080 video on a large screen and parts of it may look blotchy, especially up close. But with 4K, the image is so much sharper and more nuanced that you can use a larger screen, sit further back, and still have an image that’s crystal-clear and natural.
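To make “depends on your screen and distance” concrete, here’s a rough sketch of angular pixel density. The 65-inch TV (about 56.7 inches wide for a 16:9 panel) and the 9-foot viewing distance are illustrative assumptions; a common rule of thumb puts the limit of 20/20 visual acuity at roughly 60 pixels per degree, which is why 4K’s advantage shows up on big screens and at close distances.

```python
import math

def pixels_per_degree(horizontal_pixels, screen_width_in, distance_in):
    """Approximate angular pixel density for a viewer centered on the screen."""
    half_angle = math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return horizontal_pixels / (2 * half_angle)

# Hypothetical setup: 65" 16:9 TV (~56.7" wide), 9-foot (108") viewing distance.
width_in, distance_in = 56.7, 108
for pixels, label in [(1920, "1080p"), (3840, "4K")]:
    ppd = pixels_per_degree(pixels, width_in, distance_in)
    print(f"{label}: {ppd:.0f} pixels per degree")
```

Doubling the horizontal resolution doubles the pixels per degree at any given distance, so 4K lets you halve the distance (or double the screen width) before individual pixels become visible.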
Wide Color Gamut
Most of our current video content (standard HD video) is created and distributed in the REC.709 color gamut. But with UHD comes the next generation of color gamut, REC.2020, which offers an even larger spectrum of visible colors than REC.709. This means that monitors supporting REC.2020 can reproduce colors far more vividly, so much so that the gamut covers 75.8% of the CIE 1931 color space (roughly the mathematical limit of the colors humans can see). But don’t assume all UHD content will be delivered in the REC.2020 format. Videos have to be specifically graded for REC.2020, and if they’re not, you can assume they’re delivered in REC.709.
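For a rough sense of how much bigger REC.2020 is, you can compare the areas of the two gamut triangles on the CIE 1931 xy chromaticity diagram, using the primary coordinates published in the BT.709 and BT.2020 specifications. (The xy diagram isn’t perceptually uniform, so treat this as a ballpark comparison, not a precise perceptual measure.)

```python
# Shoelace formula for the area of a triangle given its vertices.
def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard R, G, B primaries on the CIE 1931 xy diagram.
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(*rec2020) / triangle_area(*rec709)
print(f"Rec.2020 triangle is about {ratio:.1f}x the area of Rec.709")
```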
High Dynamic Range
High dynamic range (HDR) is a technique borrowed from photography that heightens an image’s dynamic range, the contrast between the darkest blacks and brightest whites. For video, this means HDR allows your monitor to display finer increments of shading, making things like reflections much more detailed and accurate. Colors can be richer and more true to life. The gist of it is: the higher the dynamic range, the more lifelike things look. And with monitor screens reaching higher nits of brightness than ever before, HDR stands to really make things pop in UHD content.
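One way to quantify dynamic range is in photographic stops, each stop being a doubling of luminance. A minimal sketch, using illustrative panel numbers (not measured specs of any particular display):

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """Contrast expressed in photographic stops (doublings of luminance)."""
    return math.log2(peak_nits / black_nits)

# Hypothetical panels; the nit values below are assumptions for illustration.
sdr_stops = dynamic_range_stops(peak_nits=100, black_nits=0.1)
hdr_stops = dynamic_range_stops(peak_nits=1000, black_nits=0.05)
print(f"SDR-like panel: {sdr_stops:.1f} stops")
print(f"HDR-like panel: {hdr_stops:.1f} stops")
```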
10-bit Color
Bit depth represents the number of possible color combinations in a video signal. In recent history, most devices have captured and distributed content in 8-bit, which equates to 256 shades each of red, green, and blue. What 10-bit brings to the table is 1,024 shades of each.
8-bit: 256 x 256 x 256 = 16,777,216 possible color combinations
10-bit: 1024 x 1024 x 1024 = 1,073,741,824 possible color combinations
16 million sounds like plenty, but a billion color combinations mean far smoother gradients and less visible banding, giving the image a greater feeling of reality.
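The math behind those figures:

```python
def color_combinations(bits_per_channel):
    """Total RGB combinations for a given bit depth per channel."""
    shades = 2 ** bits_per_channel   # shades per channel (R, G, or B)
    return shades ** 3               # one value from each of the 3 channels

print(f"{color_combinations(8):,}")   # 16,777,216
print(f"{color_combinations(10):,}")  # 1,073,741,824
```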
Chroma Subsampling
We don’t need to capture the color of every single pixel, simply because our naked eyes usually can’t tell the difference anyway. So to save bandwidth, our current technology tosses out the color values of some pixels and still produces the colors we all know and love in our 1080 videos. This is represented by ratios like 4:2:2, 4:2:0, and 4:4:4.
Pretend we’re looking at a 4x4 matrix of pixels in an image. Each pixel carries 3 values: Y (luma, or brightness), Cb (blue difference), and Cr (red difference). In a 4:2:2 scheme, each row keeps 4 values of Y but only 2 values each of Cb and Cr, so about ⅓ of the color values have been omitted. Our devices take neighboring pixel colors and “guess” the missing values back in.
By contrast, 4:4:4 is considered the best color standard at the moment: every bit of color is retained. When it comes to 4K, it makes sense to pack in all of the original pixel colors, because the pixel count jumps to four times that of 1080. If our devices were still guessing color values on a 4K video, the inaccuracies would be much more noticeable and less real.
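Here’s a toy sketch of 4:2:2 subsampling and the nearest-neighbor “guess” on a single row of pixels. The pixel values are made up for illustration, and real codecs use more sophisticated reconstruction filters than copying a neighbor.

```python
# One row of (Y, Cb, Cr) pixels; the values below are arbitrary examples.
row = [
    (16, 128, 128), (35, 110, 140), (60, 100, 150), (80, 90, 160),
]

def subsample_422(pixels):
    """Keep every luma sample; keep chroma only on even columns, then
    reconstruct the missing chroma by copying the neighbor to the left."""
    out = []
    for i, (y, cb, cr) in enumerate(pixels):
        if i % 2 == 0:
            kept_cb, kept_cr = cb, cr     # chroma survives on even columns
        out.append((y, kept_cb, kept_cr))  # odd columns reuse the neighbor's chroma
    return out

print(subsample_422(row))
```

Note that all four Y values survive untouched; only the color information is thinned out, which is exactly why the eye rarely notices.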
What Does 4K Mean for Filmmakers?
Now that we’ve established that 4K is just one facet of the UHD umbrella, we want to know how 4K is going to benefit filmmakers. As any cinematographer will attest, filmmakers have been working with 4K, 6K and even 8K for years, downsampling to resolutions supported by playback devices. Shooting at immense resolutions helps ensure that a downsampled 4K or 1080 video looks flawless, and offers more freedom in production and post.
So filming in 4K is nothing new. But getting to monitor a 4K image can be a game changer, especially for crews like ACs and DITs. At four times the resolution of current monitors, 4K will offer much more information than ever before, allowing ACs to pull critical focus more accurately and notice even minor changes in focal distance. On a 1080 monitor, you don’t get a proper representation of the sharpness of the image. 4K will enhance this significantly.
Some DITs out there already have a 4K-compatible monitor on their cart, mostly for their own monitoring. With 4K monitoring, DITs can work with directors and ACs to monitor something much closer to the final image of the film, footage they also offload to the lab for post. DITs could even distribute 4K feeds throughout the set.
Then, of course, there is general monitoring by people like the DP and director. Since they’re in charge of the looks, it’s critical that they see the footage in the truest possible way. The closer to the representation of the final version (resolution & colors), the more it helps them creatively in strategizing their shots.
This isn’t to say that 4K monitoring is something novel, but in the current ecosystem, 4K monitors on set are few and far between. That’s because 4K monitoring systems for productions are simply inefficient and expensive. The cost of upgrading a standard film set to 4K monitoring would be immense, and there just isn’t a way to justify it. Not to mention the amount of cabling that would be required to get this done effectively.
4K monitoring will see mass adoption on film sets if it can be achieved efficiently and cost effectively. This will require a massive push towards more utility, more flexibility and lower cost. There’s one compelling way to make 4K monitoring more efficient on set: wireless. Going wireless eliminates all of the cables needed to deliver 4K video to receivers, which will be essential for all shoots from studios to outdoors.
Will we see these kinds of developments soon? The benefits are very clear, but the technology must catch up if we are to make these feasible for filmmakers. But it’s worth keeping an eye out, because much like a new generation of smartphones, 4K wireless monitoring aims to transform the filmmaking world.