4K is Sometimes Actually 2K–But That’s OK

From time to time in our reviews of 4K/HDR home video releases, you may have stumbled across a phrase that seems downright perplexing: “Taken from a 2K digital intermediate.” It stands to reason, after all, that a video file that has spent some portion of its life at 2K resolution can’t really be considered 4K. Or can it?

 

This can be doubly confusing when the sentence before or after makes note of the film being shot “on ARRIRAW at 6.5K resolution” or something to that effect. That’s a whole lot of different Ks for a film that’s ostensibly being released in 4K (or, more accurately, “Ultra HD”) for home video. So, what exactly does all of this mean? And should you really care?

 

To get to the bottom of these questions, we need to back up and discuss how movies are shot, produced, and distributed. To keep the discussion as simple as possible, we’ll ignore films that are still captured on actual film stock and just focus on digital cinema, since that’s the way most movies (and TV shows) are shot.

 

Depending on the model of camera used, as well as other technical considerations, the resolution captured by these cameras generally ranges between 2K (2,048 × 858 or 2,048 × 1,152) and 6.5K (6,560 × 3,102), with a few other resolutions in between—like 2.8K (2,880 × 1,620) and 3.4K (3,424 × 2,202)—also commonly used. The “K” is short for “thousand,” and the abbreviation is simply a rough approximation of the horizontal pixel count of the resulting file.
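If you like seeing the arithmetic spelled out, here’s a quick, purely illustrative Python sketch of how those “K” labels map to pixel counts, using the capture resolutions quoted above:

```python
# Purely illustrative: how the "K" shorthand relates to horizontal resolution.
# These figures are the capture resolutions quoted above.
capture_formats = {
    "2K":   (2048, 1152),
    "2.8K": (2880, 1620),
    "3.4K": (3424, 2202),
    "6.5K": (6560, 3102),
}

for label, (width, height) in capture_formats.items():
    print(f"{label:>4}: {width} x {height} "
          f"= {width * height / 1e6:.1f} million pixels "
          f"(~{width / 1000:.1f} thousand pixels across)")
```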

 

At any rate, no matter what resolution a film is shot in, the footage has to be reformatted to standard digital cinema projector resolutions, either 2K (2,048 × 1,080) or 4K (4,096 × 2,160), before being distributed to commercial movie theaters. But a lot more than that happens to most films before they’re released. They have to be edited and color timed, and with most blockbusters, special effects have to be rendered and composited into the footage that was shot on-set.

 

This work is time-consuming and expensive, and the higher the resolution at which the work is done, the costlier and more time-consuming it is. As such, due to budget constraints, release schedules, or in some cases simply preference, this work is usually done at 2K (2,048 × 1,080) resolution, the result of which is what we refer to as a 2K digital intermediate. This is the last step in the post-production process for most films, before their conversion to Digital Cinema Distribution Master (DCDM) and Digital Cinema Package (DCP), the latter being the compressed version of the final film sent to movie theaters for public consumption.

 

Sometimes, budget and time allowing, films are finished in a 4K digital intermediate—Black Panther, for example, just to name one recent Hollywood blockbuster. But the vast majority of effects-driven tentpole films still go through the 2K bottleneck during post-production.

 

Which may lead you to ask why they don’t just shoot the movies in 2K to begin with, if they’re going to be downsampled to 2K anyway. It’s a good question. And the answer isn’t a simple one.

 

But, to simplify it as much as possible, shooting in 6.5K or 3.4K or even 2.8K, then downsampling to 2K, will often result in an image that’s crisper, clearer, and more detailed than an image shot natively in 2K resolution. Ironically, you’ll also find some filmmakers who admit to shooting closeups of actors through filters of one form or another because the enhanced clarity of shooting in 6.5K or 3.4K or whatever can be somewhat less than flattering, even once the footage is downsampled to 2K. Nevertheless, there are technical advantages to shooting at such high resolutions, even if you and I will never see the original full-resolution footage.
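For the curious, here’s a rough illustration of that “shoot high, finish low” effect in Python, using made-up noisy footage and a simple 2-to-1 box average (real downsampling filters are far more sophisticated): averaging neighboring pixels on the way down smooths out noise that would be baked into a native low-resolution capture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for noisy high-resolution footage: a flat gray frame plus sensor noise.
high_res = 0.5 + rng.normal(0.0, 0.05, size=(2160, 4096))

# Downsample 2:1 with a plain box average (one output pixel per 2 x 2 block).
low_res = high_res.reshape(1080, 2, 2048, 2).mean(axis=(1, 3))

print(f"noise before downsampling: {high_res.std():.3f}")  # ~0.050
print(f"noise after downsampling:  {low_res.std():.3f}")   # ~0.025
```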

 

Of course, there’s one other obvious question you may be asking: If all of this imagery has been shrunk down to 2K resolution, and all of the special effects have been rendered in 2K, why not just be honest about it and release the film in 2K? Why make the bogus claim that these home video releases are in 4K?

 

The cheeky answer is that we don’t have a 2K home video format. Digital cinema resolutions and home video resolutions simply don’t match up, for historical reasons that I won’t delve into here. The older high-definition home video format, with its 1,920 × 1,080 pixels, is pretty close to 2K, but it still has about six percent fewer pixels.
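The math behind that six-percent figure is simple enough to check yourself; here’s the back-of-the-envelope version in Python:

```python
hd_pixels = 1920 * 1080      # 2,073,600 -- Blu-ray / 1080p home video
dci_2k_pixels = 2048 * 1080  # 2,211,840 -- digital cinema 2K

print(f"{1 - hd_pixels / dci_2k_pixels:.1%} fewer pixels")  # about 6% fewer pixels
```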


The Oscar-winning Spider-Man: Into the Spider-Verse, which many feel is one of the most visually stunning recent films and a reference-quality 4K HDR release, was created solely in the 2K domain and then upsampled to 4K for distribution.

When you get right down to it, though, pixel count is actually one of the least important contributors to perceived image quality, once you get above a certain resolution. High dynamic range (HDR) video and wide color gamut play a much greater role in our perception of the quality of the picture. And HD video formats, such as Blu-ray or 1080p downloads and streams, simply don’t support the larger color gamut and higher dynamic range that modern video displays are capable of.

 

For that, we have to step up to Ultra HD, which is colloquially called “4K” by many in our industry, if only because “Ultra HD” is a mouthful. The thing is, most UHD home video displays have a resolution of 3,840 × 2,160—a little less than the digital cinema standard 4K resolution of 4,096 × 2,160. But still, close enough.

 

And here’s the important thing to consider, if you take nothing else away from this long and rambling screed: If you want to enjoy the best that home video has to offer these days, you’re going to be watching your movies (and TV shows) in Ultra HD on an Ultra HD display. Would it be technically possible for Hollywood to release those movies and shows in something closer to 2K resolution, while also delivering HDR and wide color gamut? Sure. It may be contrary to home video format standards, but nothing about that would violate the laws of physics.

 

But why would they? Your display (or your player, or maybe even your AV receiver or preamp) is going to upsample any incoming video to match the resolution of your screen anyway. One way or another, you’re going to be viewing 3,840 × 2,160 pixels. As such, why wouldn’t you want the studios to use their vastly more sophisticated professional video scalers to upsample the resolution before it’s delivered to you via disc, download, or streaming? Those video processors don’t work in real time, the way the processors built into your player, receiver, or display do. They’re slow, methodical, and do a much better job.
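To make that concrete, here’s a minimal sketch of the basic job in Python, using the Pillow imaging library and a hypothetical 1080p frame grab. A studio’s mastering-grade scaler does the same fundamental thing, just with far more sophisticated (and far slower) processing than the single resampling filter shown here.

```python
from PIL import Image  # Pillow imaging library

# "frame_1080p.png" is a hypothetical 1,920 x 1,080 frame grab.
frame = Image.open("frame_1080p.png")

# Map it onto a 3,840 x 2,160 grid -- the same job your player or display
# performs in real time, here done offline with a Lanczos resampling filter.
uhd_frame = frame.resize((3840, 2160), resample=Image.LANCZOS)
uhd_frame.save("frame_uhd.png")

print(frame.size, "->", uhd_frame.size)  # (1920, 1080) -> (3840, 2160)
```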

 

So even if the movie you’re enjoying this evening technically passed through a 2K-resolution digital intermediate at some point, that doesn’t mean you’re being duped when you’re sold a “4K/UHD” home video release. You’re still enjoying the most important technical advantages of the Ultra HD format—namely the increased dynamic range and color gamut.

 

Mind you, for David Attenborough nature documentaries and other footage that doesn’t require the addition of special effects, I want a genuine Ultra HD video master, with every possible pixel kept intact. But for big Hollywood blockbusters? I honestly think this whole “Fake 4K” discussion has gotten way out of hand.

 

I’ll leave you with one last thought to consider. This summer’s biggest film, Avengers: Endgame, reportedly had a budget of more than $350 million before marketing costs were factored in. Of that $350-ish million, roughly $100 million went to the visuals, including special effects. Had the film been finished in a 4K digital intermediate instead of a 2K one, you can bet that budget would have been significantly higher (remember, the jump from 2K to 4K isn’t a doubling but a quadrupling of pixels, since both the horizontal and vertical resolutions are doubled, and rendering four times as many pixels simply costs a heck of a lot more money and time).
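Here’s that quadrupling spelled out, using the digital cinema resolutions mentioned earlier:

```python
dci_2k = 2048 * 1080  # 2,211,840 pixels
dci_4k = 4096 * 2160  # 8,847,360 pixels

print(dci_4k / dci_2k)  # -> 4.0: four times as many pixels to render
```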

 

Would it have been worth it? Well, consider this: The original John Wick film was shot in 2.8K and finished in a 4K digital intermediate, whereas the latest release in the franchise, John Wick 3, was shot in 3.2K and finished in a 2K digital intermediate. I haven’t seen any of these films, but every review I’ve read seems to indicate that the UHD home video release of the third looks noticeably better than that of the first.

 

If 2K digital intermediates were truly the bane of the home cinephile’s existence, this simply wouldn’t be the case. So, when we mention in reviews that an Ultra HD release came from a 2K digital intermediate, we’re not implying that you’re somehow being cheated out of pixels you thought you were paying for when you bought that big new “4K” display. We’re just video geeks being video geeks and pointing out the most pedantic of details. In the few rare cases where it makes a legitimate difference, we’ll point that out explicitly.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.
