Video

Does a Luxury Cinema Really Need a Projector?


Here’s a pop quiz to start your day with: How big is the TV you see in the image above? If you’re familiar with this specific model (LG’s C9 OLED), the proportions of its pedestal may give you some idea. The rest of you probably think this is an unfair question. You’re looking for other clues that could give it away: How tall are those ceilings? How wide is that wall? More importantly, how far away from the screen was the camera when this photo was taken?

 

That’s actually exactly my point. For the record, the image is of a 77-inch display. But if I had told you it was 55, or 65, or even 88 inches, would you have balked? Probably not, because you intuitively understand that a display’s screen size isn’t the beginning and end of the conversation when it comes to how large it actually appears to your eyes. It’s the relationship between the display size and the distance from seat to screen that determines the degree to which an image fills your field of view.

 

Not to pick on my colleague and friend John Sciacca here, but in his recent piece “Rediscovering My Joy for Home Theater,” he says, “Watching movies on a 115-inch screen is incredibly more involving than a 65-inch one.” What John is leaving unsaid there, though, is, “. . . from the same seating distance.” That last bit, that unspoken relationship between seat and screen, was taken for granted in John’s story, because to him it’s obvious. But that fact often gets tossed out the window completely when the gatekeepers of home cinema attempt to discredit the “lowly” TV as a legitimate screen for a proper home entertainment system.

 

I think this outdated perception of projectors as the only valid screens for home cinema systems is probably rooted in the equally outdated notion that commercial cinemas are the gold standard against which the home movie-watching experience should be judged. As I’ve argued in the past, that ship has sailed. 

These days, a few rare and special exceptions aside, commercial cinemas are simply a way for most people to check out the latest Avengers or Star Wars flick before someone else ruins the plot for them. Or maybe they just want to view those big event movies with a few more subwoofers than their home AV systems can accommodate. But I guarantee you that almost none of the people who opt to go to their local movie theater to see the latest blockbusters would tell you that the allure of seeing an image bounced off a big sheet of perforated vinyl was what drew them out of the comforts of their own homes.

 

And mind you, I’m not claiming there aren’t plenty of valid reasons to install a projector at home. In his own media room, John sits roughly 12 feet from his screen, by his own estimation. He also has two kids at home, so movie-watching is often a whole-family experience. For his needs and his lifestyle, yeah, a projector is absolutely the right screen.

 

I, on the other hand, only have to worry about my wife and me. The only other permanent resident is Bruno, our 75-pound pit bull, and more often than not he either leaves the room when we watch movies or curls up in my lap and goes to sleep. We also only sit about six and a half feet from the screen in the main media room. The smallest high-performance home cinema projection screen I’m aware of is an 80-incher that would frankly be too much at that seating distance. A 75-inch display is pretty much perfect for this room, as it takes up a healthy 45.5 degrees of our field of view—a little more than THX’s recommended 36 degrees, but so be it. We’d rather have a bit too much screen than a bit too little. But we don’t want The Last Jedi turning into a tennis match, either.

 

Interestingly enough, John’s 115-inch projection screen, when viewed from 12 feet away, takes up roughly 38.5 degrees of his field of view. In other words, my 75-inch screen looks bigger to me and my wife than his 115-inch projection screen looks to him and his family.
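If you’d like to check the math behind those figures yourself, the trigonometry is simple. The sketch below is my own illustration (the function name and the assumption of a flat 16:9 screen are mine); the exact result for a scope-format projection screen will shift slightly depending on the aspect ratio you assume.

```python
import math

def horizontal_viewing_angle(diagonal_in, distance_in, aspect_ratio=16/9):
    """Approximate horizontal field of view (in degrees) a flat screen fills.

    diagonal_in  -- screen diagonal in inches
    distance_in  -- seat-to-screen distance in inches
    aspect_ratio -- screen width divided by height (16:9 by default)
    """
    # Screen width, derived from the diagonal and the aspect ratio.
    width = diagonal_in * aspect_ratio / math.sqrt(aspect_ratio ** 2 + 1)
    # Half the width and the viewing distance form a right triangle.
    return math.degrees(2 * math.atan((width / 2) / distance_in))

# A 75-inch 16:9 TV viewed from six and a half feet: roughly 45.5 degrees.
print(round(horizontal_viewing_angle(75, 6.5 * 12), 1))
# A 115-inch screen from 12 feet, treated here as 16:9: roughly 38 degrees.
print(round(horizontal_viewing_angle(115, 12 * 12), 1))
```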

 

Am I bashing John’s choice of screens? Of course not. What works for him works for him, and what works for me works for me. And I’m sure he would agree. Different rooms. Different families. Different viewing habits. Different solutions. Without a doubt, we’re both enjoying a better movie-watching experience than we would at the local cineplex, and his system gives him one big advantage over mine: He gets to watch ultra-widescreen 2.4:1 aspect-ratio films without any letterboxing.

How to Determine Your Viewing Distance

If you want to figure out your screen size based on viewing distance, or vice versa, but without having to wade through technical specs or do any heavy math, click this link.

 

In addition to the larger perceptual screen real estate, though, my TV also gives me better black levels, better dynamic range, better peak brightness, and better color uniformity than any two-piece projection system could. And if for whatever reason we ever decided to watch a movie with the lights on, we wouldn’t have to worry about the screen washing out. (Not that we would, mind you. My wife and I prefer to keep any and all distractions to a minimum when watching movies, going so far as to put our mobile phones away or turn them off entirely. I’m just saying that we could leave a light on if we wanted to.)

 

And yet, the naysayers and gatekeepers would have you believe that for whatever reason my viewing experience is subpar. That I would somehow be better served by lackluster black levels, middling contrast, lower peak brightness, and worse screen uniformity, simply because that would be a more faithful facsimile of the local cineplex.

 

To which I say this: The New Vision Theatres Chantilly 13 across town isn’t the yardstick by which I judge my movie-watching experience at home anymore. My home cinema system looks better and sounds better, and quite frankly has a better selection of films from which to choose. Granted, if we had a much larger room, or typically invited large groups of friends over to watch movies, a projection screen would likely be a superior alternative to our 75-inch TV on balance. If we had two or three rows of seating? No question about it—we would need a projector.

 

The beauty of current AV gear, though, is that you don’t have to change your lifestyle or viewing habits to have a better-than-movie-theater experience at home. You can assemble a reference-quality home cinema that conforms to your lifestyle, not the other way around. And if, like me, that means employing a gigantic TV as your screen of choice, you shouldn’t pay much attention to anyone telling you you’re doing it wrong, or that your system doesn’t count as “luxury.” Chances are, they’re trying to sell you something.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

“Apollo 11” Goes 4K

"Apollo 11" Goes 4K

If you’ve read my review of the original HD release of Todd Douglas Miller’s documentary film Apollo 11 from earlier this year, you may recall that it was a bit more of a rant than a proper critique. Not about the film, mind you. Apollo 11 still stands as one of the year’s best cinematic efforts, especially in the more straightforward, less editorial approach it takes in capturing this one monumental moment in history.

 

The rant was instead about the film’s home video release, which was originally HD only, with no mention of a UHD/HDR followup. As I said in that original review, this was doubly troubling because Apollo 11 is among a small handful of films released recently to actually be sourced from a 4K digital intermediate. In fact, its original film elements were scanned at resolutions between 8K and 16K. Given that most modern films, especially Hollywood tentpoles, are finished in 2K digital intermediates and upsampled to 4K for cinematic and home video release, the lack of a UHD option for Apollo 11 was as infuriating as it was puzzling.

 

Thankfully, that mistake has been rectified. Apollo 11 is now available in UHD with HDR on most major video platforms, including disc and Kaleidescape, with the latter being my viewing platform of choice. I know I mentioned purchasing the film in HD via Vudu in my original review, but that purchase doesn’t offer any sort of upgrade path for UHD, the way Kaleidescape does.

 

At any rate, I did a lot of speculation in that first review about the sort of differences I thought UHD would make for this title. And having now viewed it, I can say most of those predictions turned out to be true. UHD does, indeed, reveal a lot of detail that was obscured in the HD release. That makes sense given that the source of so much of this film’s visuals existed in the form of 65mm/70mm archival footage.

 

One of the biggest differences you see when comparing the HD and UHD releases is in the textures of the Saturn V rocket. Ribbing in the first three stages of the rocket that dwindles to nothing in HD is clear and distinct in UHD. The little flag on the side of the rocket is also noticeably crisper, and the stars in its blue field stand out more as individual points of whiteness, rather than fuzzy variations in the value scale.

 

As predicted, the launch of Apollo 11 also massively benefits from HDR grading. The plume of exhaust that billows forth from the rocket shines with such stunning brightness that you almost—almost—want to squint.

 

One thing I didn’t predict, though—which ends up being my favorite aspect of this new HDR grade—is how much warmer and more lifelike the imagery is. In the standard dynamic range color grade of the HD version of the film, there’s an undeniably cooler, bluer cast to the colors that never really bothered me until I saw the warmer HDR version. Indeed, the HDR grade evokes the comforting warmth of the old Kodak stock on which the film was captured in a way the SDR grade simply doesn’t.

 

It’s true that the new UHD presentation does make the grain more pronounced in the middle passage of the film—where 65mm film stock gives way to 35mm and even 16mm footage. That honestly has more to do with the enhanced contrast of this presentation than it does the extra resolution. HD is quite sufficient to capture all the nuances and detail of this lower-quality film. But the boost in contrast does mean that grain pops a little more starkly.

 

This does nothing to detract from the quality of the presentation, though, at least not for me. And even if you do find this lush and organic grain somewhat distracting, I think you’ll agree it’s a small price to pay for the significantly crisper, more detailed, more faithful presentation of the first and third acts.

 

If you haven’t picked up Apollo 11 yet, congratulations—you get to enjoy your first viewing as it should have been presented to begin with. If you already bought the film in HD, I can’t recommend the upgrade to UHD highly enough. Thankfully, for Kaleidescape owners, that upgrade doesn’t mean purchasing the film all over again.

 

It is a shame Universal, the film’s home video distributor, has for whatever reason decided to hold back bonus features. The featurette included with the UHD Blu-ray release, which covers the discovery of the 65mm archival footage, is missing here—although it’s widely available on YouTube at this point (and is embedded above). And only Apple TV owners get access to an exclusive audio commentary. Then again, given how badly the studio fumbled the original home video release, it’s no real surprise that they’ve dropped the ball on making the bonus features widely available.

 

Don’t let that turn you off of the film, though. This is one that belongs in every movie collection, especially now that it’s available in UHD.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

4K is for Fanboys


I feel as if I might have a reputation around these parts, a heel of sorts. Why a heel and not a hero? Because I find that my opinions are often in opposition to those of my contemporaries. Not because they are wrong, but just because I think their focus is continually on things, topics, and ideas that play to a base that, well, is dying.

 

Dennis Burger wrote a terrific piece on why 4K isn’t always 4K. It is a truly good piece of writing and one that gets 99 percent of the argument absolutely correct. As someone who has literally filmed a feature-length film for a motion-picture studio in true 4K only to have it shown in theaters in 2K, I can attest to the article’s validity. But Dennis, like me some years ago, missed the boat by even framing the argument around resolution at all.

 

You see, I thought people/viewers cared about things like resolution. Back in 2008, when I filmed my movie, the original RED ONE cinema camera had just come out, and as a result the “whole world” was clamoring for 4K—or so it seemed. I had the choice of whether or not to film in 4K via the RED ONE or go with a more known entity by filming in 2K via cameras from Sony’s CineAlta line. Ultimately I chose option C, and went with a true dark horse contender in Dalsa, who, up to that point, had no cinema pedigree—unless you count being the ones who designed the sensor tech for the Mars Rover a cinematic endeavor. But I digress.

 

We didn’t use the RED ONE because it was buggier than a roadside motel mattress, and I didn’t choose to side with Sony because they were HD, and HD was yesterday’s news. Filming in 4K via the Dalsa back in 2008 was an absolute pain in the ass. (That’s me with the Dalsa Origin II in the photo at the right.) Spoiler alert: Not much has changed in 2019, as 4K continues to be a bit of a pain; it’s just more accessible, which makes everyone think that they need it—more on that in a moment.

 

What is upsetting is that I do know the monetary difference my need to satiate the consumer-electronics fanboys cost me and my film—a quarter of a million dollars. While $250,000 isn’t much in Hollywood terms, it represented over a quarter of my film’s total budget. The cost of filming in HD, you ask? Less than $30,000. Oh, and post production would’ve taken half the time—thus lowering costs further. All of that headache, backache, and money only to have the film bow in 2K via 4K digital projectors from—wait for it—Sony!

 

Now, I will sort of agree with the assertion that capturing visuals at a higher resolution or quality and downscaling to a lesser format—say HD—will result in a clearer or better picture—but honestly, only if you preface what you’re watching as such ahead of time. Which brings me to my point: All of this HD vs. 4K talk is for fanboys who insist on watching pixels and specs rather than watching the damn movie. Not one person, or journalist (apart from me), wrote about my film from the context of being the first feature-length film ever filmed entirely in 4K. They didn’t ask about it, nor care, because it doesn’t matter.

 

It never mattered.

 

What digital has done is remove the magic from cinema and replace it with a bunch of numbers that bored middle-aged dudes (yes, dudes) can masturbate over in an attempt to differentiate their lot from the rest. None of it has any bearing on the story, enjoyment, or skill. It’s an arms race, one we all fall prey to, and one we continually perpetuate, because, well, it sells. We’ve gotten away from cinema theory, history, and storytelling in recent years and instead become infatuated with bit-rates, color spaces, and codecs. And yet, in the same breath, so many of us bitch about why there are no good films being made anymore. It’s because the only thing audiences will pay for is what they think is going to look great on their brand new UltraHD TV.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial and branding projects all over the US. He has served as a managing editor and freelance journalist in the AV space for nearly 20 years, writing technical articles, product reviews, and guest speaking on behalf of several notable brands at functions around the world.

Choosing My New Projector


Following up on my last post, “It’s Time to Update My Theater,” I’m going to delve into the thought process that caused me to splurge and finally upgrade my projector.

 

As I mentioned, my existing projector was about 11 years old, and, while it still produced watchable pictures from Blu-ray and DVD discs, it wasn’t compatible with many of the new 4K HDR sources in my system, so we had just stopped using it. I was toying around with ditching both the projector and my current 65-inch Sony flat panel and upgrading to a new 85-inch flat panel.

 

Why 85 inches? Well, that is about the current size limit before you start getting into ridiculously expensive pricing. For under $4,500, you can get a Sony XBR-85X950G flat-panel that has been universally reviewed as a fantastic display. This would provide a large screen image for viewing all the time, not just at night with the lights down. It would also handle HDR signals (and Dolby Vision) far better than a projector at any price could.

 

As this was a significantly cheaper upgrade option, I really considered it, but ultimately decided I would miss the truly large-screen experience of my 115-inch, 2.35:1 screen.

 

We use the projector almost exclusively for movie watching, and having nearly double the screen real estate makes a massive difference, and is far more engaging than a direct-view set, even one at 85 inches. (Now, had the 98-inch Sony Z-series TV been a tenth of its price—selling for $7,000 instead of $70,000—that probably would have been my pick.)

 

So, having made the decision to stick with front projection, I had to settle on a model. I had a few criteria going in that helped narrow the search.

 

First, I wanted it to be true, native 4K resolution on the imager, not using any pixel shifting or “wobulation” to “achieve 4K resolution on screen.” This ruled out many of the pixel-shifting models from companies like Epson and Optoma. Nothing against them; I just wanted native 4K.

 

Second, it had to have a throw distance that worked with my current mounting location. Actually, this isn’t much of a concern anymore, and most modern projectors have an incredibly generous adjustment range on their lens.

 

Third, I needed a model that offered lens memory so it would work with my multi-aspect screen (92 inches when masked down to 16:9, and 115 inches when opened to full 2.35:1). This allows the projector to zoom, shift, and focus for a variety of screen sizes at the push of a single button, and is crucial for multi-aspect viewing.
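As a quick sanity check on those two figures (offered purely as an illustration; the screen’s spec sheet is the real authority), a constant-image-height screen that measures 115 inches diagonally at 2.35:1 should mask down to roughly a 92-inch 16:9 image, and the numbers bear that out:

```python
import math

def diagonal_at_aspect(source_diag, source_ar, target_ar):
    """Diagonal of a constant-height image when the aspect ratio changes.

    source_diag -- diagonal of the full screen, in inches
    source_ar   -- width/height of the full screen (e.g. 2.35)
    target_ar   -- width/height of the masked-down image (e.g. 16/9)
    """
    # The image height stays fixed; only the width changes with the masking.
    height = source_diag / math.sqrt(source_ar ** 2 + 1)
    width = height * target_ar
    return math.hypot(width, height)

# A 115-inch 2.35:1 screen masked down to 16:9: roughly 92 inches diagonal.
print(round(diagonal_at_aspect(115, 2.35, 16/9), 1))  # ~91.9
```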

 

Fourth, it needed to integrate with my Control4 automation system. Sure, I could cobble together a driver, but it would never offer integration as tight as one that was meant to work with that particular model.

 

Finally, it had to fit my $10,000 budget. Unfortunately, this ruled out brands like Barco and DPI. I was super impressed with Barco’s Bragi projector, but, alas, it doesn’t fit in my tax bracket.

 

Basically, with these criteria, my search was narrowed to two companies: JVC and Sony. And primarily to two projectors: The JVC DLA-NX7 (shown at the top of the page) and the Sony VPL-VW695ES. (Were my budget higher, I would have added the JVC DLA-NX9 to that list, which has the primary advantage of a much higher quality, all-glass lens, but it was more than double the price. And while the less expensive JVC DLA-NX5 also met all my criteria, the step-up NX7 offers more bang for just a little more buck.)

 

So, I did what a lot of people do prior to making a big technology purchase: Research. I read a ton of forum posts, read all of the reviews on both models, and watched video comparisons. I also reached out to a couple of professional reviewers and calibrators who had actually had hands-on time with both models.

 

The CEDIA Expo is a place where manufacturers often launch new projectors, so this past month’s show coincided perfectly with my hunt. Since both companies had models that had been launched at CEDIA 2018, I was eager to see what announcements they might have regarding replacements or upgrades. Alas, there were no model changes, which, in a way, can be a good thing, since it means both models are now proven, have had any early bugs worked out with firmware updates, and  are readily available and shipping.

 

I really hoped to check out both projectors at the show, but, unfortunately, no one was exhibiting either. (Apparently, CEDIA is not the place to show your sub-$10,000 models.)

 

Ultimately, two announcements at the show swayed me to pull the trigger on the JVC. First, the product manager I spoke with said the price was going up by $1,000 on October 1, so buying sooner rather than later would actually save me money. But more importantly, JVC introduced new firmware at CEDIA that adds a Frame Adapt HDR function, which dynamically analyzes HDR10 picture levels frame by frame, automatically adjusting the brightness and color to optimize HDR performance for each frame.

 

Projectors historically have a difficult time handling HDR signals, and this firmware is designed to produce the best HDR images from every frame. This used to be achieved by using a high-end outboard video processor such as a Lumagen Radiance Pro, but that would add thousands of dollars to the system. When I saw this new technology demonstrated in JVC’s booth, I was all in.

 

In my next post, I’ll let you know if the purchase was worth it. (Spoiler: It totally was!)

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing for such publications as Residential Systems and Sound & Vision. Follow him on Twitter at @SciaccaTweets and at johnsciacca.com.

4K is Sometimes Actually 2K–But That’s OK


From time to time in our reviews of 4K/HDR home video releases, you may have stumbled across a phrase that seems downright perplexing: “Taken from a 2K digital intermediate.” It stands to reason, after all, that a video file that has spent some portion of its life at 2K resolution can’t really be considered 4K. Or can it?

 

This can be doubly confusing when the sentence before or after makes note of the film being shot “on ARRIRAW at 6.5K resolution” or something to that effect. That’s a whole lot of different Ks for a film that’s ostensibly being released in 4K (or, more accurately “Ultra HD”) for home video. So, what exactly does all of this mean? And should you really care?

 

To get to the bottom of these questions, we need to back up and discuss how movies are shot, produced, and distributed. To keep the discussion as simple as possible, we’ll ignore films that are still captured on actual film stock and just focus on digital cinema, since that’s the way most movies (and TV shows) are shot.

 

Depending on the model of camera used, as well as other technical considerations, the resolution captured by these cameras generally ranges between 2K (2,048 x 858 or 2,048 x 1,152) and 6.5K (6,560 x 3,102), with a few other resolutions in between—like 2.8K (2,880 x 1,620) and 3.4K (3,424 x 2,202)—also commonly used. The “K” is short for “thousand,” and the abbreviation is simply a rough approximation of the horizontal resolution of the resulting file.

 

At any rate, no matter what resolution a film is shot in, the footage has to be reformatted to standard digital cinema projector resolutions, either 2K (2,048 × 1,080) or 4K (4,096 × 2,160), before being distributed to commercial movie theaters. But a lot more than that happens to most films before they’re released. They have to be edited and color timed, and with most blockbusters, special effects have to be rendered and composited into the footage that was shot on-set.

 

This work is time-consuming and expensive, and the higher the resolution at which the work is done, the costlier and more time-consuming it is. As such, due to budget constraints, release schedules, or in some cases simply preference, this work is usually done at 2K (2,048 × 1,080) resolution, the result of which is what we refer to as a 2K digital intermediate. This is the last step in the post-production process for most films, before their conversion to Digital Cinema Distribution Master (DCDM) and Digital Cinema Package (DCP), the latter being the compressed version of the final film sent to movie theaters for public consumption.

 

Sometimes, budget and time allowing, films are finished in a 4K digital intermediate—Black Panther, to name one recent Hollywood blockbuster. But by and large, the vast majority of effects-driven tentpole films go through the 2K bottleneck during postproduction.

 

Which may lead you to ask why they don’t just shoot the movies in 2K to begin with, if they’re going to be downsampled to 2K anyway. It’s a good question. And the answer isn’t a simple one.

 

But, to simplify it as much as possible, shooting in 6.5K or 3.4K or even 2.8K, then downsampling to 2K, will often result in an image that’s crisper, clearer, and more detailed than an image shot natively in 2K resolution. Ironically, you’ll also find some filmmakers who admit to shooting closeups of actors through filters of one form or another because the enhanced clarity of shooting in 6.5K or 3.4K or whatever can be somewhat less than flattering, even once the footage is downsampled to 2K. Nevertheless, there are technical advantages to shooting at such high resolutions, even if you and I will never see the original full-resolution footage.

 

Of course, there’s one other obvious question you may be asking: If all of this imagery has been shrunk down to 2K resolution, and all of the special effects have been rendered in 2K, why not just be honest about it and release the film in 2K? Why make the bogus claim that these home video releases are in 4K?

 

The cheeky answer is that we don’t have a 2K home video format. Digital cinema resolutions and home video resolutions simply don’t match up, for historical reasons that I won’t delve into here. The older high-definition home video format, with its 1,920 x 1,080 pixels, is pretty close to 2K, but that still works out to about six percent fewer pixels.
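If you’d like to see where that six percent figure comes from, along with how the other resolutions discussed here stack up, the arithmetic takes only a few lines. The resolutions are the ones quoted in this article; the script itself is just my own back-of-the-envelope check.

```python
# Pixel counts for the resolutions quoted in this article.
resolutions = {
    "HD (1080p)": (1920, 1080),
    "2K DCI": (2048, 1080),
    "UHD": (3840, 2160),
    "4K DCI": (4096, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name:12s} {count:>10,} pixels")

# HD has roughly six percent fewer pixels than 2K DCI...
shortfall = 1 - pixels["HD (1080p)"] / pixels["2K DCI"]
print(f"HD vs. 2K shortfall: {shortfall:.2%}")

# ...and the jump from 2K to 4K is a quadrupling, not a doubling.
print(f"4K DCI vs. 2K DCI: {pixels['4K DCI'] / pixels['2K DCI']:.0f}x the pixels")
```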


The Oscar-winning Spider-Man: Into the Spider-Verse, which many feel is one of the most visually stunning recent films and a reference-quality 4K HDR release, was created solely in the 2K domain and then upsampled to 4K for distribution.

When you get right down to it, though, pixel count is actually one of the least important contributors to perceived image quality, once you get above a certain resolution. High dynamic range (HDR) video and wide color gamut actually play a much greater role in our perception of the quality of the picture. And HD video formats, such as Blu-ray or 1080p downloads and streams, simply don’t support the larger color gamut and higher dynamic range that modern video displays support.

 

For that, we have to step up to Ultra HD, which is colloquially called “4K” by many in our industry, if only because “Ultra HD” is a mouthful. The thing is, most UHD home video displays have a resolution of 3,840 x 2,160—a little less than the digital cinema standard 4K resolution of 4,096 × 2,160. But still, close enough.

 

And here’s the important thing to consider, if you take nothing else away from this long and rambling screed: If you want to enjoy the best that home video has to offer these days, you’re going to be watching your movies (and TV shows) in Ultra HD on an Ultra HD display. Would it be technically possible for Hollywood to release those movies and shows in something closer to 2K resolution, while also delivering HDR and wide color gamut? Sure. It may be contrary to home video format standards, but nothing about that would violate the laws of physics.

 

But why would they? Your display (or your player, or maybe even your AV receiver or preamp) is going to upsample any incoming video to match the resolution of your screen anyway. One way or another, you’re going to be viewing 3,840 x 2,160 pixels. As such, why wouldn’t you want the studios to use their vastly more sophisticated professional video scalers to upsample the resolution before it’s delivered to you via disc, download, or streaming? Those video processors don’t work in real-time, the way the processors built into your player, receiver, or display do. They’re slow, methodical, and do a much better job.

 

So even if the movie you’re enjoying this evening technically passed through a 2K-resolution digital intermediate at some point, that doesn’t mean you’re being duped when you’re sold a “4K/UHD” home video release. You’re still enjoying the most important technical advantages of the Ultra HD format—namely the increased dynamic range and color gamut.

 

Mind you, for David Attenborough nature documentaries and other footage that doesn’t require the addition of special effects, I want a genuine Ultra HD video master, with every possible pixel kept intact. But for big Hollywood blockbusters? I honestly think this whole “Fake 4K” discussion has gotten way out of hand.

 

I’ll leave you with one last thought to consider. This summer’s biggest film, Avengers: Endgame, reportedly had a budget of more than $350 million before marketing costs were factored in. Of that $350-ish million, roughly $100 million went to the visuals, including special effects. Had the film been finished in a 4K digital intermediate instead of a 2K one, you can bet that budget would have been significantly higher. (Remember, the jump from 2K to 4K isn’t a doubling but a quadrupling of pixels, since both the horizontal and vertical resolutions are doubled, and rendering four times as many pixels simply costs a heck of a lot more money and time.)

 

Would it have been worth it? Well, consider this: The original John Wick film was shot in 2.8K and finished in a 4K digital intermediate, whereas the latest release in the franchise, John Wick 3, was shot in 3.2K and finished in a 2K digital intermediate. I haven’t seen any of these films, but every review I’ve read seems to indicate that the UHD home video release of the third looks noticeably better than the first.

 

If 2K digital intermediates were truly the bane of the home cinephile’s existence, this simply wouldn’t be the case. So, when we mention in reviews that an Ultra HD release came from a 2K digital intermediate, we’re not implying that you’re somehow being cheated out of pixels you thought you were paying for when you bought that big new “4K” display. We’re just video geeks being video geeks and pointing out the most pedantic of details. In the few rare cases where it makes a legitimate difference, we’ll point that out explicitly.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

The 4 Hottest Trends in Luxury Video


Samsung’s The Wall Luxury microLED TV

I got a chance to get a bead on the latest trends in luxury video this past week at CEDIA Expo, the annual custom-integration trade show, in Denver. It was great to see that some of the most intriguing products announced at January’s Consumer Electronics Show (CES) are finally becoming real.

 

microLED DISPLAYS

Perhaps the most exciting technology on display was large-screen microLED, with video panels that can come in sizes up to 65 feet diagonal. The images on these screens are incredibly bright, have no loss of black level due to ambient lighting, offer incredible contrast, support a wider color gamut, offer superior off-angle viewing, and handle HDR signals far better than front-projection systems.

 

MicroLED systems use small LED tiles, usually little larger than a brick, that snap into a larger matrix to form the full panel. You can later add more tiles to form an even larger screen, and tiles can be replaced as needed. (Most displays ship with extra tiles that have been matched to ensure color uniformity in the picture and facilitate in-field replacement.)

 

MicroLED panels increase resolution by decreasing the size of the pixel structure, or pitch. Reducing the distance between pixels—as measured from center-of-pixel to center-of-pixel—makes individual pixels invisible at typical seating distances. Many companies offer panels with pixel pitch of less than 8mm.
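To get a feel for how pitch, resolution, and physical size relate, consider the Sony panel described a bit further down: a wall roughly 16 feet wide carrying a 4K image implies a pitch of about 1.3mm. The quick calculation below is my own illustration of that relationship (it assumes the quoted 4K means 3,840 pixels across), not a manufacturer spec.

```python
MM_PER_FOOT = 304.8

def pixel_pitch_mm(panel_width_ft, horizontal_pixels):
    """Center-to-center pixel spacing, in millimeters."""
    return panel_width_ft * MM_PER_FOOT / horizontal_pixels

# The 16 x 9-foot, 4K Sony panel mentioned below, assuming 3,840 pixels across.
print(round(pixel_pitch_mm(16, 3840), 2))  # ~1.27mm pitch
```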

 

The downside? This technology is massively expensive. How massive? Samsung’s 146-inch diagonal The Wall Luxury (shown above) will retail for $400,000. Need bigger? Sony has you covered with its Crystal LED Display System—previously given the awkward nickname CLEDIS—with a 16 x 9-foot panel (219-inch diagonal) that offers full 4K resolution and 1-million:1 contrast, supports high frame rates up to 120 fps, and sells for $877,000. Other manufacturers I spoke with—such as Planar, Barco, and Digital Projection—all offer panels of varying sizes with similar pricing.

 

For the luxury market, this is truly the ultimate solution; but it looks likely microLED will never reach mainstream pricing.

 

 

LARGE-SCREEN PROJECTION

If you want a screen larger than 90 inches for your luxury theater or media room but don’t want to pay the exorbitant prices commanded by microLED displays, front-projection systems remain the best way to go. Due to limitations in light output, projectors often struggle with HDR signals, which are typically mastered for LED displays capable of producing far brighter images. Improving HDR handling is something projector companies continue working on, and both Sony and JVC rolled out new firmware specifically to address how their projectors process HDR images.

 

JVC’s new Frame Adapt HDR analyzes the peak brightness of each frame using a proprietary algorithm and adjusts dynamic range to provide the best possible HDR image. Frame Adapt HDR works with any HDR10 content, meaning all HDR sources—Kaleidescape Strato, Ultra HD Blu-ray players, Apple TV 4K, Xbox One, etc.—can be enjoyed with greater dynamic range and image quality.
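JVC’s actual algorithm is proprietary, so the following is nothing more than a toy sketch of the general idea behind frame-adaptive tone mapping: measure each frame’s effective peak, then shape a tone curve so that highlights land within the projector’s much lower brightness ceiling. Every function name and number here is a placeholder of my own, not JVC’s implementation.

```python
import numpy as np

def tone_map_frame(frame_nits, display_peak_nits=150.0):
    """Toy frame-adaptive tone mapping. Illustration only; not JVC's algorithm.

    frame_nits        -- 2D array of pixel luminance for one frame, in nits
    display_peak_nits -- assumed peak light output of the projector, in nits
    """
    # Analyze the frame: use a high percentile as its effective peak so a few
    # specular pixels don't dictate the curve for the entire image.
    frame_peak = np.percentile(frame_nits, 99.9)
    # Shape a simple Reinhard-style curve scaled to this frame's peak, so dim
    # frames keep more of their original brightness than bright ones.
    normalized = frame_nits / frame_peak
    mapped = normalized / (1.0 + normalized) * 2.0 * display_peak_nits
    # Keep everything within the display's assumed output range.
    return np.clip(mapped, 0.0, display_peak_nits)

# A dim frame and a very bright frame get different curves automatically.
dim_frame = np.random.uniform(0, 100, (1080, 1920))
bright_frame = np.random.uniform(0, 4000, (1080, 1920))
print(tone_map_frame(dim_frame).max(), tone_map_frame(bright_frame).max())
```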

 

Barco displayed a very cool projection solution by using a mirror system and its projection-warping technology to place the projector way off center and hidden out of the way—actually turned sideways in a soffit and firing from the back corner of the room—while still offering a fantastic large-screen image.

 

 

ULTRA-SHORT-THROW PROJECTION

Ultra-short-throw projectors can sit very close to the screen wall—often just inches away—tucked low and out of sight, and can even be completely concealed in cabinetry. Paired with ambient-light-rejecting screens, these projectors produce bright and contrasty images in a typically lit room, meaning they can serve as a TV replacement, giving you 100 to 120 inches of screen that can be enjoyed all the time.

 

Short-throws have typically been priced for the upper end of the market. But at least four companies—LG, Epson, Optoma, and Hisense—now offer 4K laser projectors, usually paired with an appropriate screen and featuring a basic audio system, for under $6,000. This makes them a more attractive option for secondary rooms, like a den or bedroom. 

 

 

8K VIDEO

It seems silly to be talking about 8K video when we aren’t even at a point where broadcast TV—either off-the-air, cable, or satellite—can regularly deliver 4K images, but progress never stops in technology land. Sony, LG, and Samsung all demonstrated 8K displays at the show.

 

Beyond the added pixels—of which there are over 33 million in a 7,680 x 4,320 array—these sets also feature flagship video processing and higher brightness. And it’s these other features that have far more impact on the image than all the extra pixels.

 

Of the sets on display, one of the most impressive was LG’s new 88-inch 8K OLED, which delivered truly lifelike images, with amazing color detail and the ultra-deep black levels for which OLED is known. I’m sure they were feeding the set true 8K images, as they had stunning clarity and depth. At $30,000, this set is a true luxury, but for the viewer who wants the best of the best, this 8K OLED panel won’t fail to impress.

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing for such publications as Residential Systems and Sound & Vision. Follow him on Twitter at @SciaccaTweets and at johnsciacca.com.

Why Filmmaker Mode Matters

This week, at an event in Los Angeles, movie director Rian Johnson (Brick, Looper, The Last Jedi) introduced a new feature called Filmmaker Mode, which will appear on select TVs beginning in 2020. This might sound strikingly similar to picture modes you already have on your TV, which go by names like “Cinema” or “Movie.” So what makes this different? “If you like movies,” Johnson said, “then Filmmaker Mode will make movies not look like poo poo.” Those are awfully big words. But, as it turns out, this new mode is actually a very simple enhancement.

 

Every TV already comes with all kinds of modes that have an impact—sometimes negative—on the picture. You might remember that at the end of 2018, Tom Cruise took to Twitter to post a video about the evils of motion smoothing, sometimes referred to as “The Soap Opera Effect.” This technology, which is also known as “motion interpolation” or “motion-compensated frame interpolation,” has been around for years, although it’s usually labeled with some slick marketing term on your TV such as “Auto Motion Plus,” “Clear Motion Rate,” “Action Smoothing,” “Smooth Motion Effect,” “MotionFlow,” “ClearScan,” or “TruMotion.” All of these terms really refer to the same thing: The process of artificially creating frames of video and inserting them in between existing frames in your favorite movies or TV shows in order to reduce motion blur.

 

Reviewers, directors, cinematographers, editors, and cinephiles alike all urge the viewing public to turn off motion smoothing—which is often on by default—and other extra processing layered on by display manufacturers, and to instead set their displays to a basic set of standards meant to reproduce a movie as accurately as possible. But telling people how to defeat the various modes can be difficult and confusing given the kind of inconsistent jargon described above. Even as someone who reviews TVs, I would have to look up “ClearScan” and what it does to know whether I want it on or off. It sounds more like a TSA screening machine than a kind of picture processing.

 

Creatives and enthusiasts have been pushing to keep extra processing out of watching movies at home for as long as there’s been extra processing. But Filmmaker Mode is different, because all of the various forces—the movie creators, the studios, and the display manufacturers—are pushing together.

 

Simply put, Filmmaker Mode preserves the aspect ratio, frame rate, and color of the movie or TV show you’re watching so they match what was seen on the reference monitors used for post production as closely as possible. To do this, it sets the correct color temperature on your display, turns off motion smoothing and other processing like sharpness and noise reduction, and makes sure the image isn’t stretched out.

 

It’s not yet clear how this will be implemented for the user, but, based on what was said at the UHD Alliance event, it will likely be either a dedicated button on the display’s remote or—and this would be ideal—included in the metadata of a disc, stream, or download, so the display would turn on Filmmaker Mode (in other words, turn off all the extra junk) automatically.

 

Filmmaker Mode has been endorsed by Warner Bros., NBCUniversal, Amazon Prime, Vizio, Panasonic, LG, and dozens of household-name movie directors, including Martin Scorsese and Christopher Nolan. Vizio has announced that its 2020 line of smart TVs will include the new mode, and there are rumors that manufacturers are looking into adding it to existing displays through firmware updates. If you’re wondering how that’s possible, most modern UHD/HDR TVs can already do all of the things Filmmaker Mode does—but only if you’re willing to dig through all the menus and dial in a dozen or more settings.

 

The biggest issue facing Filmmaker Mode won’t be getting manufacturers to include it with their products. Similar modes already exist, such as the Netflix Calibrated Mode on Sony displays. The challenge will be educating the public about why they should care enough to push this button (which is why the idea of it being included in metadata is so enticing to me). Or maybe Maverick can post some more videos on Twitter about the Filmmaker Mode button to let people know it’s there.

John Higgins

John Higgins lives a life surrounded by audio. When he’s not writing for Cineluxe, IGN, or Wirecutter, he’s a professional musician and sound editor for TV/film. During his down time, he’s watching Star Wars or learning from his toddler son, Neil.

How Kaleidescape Makes Movies Look Amazing


Like most of you, I’ve never put a tremendous amount of thought into the work involved in bringing a film from movie theaters to the home. Sure, I know the video needs to be compressed—more so for streaming-video services than for discs or high-bandwidth downloads, the likes of which you’d buy from the Kaleidescape store. But beyond that basic understanding, the process was a bit of a mystery to me.

 

Never one to let an interesting mystery go unsolved, I sat down with Kaleidescape’s Luke O’Brien, Director of Content Operations, and Mike Kobb, Principal Engineer, User Experience, to pick their brains about the process. I discovered that, in many ways, it’s a far more complicated undertaking than I could have imagined—mainly because there isn’t really a consistent pipeline from big screen to home screens. Much of that could probably be attributed to the fact that the home video market is ever-evolving, and that what Kaleidescape is doing—delivering high-bandwidth, pixel-perfect presentations of movies, TV shows, and documentaries—is unique in this era of highly compressed streaming.

Luke O’Brien and Mike Kobb

 

In short, the files Kaleidescape receives from the various studios vary quite a bit. But they all fall under the umbrella of “mezzanine files”—and if you’ve never heard that term before, you’re probably not alone. To put it simply, mezzanine files are lightly compressed video files that are usually indistinguishable from fully uncompressed video. And by “lightly compressed,” I mean that your average movie might arrive in a file that’s ten times the size of a normal UHD Blu-ray disc.

 

So, how does Kaleidescape shrink that amount of data to a file small enough to be downloaded to your hard drive, but not so small that it compromises the viewing experience? How do they ensure that the image you see on your screen looks just as good as—if not better than—the master files delivered by the movie studios? That was my first question.

—Dennis Burger

 

 

Mike Kobb  I think one of the things that is a huge asset to Kaleidescape is the human element that goes into preparing this content. This is done by people who take a lot of pride and put a lot of effort into making stuff look really good and ensuring that everything is right. They sweat the details. It’s not, and I doubt that it will ever be, an operation where a digital file shows up from a studio and gets tossed into the hopper and completely automated machines grind it up and out comes the end product.

 

Dennis Burger  How long does that process take? I mean, let’s take a recent mainstream theatrical movie as an example. Let’s say, Captain Marvel, which I think it’s safe to say is being prepped for home video as we speak. How long does it take you, from the time you’re given whatever files you receive from the studio, to the point where it’s prepared and ready to be released once that digital release date hits?

 

Luke O’Brien  Well, we’re constantly doing things to try and make that process tighter and cleaner and quicker, to shorten the windows. And we have a whole toolset we’re working to go wide with this quarter, which I think will speed up this process significantly. But as it stands right now, the average title takes several business days.

 

MK  Yeah, it takes us about two business weeks to prepare a movie.

 

LO  And we’ve done it faster, in cases where we’ve needed to. And we’ve done it much slower in cases where we’ve run into problems that needed to be addressed. But if we don’t think it’s good enough, we just won’t release it. There’s a quality line we have to defend with our products. And mind you, I don’t consider anything in that state forever. There are files that we haven’t been happy where we landed with them, and I consider them to be still works in progress. And no, I’m not going to tell you what they are. But it will be a happy surprise when they show up on the service looking as great as they should when they’re on the Kaleidescape System.

 

DB  This was honestly a bit of a surprise for me, and I think it would be for many people who just assumed that in this era of 4K, Kaleidescape simply got a copy of the UHD Blu-ray disc, ripped it to your hard drives, put it on your servers, and delivered exactly the same bits that are on the disc via the internet. It’s nothing like that, though, is it?

 

LO  No. The files we get from the studios are raw files in a variety of formats, depending on the studio. Some of them are going to be ProRes files, some of them are going to be MOV files, some of them are going to be IMFs (Interoperable Mastering Format). There’s a variety of base container files they use to send those over, mostly because these files are ready wildly in advance of when disc files are ready and we’re really aggressive about making sure we’re always hitting the first possible date a digital release can be made available to our customers. So, we need to receive these files in a manner that a lot of the other places in the digital market do take them.

 

But we’re handling them differently, because obviously our delivery method isn’t to create something designed to be pumped out and compressed and uncompressed to varying degrees for streaming. We actually had to create a way to take the base files they give us and to create a Kaleidescape Container File: Something that is a beautiful package that will serve as the movie on the customer’s system, that they would then download and have locally to watch and enjoy.

 

DB  The process obviously still involves some careful compression, though. Do you also do your own HDR grading? I ask because I’ve noticed that your HDR sometimes looks more cinematic, more subtle than what I’ve seen on other home video releases.

 

LO  We don’t do our own HDR grade. We don’t do that level of file detail correction.

 

MK  We’re not looking to make any changes to the way the filmmakers intended that movie to look. We always strive to get it to be as proper a representation of that as possible.

 

DB  So, what would account for the subtle differences I saw in, say, Incredibles 2, where other HDR home video releases seemed to focus more on stark contrasts, but the Kaleidescape HDR presentation seemed to err on the side of subtlety and richness of shadow detail?

 

LO  Well, we do have a transcode process that we take the files and run them through. And that will not be identical to what will come through when any other person puts their files together. One thing I can say is that you’re talking about a studio that’s very protective of their property, and between us and the studio there’s often an elaborate process to getting our titles qualified.

 

DB  One of the things that prompted me to want to have this conversation was the Kaleidescape presentation of Blue Planet II. I thought your HDR presentation of that series was just utterly stunning. Does a series like that—a mini-series that was created for broadcast on BBC, rather than a theatrical presentation—go through a different process than your typical movie release?

 

LO  Oof. That one’s a little bit different, because there are a lot more pieces in the supply chain on that particular title, because it was created for UK television presentation. That was really the intended final target. So, we worked with BBC and BBC worked with some external processing houses to have a regraded, transformed file. But they work with them to make sure they’re happy with all the color corrections as everything goes through to get it to a file format that we can take and transcode and deliver to our customers. But on this end, it just goes through our normal process.

 

I love the way that particular title looks as well, and I want to give Kaleidescape credit for absolutely everything I can. But really, you have to give BBC credit for making such a beautiful, spectacular original source file. I don’t know what process it went through elsewhere, but I do think it looks stunning on our service.

 

DB  Would you say the process of something like that, which was intended for TV broadcast, ends up being more complicated or less so than your typical blockbuster movie?

 

LO  I think the important thing to consider here is that we have a human review process. So, it’s certainly more time-intensive. I don’t know if it’s more complicated, but that series is, like, the equivalent of eight movies. It’s 400 minutes of someone’s time and a lot of Visine. 800 minutes, actually, because every episode requires two passes—because it will get an initial pass through our tools, and anything we see that we’re not happy with triggers a second pass, so it can be finalized and we can deliver it to our customers.

Examples of video flaws that can appear during the transcoding process.

 

DB  What kinds of things might trigger a second pass?

 

LO  It’s all the stuff that you might imagine could conceivably bother you if you were watching this program on a reference-quality screen: Is there any sense that the black levels aren’t staying true? Is there any banding in the transitions of colors? Is the brightness fading properly when it should? Is there any macroblocking that shows up? And if any of that shows up, we work with proprietary tools to make sure we’re filtering out anything that’s not in the source file, that was introduced in the process of preparing it for public consumption.
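Kaleidescape’s tools are proprietary, but to make the kind of check Luke describes a little more concrete, here’s a toy example of one automated test a pipeline might run: comparing the near-black average of a source frame against its encoded counterpart to flag black-level drift. The function, thresholds, and synthetic frames are all hypothetical illustrations, not Kaleidescape’s actual process.

```python
import numpy as np

def black_level_drift(source_frame, encoded_frame, floor=16, tolerance=1.0):
    """Toy QC check: flag a frame whose near-black average has drifted.

    source_frame, encoded_frame -- 2D luma arrays of 8-bit values (0-255)
    floor     -- luma level at or below which a pixel counts as "near black"
    tolerance -- allowed difference between the two near-black averages
    """
    src_blacks = source_frame[source_frame <= floor]
    enc_blacks = encoded_frame[encoded_frame <= floor]
    if src_blacks.size == 0 or enc_blacks.size == 0:
        return False  # nothing near black in this frame to compare
    drift = abs(float(enc_blacks.mean()) - float(src_blacks.mean()))
    return drift > tolerance

# Synthetic example: an encode that lifted the shadows by three code values.
source = np.full((1080, 1920), 10, dtype=np.uint8)
encoded = source + 3
print(black_level_drift(source, encoded))  # True: flag it for a second pass
```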

 

MK  One other thing to consider, getting back to our earlier discussion about Kaleidescape versus discs: One area where we have some latitude is that the optical disc has whatever capacity it has, so when the disc is authored, they’re working with that limitation. We don’t have that limitation. We don’t have to conform our releases to something that could fit on an optical disc. We don’t have to worry about adding a second disc for bonus features. So, if a particular movie or TV series benefits from having higher-bandwidth encoding than a disc would allow, we can do that.

 

LO  Yeah, the result is that our files are big. They’re big because there’s all of that delicious, juicy information stacked up and stored in each one of those files.

 

MK  Exactly. But you know when you’re watching one of our premium movies that someone actually took the time to go over it with a fine-tooth comb and make sure that it’s right.

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

How Do I Define a Luxury TV?


I’m thinking about upgrading my living-room TV, a five-year-old UHD TV that doesn’t support HDR. The process of choosing a new TV has me thinking seriously about a question that several Cineluxe writers have already attempted to answer: How do I define the term “luxury”?

 

For me, luxury simply means going beyond what you deem necessary in a given purchase. Whether it’s cars or watches or speakers, we all have a standard in our minds of what the base model is, the thing that will get the job done in the manner we want it done. And then there’s the thing that goes beyond, the thing that delivers a higher-quality experience that may not be necessary but is oh so delightful.

The standard is different for each person, which means the luxury is different for each person. I’m generally a frugal (okay, cheap) person. When I shop, I tend to start at a base model and actually talk myself down to something less. The plus side of that approach is that the luxury bar isn’t set terribly high. Sometimes just buying a brand name feels like an indulgence.

 

But that mentality goes right out the window when we’re talking about TVs. I’ve been a video reviewer for over 10 years, so I’ve had the good fortune to spend time with the cream of the crop in the TV category. I’ve had a taste of the best, and it has definitely raised the baseline standard of what I demand from a TV.

 

I won’t buy a new TV that can’t deliver a true HDR experience—by that, I mean it must have a great black level, above-average peak brightness, and support for both HDR10 and Dolby Vision. And since manufacturer review samples tend to be 65-inchers, I’ve grown accustomed to that screen size—anything smaller just won’t cut it.

 

Those requirements already set a baseline that’s higher than what the average person deems necessary in a TV, which is causing quite the internal battle between my inner cheapskate and my inner videophile over what’s essential in this purchase.

 

The (ahem) frugal side of me is leaning toward a midrange 65-inch LED/LCD TV—something with a local-dimming full-array LED panel and a respectable amount of peak brightness. As we discussed in a recent podcast, the performance of these midrange TVs has gotten so good that the vast majority of people will be truly blown away by the picture quality. My mind knows that these are very good performers that have the features I demand. They check all the right boxes. It’s a no-brainer.

 

But my heart has something else to say on the subject. It longs for the luxury of the far pricier OLED TV. I know rationally that, from a features standpoint, an OLED TV doesn’t really bring anything more to the table than those midrange LCDs. And while its performance is certainly better, it’s not two or three times better, which is how much more you’ll pay for a similar screen size—and that’s if you go with the “budget” OLED option. The true luxury purchase would be a flagship model like LG’s Signature W8, whose picture quality is essentially identical to that of lower-priced models in LG’s line. You’re paying for the sex appeal.

 

Ultimately, luxury lives on a sliding scale that’s determined entirely by our personal experience. Once you’ve experienced the Nth degree of performance and design—be it in a TV, a speaker, a control platform, or even a lighting system—your baseline is bound to shift.  You may know you don’t really need it, but it’s hard not to want it.

Adrienne Maxwell

Adrienne Maxwell has been writing about the home theater industry for longer than she’s willing to admit. She is currently the AV editor at Wirecutter (but her opinions here do not represent those of Wirecutter or its parent company, The New York Times). Adrienne lives in Colorado, where she spends far too much time looking at the Rockies and not nearly enough time being in them.

Why HDR Matters

If you read the reviews here at Cineluxe with any frequency, you’ve probably noticed that we make frequent reference to HDR—high dynamic range—video. By now, it’s a term you’re almost certainly familiar with. But if you’re not really sure what it means, you can be forgiven, because most of the standard marketing materials are confusing and misleading.

 

Here’s a perfect example. This image is representative of the images that most TV manufacturers use to convey the advantages of HDR. Look at that dull and washed out image on the left. Marvel at how it pales in comparison to the vibrant image on the right side of the screen. See how much better HDR is?


There’s just one problem with this. This entire picture is rendered in standard dynamic range (SDR). That vibrant, lifelike image on the right? Your old, non-HDR display could almost certainly render it with no problem. The image on the left? It’s artificially toned down and muted. This analogy isn’t really helpful. And mind you, I’m not knocking the graphic artist who made this particular example. The entire electronics industry seems content to rely on some variation of this example on every piece of marketing material promoting the advantages of HDR. I’m simply saying that if this is the only sort of comparison you’ve seen, you’re right to be skeptical.

 

So, how is one to understand the actual differences between SDR and HDR video? One easy way is to visit your local tech expert, be it a custom integrator or an electronics store you trust, and ask for a demo.

 

But you can also understand it with just a little math.

 

In short, the SDR video we’ve grown accustomed to for the past few decades, through DVD, HDTV, Blu-ray, and even non-HDR 4K, uses 8 bits of data to represent each primary color: red, green, and blue. What this means is that you can have 256 different shades of each of those colors, which are then combined to create the entire visual spectrum. 256 shades of red, 256 shades of blue, and 256 shades of green combine to create nearly 17 million total shades that can be displayed on an SDR screen, or captured in a video format like Blu-ray.

 

HDR, by contrast, relies on 10-bit (or even 12-bit) color. To understand what a monumental increase that is, understand that 10-bit color allows for 1,024 different shades of red, green, and blue, which when combined result in over a billion different shades onscreen.
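The arithmetic behind those shade counts is easy to verify yourself; here’s a quick check (my own snippet, not tied to any particular video standard):

```python
# Shades per channel and total displayable colors for 8-, 10-, and 12-bit video.
for bits in (8, 10, 12):
    shades_per_channel = 2 ** bits
    total_colors = shades_per_channel ** 3  # red x green x blue combinations
    print(f"{bits}-bit: {shades_per_channel:,} shades per channel, "
          f"{total_colors:,} total colors")
```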

 

Here’s a visualization of the difference between 10-bit and 8-bit, when limited to the blue channel alone:


And grayscale, which represents every step along the way from pure black to pure white:


Again, you’re seeing these images presented in SDR, but hopefully they convey the point that 10-bit video, and hence HDR, allows for more subtle variation in color and grayscale. Which means that you see more detail in the shadows of darker images (or darker areas of a complex scene), and more variation in the highlights of brighter images (or brighter areas of a complex scene).

 

But that’s not all. HDR also allows for greater image brightness, and more control over which areas of the image are dark and bright. Your old HDTV might be capable of delivering 300 nits (a standard unit of measurement for brightness), whereas many of today’s better HDR-capable displays can easily deliver 1,000 nits or more. That doesn’t necessarily mean that the entire image is brighter, mind you, as if you just took your old HDTV and cranked the brightness control. Turn up the brightness on an old TV, and the blacks get washed out and turn gray. Turn up the contrast to compensate, and what you end up with is an image with stark blacks, bright whites, and not much in between.

 

A good HDR TV, on the other hand, can make a small area of the screen—a flashlight beam, for example—shine with all the intensity of the real thing, while keeping the shadows wonderfully and naturally dark, without robbing you of those all-important mid-tones in between.

If you’ll allow me my own dubious analogy, think of it like this: Imagine a piano that only had 22 keys. The key on the left is still low A, and the key on the right is still high C, but there are only twenty keys in between them and they can only be played with the soft pedal depressed. Compare that imaginary hobbled instrument to the rich sonic output of an 88-key Steinway Model D concert grand piano played at full volume, and you can start to really wrap your brain around the differences between SDR and HDR.

 

The bottom line is that good HDR displays do a much better job of matching our eyes’ (and our brain’s) ability to differentiate subtle differences in color and contrast, as well as the natural variations in brightness we experience out in the real world.

 

There is one other confusing aspect to all of this, though: The fact that there are competing HDR standards—which you may have seen referred to as HDR10, HDR10+, Dolby Vision, and Hybrid Log Gamma. You don’t really need to understand the differences between them to understand what HDR is and how it works, but we’ll dig into those competing standards in a future post and explain what sets them apart.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.