
Rediscovering My Joy for Home Theater


I’d already planned to write a wrap-up post on my journey to a new projector for my personal home theater, but Andrew Robinson’s recent post, “4K is for Fanboys,” makes the timing even more relevant.

 

As I mentioned in “It’s Time to Update My Theater,” technology had passed my previous Marantz projector by, and it had been quite some time since we had used it. Instead, we just watched our 65-inch TV full time. (I know, a first-world problem for sure.) Sure, it was still enjoyable, but it actually curtailed the number of movies we watched. When the projector was in action, we would generally watch two to three movies per week, making an evening of it by dropping the lights and focusing on the big screen. But with the projector out of action, we went to watching two to three movies per month.

 

After the new projector arrived, I couldn’t wait to see it in action. Properly installing the JVC would mean mounting it and retrofitting the new cabling required (sending 4K HDR signals upwards of 50 feet is beyond the limits of my old HDMI cable, so I’ve gone to an HDMI-over-fiber solution from FIBBR). Instead of waiting until I could get some help with all that, I just set the projector on its box on top of our kitchen counter, strung the FIBBR cable across the floor, did a quick-and-dirty alignment and focus, and settled in to watch a movie on the big screen.

 

And from the opening scene, I was ecstatic with my new purchase. The blacks were deep and cinematic, colors were bright and punchy, edges were sharp and defined, and, blown up to nearly 10 feet, the projector’s 4K image had incredible resolution and detail. For me, this is what true theater-at-home is all about.

 

Watching movies on a 115-inch screen is far more involving than on a 65-inch one. And with the projector, it is an active viewing experience, with the lights down and distractions minimized. In the short time I’ve had the new projector—less than two weeks—we’ve already watched seven films with it, and each time I’m giddy that this is something I’m actually able to enjoy in my own home.

 

Coupled with my 7.2.6-channel audio system, movies look and sound as good as virtually any commercial theater.

I’m not a filmmaker as Andrew is, and I’m not a student of film as site editor Mike Gaughn is. I don’t watch movies to dissect framing, composition, or lighting. And I’m sure there are many subtleties, references, and hat tips in films that I’m completely oblivious to. But the fact is, most times when I go to watch a movie, it’s to relax and enjoy myself. And I’d imagine that’s what most people are looking to do with their home entertainment systems. I’m not looking for Ready Player One to change my worldview, or for Alita: Battle Angel to offer a commentary on anything, or for John Wick to teach me any lessons (well, except maybe for the benefits of rapid mag changes).

 

I’m looking to sit back with a martini and be entertained for a couple of hours.

 

At the end of the day, unless you are a filmmaker evaluating your work, or a professional film critic getting paid to review the work of others, all of this “home theater stuff” is really just a hobby designed to be fun and enjoyable. And any technology improvement that can help people achieve a better experience—be it 4K, HDR, Dolby Atmos, 3D, or otherwise—is a win in my book.

 

To my eye, 4K HDR films look better, especially when blown up to large sizes. And, to my ear, Dolby Atmos (or DTS:X) soundtracks are more exciting and involving. And if I’m electing to spend my precious time watching something—be it Survivor on broadcast cable, Jack Ryan streaming on Amazon, the latest Star Wars, Avengers, or Pixar entry, or just some new release from the Kaleidescape Store, then I’d like to do so in the highest quality possible.

 

And if that makes me a 4K Fanboy as Andrew suggests, then sign me right up!

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, and is known for his writing for such publications as Residential Systems and Sound & Vision. Follow him on Twitter at @SciaccaTweets and at johnsciacca.com.

“Apollo 11” Goes 4K


If you’ve read my review of the original HD release of Todd Douglas Miller’s documentary film Apollo 11 from earlier this year, you may recall that it was a bit more of a rant than a proper critique. Not about the film, mind you. Apollo 11 still stands as one of the year’s best cinematic efforts, especially in the more straightforward, less editorial approach it takes in capturing this one monumental moment in history.

 

The rant was instead about the film’s home video release, which was originally HD only, with no mention of a UHD/HDR followup. As I said in that original review, this was doubly troubling because Apollo 11 is among a small handful of films released recently to actually be sourced from a 4K digital intermediate. In fact, its original film elements were scanned at resolutions between 8K and 16K. Given that most modern films, especially Hollywood tentpoles, are finished in 2K digital intermediates and upsampled to 4K for cinematic and home video release, the lack of a UHD option for Apollo 11 was as infuriating as it was puzzling.

 

Thankfully, that mistake has been rectified. Apollo 11 is now available in UHD with HDR on most major video platforms, including disc and Kaleidescape, with the latter being my viewing platform of choice. I know I mentioned purchasing the film in HD via Vudu in my original review, but that purchase doesn’t offer any sort of upgrade path for UHD, the way Kaleidescape does.

 

At any rate, I did a lot of speculation in that first review about the sort of differences I thought UHD would make for this title. And having now viewed it, most of those predictions turned out to be true. UHD does, indeed, reveal a lot of detail that was obscured in the HD release. That makes sense given that the source of so much of this film’s visuals existed in the form of 65mm/70mm archival footage.

 

One of the biggest differences you see when comparing the HD and UHD releases is in the textures of the Saturn V rocket. Ribbing on the first three stages of the rocket that dwindles to nothing in HD is clear and distinct in UHD. The little flag on the side of the rocket is also noticeably crisper, and the stars in its blue field stand out more as individual points of whiteness, rather than fuzzy variations in the value scale.

 

As predicted, the launch of Apollo 11 also massively benefits from HDR grading. The plume of exhaust that billows forth from the rocket shines with such stunning brightness that you almost—almost—want to squint.

 

One thing I didn’t predict, though—which ends up being my favorite aspect of this new HDR grade—is how much warmer and more lifelike the imagery is. In the standard dynamic range color grade of the HD version of the film, there’s an undeniable cooler, bluer cast to the colors that never really bothered me until I saw the warmer HDR version. Indeed, the HDR grade evokes the comforting warmth of the old Kodak stock on which the film was captured in a way the SDR grade simply doesn’t.

 

It’s true that the new UHD presentation does make the grain more pronounced in the middle passage of the film—where 65mm film stock gives way to 35mm and even 16mm footage. That honestly has more to do with the enhanced contrast of this presentation than with the extra resolution. HD is quite sufficient to capture all the nuances and detail of this lower-quality film. But the boost in contrast does mean that grain pops a little more starkly.

 

This does nothing to detract from the quality of the presentation, though, at least not for me. And even if you do find this lush and organic grain somewhat distracting, I think you’ll agree it’s a small price to pay for the significantly crisper, more detailed, more faithful presentation of the first and third acts.

 

If you haven’t picked up Apollo 11 yet, congratulations—you get to enjoy your first viewing as it should have been presented to begin with. If you already bought the film in HD, I can’t recommend the upgrade to UHD highly enough. Thankfully, for Kaleidescape owners, that upgrade doesn’t mean purchasing the film all over again.

 

It is a shame Universal, the film’s home video distributor, has for whatever reason decided to hold back bonus features. The featurette included with the UHD Blu-ray release, which covers the discovery of the 65mm archival footage, is missing here—although it’s widely available on YouTube at this point (and is embedded above). And only Apple TV owners get access to an exclusive audio commentary. Then again, given how badly the studio fumbled the original home video release, it’s no real surprise that they’ve dropped the ball on making the bonus features widely available.

 

Don’t let that turn you off the film, though. This is one that belongs in every movie collection, especially now that it’s available in UHD.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

4K is for Fanboys


I feel as if I might have a reputation around these parts: a heel of sorts. Why a heel and not a hero? Because I find that my opinions are often in opposition to those of my contemporaries. Not because they are wrong, but because I think their focus is continually on things, topics, and ideas that play to a base that, well, is dying.

 

Dennis Burger wrote a terrific piece on why 4K isn’t always 4K. It is a truly good piece of writing, and one that gets 99 percent of the argument absolutely correct. As someone who has literally filmed a feature-length movie for a motion-picture studio in true 4K only to have it shown in theaters in 2K, I can attest to the article’s validity. But Dennis, like me some years ago, missed the boat by even framing the argument around resolution at all.

 

You see, I thought viewers cared about things like resolution. Back in 2008, when I filmed my movie, the original RED ONE cinema camera had just come out, and as a result the “whole world” was clamoring for 4K—or so it seemed. I had the choice of whether to film in 4K via the RED ONE or go with a more known entity by filming in 2K via cameras from Sony’s CineAlta line. Ultimately I chose option C and went with a true dark-horse contender in Dalsa, which, up to that point, had no cinema pedigree—unless you count designing the sensor tech for the Mars Rover as a cinematic endeavor. But I digress.

 

We didn’t use the RED ONE because it was buggier than a roadside motel mattress, and I didn’t side with Sony because they were HD, and HD was yesterday’s news. Filming in 4K via the Dalsa back in 2008 was an absolute pain in the ass. (That’s me with the Dalsa Origin II in the photo at the right.) Spoiler alert: Not much has changed in 2019, as 4K continues to be a bit of a pain; it’s just more accessible, which makes everyone think they need it—more on that in a moment.

 

What is upsetting is that I do know the monetary difference my need to satiate the consumer-electronics fanboys cost me and my film—a quarter of a million dollars. While $250,000 isn’t much in Hollywood terms, it represented over a quarter of my film’s total budget. The cost of filming in HD, you ask? Less than $30,000. Oh, and post-production would’ve taken half the time—thus lowering costs further. All of that headache, backache, and money, only to have the film bow in 2K via 4K digital projectors from—wait for it—Sony!

 

Now, I will sort of agree with the assertion that capturing visuals at a higher resolution or quality and downscaling to a lesser format—say, HD—will result in a clearer or better picture—but honestly, only if you’re told ahead of time what you’re watching. Which brings me to my point: All of this HD vs. 4K talk is for fanboys who insist on watching pixels and specs rather than watching the damn movie. Not one person or journalist (apart from me) wrote about my film in the context of its being the first feature-length film ever filmed entirely in 4K. They didn’t ask about it, nor care, because it doesn’t matter.

 

It never mattered.

 

What digital has done is remove the magic from cinema and replace it with a bunch of numbers that bored middle-aged dudes (yes, dudes) can masturbate over in an attempt to differentiate their lot from the rest. None of it has any bearing on the story, enjoyment, or skill. It’s an arms race, one we all fall prey to, and one we continually perpetuate, because, well, it sells. We’ve gotten away from cinema theory, history, and storytelling in recent years and instead become infatuated with bitrates, color spaces, and codecs. And yet, in the same breath, so many of us bitch about why there are no good films being made anymore. It’s because the only thing audiences will pay for is what they think is going to look great on their brand-new Ultra HD TV.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial and branding projects all over the US. He has served as a managing editor and freelance journalist in the AV space for nearly 20 years, writing technical articles and product reviews, and speaking on behalf of several notable brands at functions around the world.

Choosing My New Projector


Following up on my last post, “It’s Time to Update My Theater,” I’m going to delve into the thought process that caused me to splurge and finally upgrade my projector.

 

As I mentioned, my existing projector was about 11 years old, and, while it still produced watchable pictures from Blu-ray and DVD discs, it wasn’t compatible with many of the new 4K HDR sources in my system, so we had just stopped using it. I was toying around with ditching both the projector and my current 65-inch Sony flat panel and upgrading to a new 85-inch flat panel.

 

Why 85 inches? Well, that is about the current size limit before you get into ridiculously expensive pricing. For under $4,500, you can get a Sony XBR-85X950G flat panel that has been universally reviewed as a fantastic display. This would provide a large-screen image for viewing all the time, not just at night with the lights down. It would also handle HDR signals (and Dolby Vision) far better than a projector at any price could.

 

As this was a significantly cheaper upgrade option, I really considered it, but ultimately decided I would miss the truly large-screen experience of my 115-inch, 2.35:1 aspect screen.

 

We use the projector almost exclusively for movie watching, and having nearly double the screen real estate makes a massive difference, and is far more engaging than a direct-view set, even one at 85 inches. (Now, had the 98-inch Sony Z-series TV been a tenth of its price—selling for $7,000 instead of $70,000—that probably would have been my pick.)

 

So, having made the decision to stick with front projection, I had to settle on a model. I had a few criteria going in that helped narrow the search.

 

First, I wanted true, native 4K resolution on the imager, without any pixel shifting or “wobulation” to “achieve 4K resolution on screen.” This ruled out many of the pixel-shifting models from companies like Epson and Optoma. Nothing against them; I just wanted native 4K.

 

Second, it had to have a throw distance that worked with my current mounting location. Actually, this isn’t much of a concern anymore, as most modern projectors have an incredibly generous adjustment range on their lenses.

 

Third, I needed a model that offered lens memory so it would work with my multi-aspect screen (92 inches when masked down to 16:9, and 115 inches when opened to the full 2.35:1). This allows the projector to zoom, shift, and focus for a variety of screen sizes at the push of a single button, and is crucial for multi-aspect viewing.

 

Fourth, it needed to integrate with my Control4 automation system. Sure, I could cobble together a driver, but it would never offer integration as tight as one that was meant to work with that particular model.

 

Finally, it had to fit my $10,000 budget. Unfortunately, this ruled out brands like Barco and DPI. I was super impressed with Barco’s Bragi projector, but, alas, it doesn’t fit in my tax bracket.

 

Basically, with these criteria, my search was narrowed to two companies: JVC and Sony. And primarily to two projectors: the JVC DLA-NX7 (shown at the top of the page) and the Sony VPL-VW695ES. (Were my budget higher, I would have added the JVC DLA-NX9 to that list, which has the primary advantage of a much higher-quality, all-glass lens, but it was more than double the price. And while the less expensive JVC DLA-NX5 also met all my criteria, the step-up NX7 offers more bang for just a little more buck.)

 

So, I did what a lot of people do prior to making a big technology purchase: Research. I read a ton of forum posts, read all of the reviews on both models, and watched video comparisons. I also reached out to a couple of professional reviewers and calibrators who had actually had hands-on time with both models.

 

The CEDIA Expo is a place where manufacturers often launch new projectors, so this past month’s show coincided perfectly with my hunt. Since both companies had models that had been launched at CEDIA 2018, I was eager to see what announcements they might have regarding replacements or upgrades. Alas, there were no model changes, which, in a way, can be a good thing, since it means both models are now proven, have had any early bugs worked out with firmware updates, and are readily available and shipping.

 

I really hoped to check out both projectors at the show, but, unfortunately, no one was exhibiting either. (Apparently, CEDIA is not the place to show your sub-$10,000 models.)

 

Ultimately, two announcements at the show swayed me to pull the trigger on the JVC. First, the product manager I spoke with said the price was going up by $1,000 on October 1, so buying sooner rather than later would actually save me money. But more importantly, JVC introduced new firmware at CEDIA that adds a Frame Adapt HDR function, which dynamically analyzes HDR10 picture levels frame by frame, automatically adjusting brightness and color to optimize HDR performance.

 

Projectors historically have a difficult time handling HDR signals, and this firmware is designed to produce the best HDR images from every frame. This used to be achieved by using a high-end outboard video processor such as a Lumagen Radiance Pro, but that would add thousands of dollars to the system. When I saw this new technology demonstrated in JVC’s booth, I was all in.
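The core idea behind this kind of dynamic tone mapping can be boiled down to a few lines. The sketch below is purely my own illustration, not JVC’s actual Frame Adapt HDR algorithm (which is far more sophisticated, adjusting color as well as brightness): measure how bright each frame actually gets, then compress only the frames that exceed what the display can reproduce.

```python
# Hypothetical sketch of frame-by-frame dynamic tone mapping.
# "frame" is a list of rows of pixel luminance values, in nits.
# display_peak_nits is an assumed figure for a typical projector setup.

def tone_map_frame(frame, display_peak_nits=150.0):
    """Scale a frame so its brightest pixel fits the display's range."""
    frame_peak = max(max(row) for row in frame)  # brightest pixel in frame
    if frame_peak <= display_peak_nits:
        return frame  # frame already fits; pass it through untouched
    scale = display_peak_nits / frame_peak
    return [[px * scale for px in row] for row in frame]

# A dim frame passes through unchanged, while a frame mastered with a
# 1,000-nit highlight is compressed down to the display's peak.
dim = [[10.0, 20.0], [5.0, 15.0]]
bright = [[1000.0, 500.0], [250.0, 100.0]]
assert tone_map_frame(dim) == dim
assert tone_map_frame(bright)[0][0] == 150.0
```

The advantage over a single static curve is visible in the two test frames: the dim frame keeps its full contrast instead of being needlessly darkened by a compromise curve chosen for the whole film.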

 

In my next post, I’ll let you know if the purchase was worth it. (Spoiler: It totally was!)

John Sciacca


It’s Time to Update My Theater

Some views of my home theater space, pre-upgrade

photos by Jim Raycroft

The first home theater component I ever purchased was a subwoofer, back in 1995. It was a big 15-inch black cube Definitive Technology model that I drove into San Francisco to buy after weeks of researching everything I could find in all the enthusiast magazines of the time. From there, I bought a Yamaha digital surround decoder and Dolby Digital RF demodulator for a LaserDisc player, connected it all to some speakers and a 25-inch Proton tube TV, and voilà! I had my first home theater system.

 

It didn’t have a lot of style or elegance, and it certainly wasn’t luxury, but I was on the cutting edge of 5.1-channel technology, and it sounded better than anything my friends had.

 

And I was hooked.

 

Over the years, my system has seen a lot of upgrades, most frequently in the preamp/processor section, as I chase the technology dragon of trying to stay current with surround formats, channel counts, and HDMI processing. (For the record, the 13.1-channel Marantz AV8805 is currently serving processing duties in my rack, and doing a very fine job of it, thank you.)

 

Speakers get upgraded the least often, as a good speaker rarely stops sounding good and, if cared for, rarely breaks. Sources come and go as technology improves. Gone are the VCR and the LaserDisc and DVD players. Currently in use are a Kaleidescape Strato and M500 player, a Samsung UHD Blu-ray player, an Apple TV 4K, a Dish Hopper 3, and a Microsoft Xbox One.

 

Lying in the upgrade middle ground is my system display. Long gone is the 25-inch Proton, having been replaced by a 35-inch Mitsubishi, then a 61-inch Samsung DLP, then a 60-inch Pioneer Elite plasma. Currently, my primary display is a Sony XBR-65X930D, a 65-inch 4K LED. However, it’s a D-generation model, and Sony is now on G models, so it might be due for replacement next year.

 

One device in my system that has never been upgraded is my video projector.

 

I always wanted a truly big-screen, cinematic experience, and this meant a projector and screen. So back in 2008 I purchased the best projector Marantz made (the VP-11S2, shown below), along with a Panamorph anamorphic lens and motorized sled system. This setup fires onto a Draper MultiView screen that has masking to show either a 92-inch 16:9 image or a 115-inch 2.35:1 CinemaScope image.

 

The first time we dropped the lights, powered on the projector, and lowered the screen, I was ecstatic. I couldn’t believe how lucky I was to have this amazing system in my own home, and we essentially stopped going out to the movies.

 

I continued to feel that way about my projection system for years. It provided an amazing, truly cinematic experience that made me happy literally every time we used it. And use it we did, generally watching two to three movies per week on the big screen.

 

But then, technology moved on.

 

Principally, HDMI went from 1.4 to 2.0, resolution went from 1080p to 4K, and video went from SDR to HDR.

 

While the Marantz still worked, it was now by far the weakest link in my theater chain, and it no longer supported any of the sources we wanted to watch. In fact, just watching a Blu-ray on the system via our Kaleidescape meant going into the Kaleidescape’s Web setup utility and telling the system to “dumb itself down” to output HDMI 1.4 signals. A huge hassle.

 

So, a couple of years ago, we basically stopped using the projector at all.

 

But, some things changed in the projector world at the recent CEDIA Expo in Denver that inspired me to finally make the upgrade plunge, and that’s what I’ll dive into in my next post!

John Sciacca


How to Become an Expert Listener


Recently, I helped my friend Ed set up two audio systems. During the process of dialing them in, I had to walk him through what to listen for in order to hear the improvements because he didn’t know what to focus on in evaluating the sound. It occurred to me that most people don’t.

 

A luxury stereo system or home theater should deliver exceptional sound, of course. But what exactly should you listen for in evaluating, choosing, setting up, and enjoying a high-performance system?

 

(Note: I’m not going to dig deeply here into how to set up various aspects of a system to achieve peak performance, but rather what to listen for.)

 

First of all: A system will only sound as good as its source material. It’s essential to use good demo tracks. Don’t go with a low-bit-rate MP3 file for music listening, for example. Use an audiophile CD or LP, or a high-res download or streaming service.

 

For stereo music evaluation, you can’t go wrong with that stone classic, Pink Floyd’s The Dark Side of the Moon. It’s one of the best recordings ever made, thanks to the brilliant talent of Grammy-winning engineer Alan Parsons. Listing the strengths of this album is like outlining a mini-course in what to listen for:

 

—Deep, articulate bass, a rich midrange, and extended highs

—Accurate timbre of vocals and instruments (except when deliberately processed)

—An expansive sound field

—Wide dynamics, from almost subliminally soft to powerfully loud

—A remarkably clean sonic character.

 

(I’ll expand on each of these various areas below.)

 

A system should have a coherent tonal balance from top to bottom, without any particular frequency range sticking out. You don’t want it to sound too bright in the midrange (roughly the area between 200Hz and 5kHz, where most of the frequencies of the human voice reside) or have weak, recessed bass. With a solo piano recording like Robert Silverman’s superb Chopin’s Last Waltz, listen for the transitions between the low, middle, and high notes, which should be smooth and seamless.

 

Listen for a clear, “transparent” sound with a lot of fine musical detail. The sound should be pure, without any “grain,” hardness, or roughness in texture. (For example, a flute should sound clean and natural, not buzzy or strident or distorted.) Bass should be articulate, not indistinct. The midrange should have plenty of presence, since that’s where most of the music “lives.” Highs should be airy and extended.

 

Subtleties like the “ting” of the triangle in the Fritz Reiner/Chicago Symphony recording of Scheherazade (an example of the upper range) or the reverb on Shelby Lynne’s voice on Just A Little Lovin’ (an example of the midrange) should be clearly audible. Although it’s not all that realistic in terms of spatial positioning of the instruments, Miles Davis’ jazz classic Kind of Blue is excellent for evaluating timbre, resolution, and overall naturalness of sound.

For stereo setups, listen for a coherent sound field without a “hole in the middle” (from your speakers being too far apart or not angled in properly) or a lack of imaging and spaciousness (speakers too close together). Depending on the recording, vocals and instruments can be precisely defined in space, left to right and front to back, and the sound field can seem to extend beyond the speakers and maybe even the room. (For some tips on speaker placement, check out these articles from Lifewire and Dynaudio.)

 

However, be aware that on some recordings, especially those from the late 1950s through early 1970s, vocals and instruments can be placed too far off to the left or right. Also, you won’t hear laser-focused pinpoint imaging on a properly miked orchestral recording—because that’s not what things sound like in real life. And keep in mind that changing your listening position will have a significant impact on the sound.

 

I once visited the Harman listening lab in Northridge, California, where they used Tracy Chapman’s “Fast Car” to help determine the differences between speakers. That’s because it’s one of the easiest cuts for people to use in picking out sonic differences.

 

When listening to multichannel movies or music, the sound literally expands, thanks to the addition of center and surround speakers, one or more subwoofers, and, in some installations, height speakers (for example, in a Dolby Atmos system). In fact, Cineluxe has some excellent recommendations for home theater demo material.

 

Listen for a good balance between all the speakers. The surround speakers and subwoofers shouldn’t overly call attention to themselves except when the audio mix warrants it. You should hear a seamless, immersive 360-degree bubble of sound.

 

Dialogue clarity is critical for movies and TV! As such, the performance of the center-channel speaker in a multichannel setup is crucial. (Center-channel volume can be set independently—a very important aspect of home theater system tuning.)

How to Listen—The App

 

I have a confession to make.

 

Instead of writing this post, I could have been lazy and just told you to check out the Harman: How to Listen app. It’s a training course that teaches you how to become a better listener by pointing out various sonic aspects to focus on, such as specific frequency ranges, spatial balances, and other attributes. Check out this post by Harman’s Dr. Sean Olive for more details.

–F.D.

On another note, it’s a good idea to use material you’re familiar with when evaluating a system, even if it’s not “demo quality,” so you can instantly hear the improvements a luxury system can make. I can’t tell you how many times I’ve sat someone in front of my high-end setup, asked them to pick a favorite piece of music, and then heard them say things like, “I can’t believe the difference! I never knew it could sound like that! It sounds like a different recording!”

 

The best advice I can give is to constantly school yourself to become a better listener.

 

Go out and listen to live unamplified music, whether at Carnegie Hall or a friend strumming an acoustic guitar. Get familiar with the sonic nuances of various instruments. Listen to as many audio and home theater systems as possible, at stores, friends’ houses, and audio shows. Listen to the sounds around you—birds, wind, city streets.

 

Good listeners are made, not born.

Frank Doris

Frank Doris is the chief cook & bottle washer for Frank Doris/Public Relations and works with a number of audio & music industry clients. He’s a professional guitarist and a vinyl enthusiast with multiple turntables and thousands of records.

4K is Sometimes Actually 2K–But That’s OK


From time to time in our reviews of 4K/HDR home video releases, you may have stumbled across a phrase that seems downright perplexing: “Taken from a 2K digital intermediate.” It stands to reason, after all, that a video file that has spent some portion of its life at 2K resolution can’t really be considered 4K. Or can it?

 

This can be doubly confusing when the sentence before or after makes note of the film being shot “on ARRIRAW at 6.5K resolution” or something to that effect. That’s a whole lot of different Ks for a film that’s ostensibly being released in 4K (or, more accurately, “Ultra HD”) for home video. So, what exactly does all of this mean? And should you really care?

 

To get to the bottom of these questions, we need to back up and discuss how movies are shot, produced, and distributed. To keep the discussion as simple as possible, we’ll ignore films that are still captured on actual film stock and just focus on digital cinema, since that’s the way most movies (and TV shows) are shot.

 

Depending on the model of camera used, as well as other technical considerations, the resolution captured by these cameras generally ranges between 2K (2,048 x 858 or 2,048 x 1,152) and 6.5K (6,560 x 3,102), with a few other resolutions in between—like 2.8K (2,880 x 1,620) and 3.4K (3,424 x 2,202)—also commonly used. The “K” is short for “thousand,” and the resulting abbreviation is simply a rough approximation of the horizontal resolution of the resulting file.
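In practice, the label works out to the horizontal pixel count divided by roughly a thousand and truncated to one decimal place. This little helper is my own illustrative sketch (the function name is invented, not any industry tool), just to show how the resolutions above map to their common labels:

```python
def k_label(width_px: int) -> str:
    """Approximate the marketing 'K' label for a horizontal resolution,
    truncating to one decimal place (e.g., 2,048 -> '2K', 2,880 -> '2.8K')."""
    tenths = (width_px * 10) // 1000   # e.g., 6,560 px -> 65 tenths of a K
    whole, frac = divmod(tenths, 10)
    return f"{whole}K" if frac == 0 else f"{whole}.{frac}K"

# The capture and projector resolutions discussed in this article:
assert k_label(2048) == "2K"
assert k_label(2880) == "2.8K"
assert k_label(3424) == "3.4K"
assert k_label(4096) == "4K"
assert k_label(6560) == "6.5K"
```

The point of the sketch is simply that the “K” shorthand is a loose, truncated approximation, which is why two formats with different vertical resolutions (2,048 x 858 and 2,048 x 1,152) both get called “2K.”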

 

At any rate, no matter what resolution a film is shot in, the footage has to be reformatted to standard digital cinema projector resolutions, either 2K (2,048 × 1,080) or 4K (4,096 × 2,160), before being distributed to commercial movie theaters. But a lot more than that happens to most films before they’re released. They have to be edited and color timed, and with most blockbusters, special effects have to be rendered and composited into the footage that was shot on-set.

 

This work is time-consuming and expensive, and the higher the resolution at which the work is done, the costlier and more time-consuming it is. As such, due to budget constraints, release schedules, or in some cases simply preference, this work is usually done at 2K (2,048 × 1,080) resolution, the result of which is what we refer to as a 2K digital intermediate. This is the last step in the post-production process for most films, before their conversion to Digital Cinema Distribution Master (DCDM) and Digital Cinema Package (DCP), the latter being the compressed version of the final film sent to movie theaters for public consumption.

 

Sometimes, budget and time allowing, films are finished in a 4K digital intermediate—Black Panther, for example, just to name one recent Hollywood blockbuster. But the vast majority of effects-driven tentpole films go through the 2K bottleneck during post-production.

 

Which may lead you to ask why they don’t just shoot movies in 2K to begin with, if the footage is going to be downsampled to 2K anyway. It’s a good question. And the answer isn’t a simple one.

 

But, to simplify it as much as possible, shooting in 6.5K or 3.4K or even 2.8K, then downsampling to 2K, will often result in an image that’s crisper, clearer, and more detailed than an image shot natively in 2K resolution. Ironically, you’ll also find some filmmakers who admit to shooting closeups of actors through filters of one form or another because the enhanced clarity of shooting in 6.5K or 3.4K or whatever can be somewhat less than flattering, even once the footage is downsampled to 2K. Nevertheless, there are technical advantages to shooting at such high resolutions, even if you and I will never see the original full-resolution footage.

 

Of course, there’s one other obvious question you may be asking: If all of this imagery has been shrunk down to 2K resolution, and all of the special effects have been rendered in 2K, why not just be honest about it and release the film in 2K? Why make the bogus claim that these home video releases are in 4K?

 

The cheeky answer is that we don’t have a 2K home video format. Digital cinema resolutions and home video resolutions simply don’t match up, for historical reasons that I won’t delve into here. The older high-definition home video format, with its 1,920 x 1,080 pixels, is pretty close to 2K, but it still has about six percent fewer pixels.
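That six-percent figure is easy to verify with a little pixel arithmetic (a quick sketch of my own, in Python):

```python
hd = 1920 * 1080      # Blu-ray / broadcast high definition
dci_2k = 2048 * 1080  # digital cinema 2K container

# Fraction of pixels HD gives up relative to DCI 2K
shortfall = 1 - hd / dci_2k
print(f"HD has {shortfall:.2%} fewer pixels than DCI 2K")  # prints 6.25%
```

The two formats share the same vertical resolution; the entire difference comes from those 128 missing horizontal pixels.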


The Oscar-winning Spider-Man: Into the Spider-Verse, which many feel is one of the most visually stunning recent films and a reference-quality 4K HDR release, was created solely in the 2K domain and then upsampled to 4K for distribution.

When you get right down to it, though, pixel count is actually one of the least important contributors to perceived image quality, once you get above a certain resolution. High dynamic range (HDR) video and wide color gamut actually play a much greater role in our perception of the quality of the picture. And HD video formats, such as Blu-ray or 1080p downloads and streams, simply don’t support the larger color gamut and higher dynamic range that modern video displays support.

 

For that, we have to step up to Ultra HD, which is colloquially called “4K” by many in our industry, if only because “Ultra HD” is a mouthful. The thing is, most UHD home video displays have a resolution of 3,840 x 2,160—a little less than the digital cinema standard 4K resolution of 4,096 × 2,160. But still, close enough.

 

And here’s the important thing to consider, if you take nothing else away from this long and rambling screed: If you want to enjoy the best that home video has to offer these days, you’re going to be watching your movies (and TV shows) in Ultra HD on an Ultra HD display. Would it be technically possible for Hollywood to release those movies and shows in something closer to 2K resolution, while also delivering HDR and wide color gamut? Sure. It may be contrary to home video format standards, but nothing about that would violate the laws of physics.

 

But why would they? Your display (or your player, or maybe even your AV receiver or preamp) is going to upsample any incoming video to match the resolution of your screen anyway. One way or another, you’re going to be viewing 3,840 x 2,160 pixels. As such, why wouldn’t you want the studios to use their vastly more sophisticated professional video scalers to upsample the resolution before it’s delivered to you via disc, download, or streaming? Those video processors don’t work in real-time, the way the processors built into your player, receiver, or display do. They’re slow, methodical, and do a much better job.

 

So even if the movie you’re enjoying this evening technically passed through a 2K-resolution digital intermediate at some point, that doesn’t mean you’re being duped when you’re sold a “4K/UHD” home video release. You’re still enjoying the most important technical advantages of the Ultra HD format—namely the increased dynamic range and color gamut.

 

Mind you, for David Attenborough nature documentaries and other footage that doesn’t require the addition of special effects, I want a genuine Ultra HD video master, with every possible pixel kept intact. But for big Hollywood blockbusters? I honestly think this whole “Fake 4K” discussion has gotten way out of hand.

 

I’ll leave you with one last thought to consider. This summer’s biggest film, Avengers: Endgame, reportedly had a budget of more than $350 million before marketing costs were factored in. Of that $350-ish million, roughly $100 million went to the visuals, including special effects. Had the film been finished in a 4K digital intermediate instead of a 2K one, you can bet that budget would have been significantly higher. (Remember, the jump from 2K to 4K isn’t a doubling but a quadrupling of pixels, since both the horizontal and vertical resolutions are doubled, and rendering four times as many pixels simply costs a heck of a lot more money and time.)
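The pixel math behind that quadrupling is simple enough to check yourself (a quick illustration of my own):

```python
pixels_2k = 2048 * 1080  # digital cinema 2K
pixels_4k = 4096 * 2160  # digital cinema 4K: both dimensions doubled

# Rendering cost scales with the number of pixels to be computed,
# so doubling each dimension quadruples the work.
print(pixels_4k / pixels_2k)  # prints 4.0
```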

 

Would it have been worth it? Well, consider this: The original John Wick film was shot in 2.8K and finished in a 4K digital intermediate, whereas the latest release in the franchise, John Wick 3, was shot in 3.2K and finished in a 2K digital intermediate. I haven’t seen any of these films, but every review I’ve read seems to indicate that the UHD home video release of the third looks noticeably better than that of the first.

 

If 2K digital intermediates were truly the bane of the home cinephile’s existence, this simply wouldn’t be the case. So, when we mention in reviews that an Ultra HD release came from a 2K digital intermediate, we’re not implying that you’re somehow being cheated out of pixels you thought you were paying for when you bought that big new “4K” display. We’re just video geeks being video geeks and pointing out the most pedantic of details. In the few rare cases where it makes a legitimate difference, we’ll point that out explicitly.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

The Current State of the Luxury Audio Art


Steinway Lyngdorf’s P200 surround processor

In my previous post, I talked about the intriguing video trends I came across at the recent custom integrators CEDIA Expo in Denver. While there weren’t as many new developments on the audio side, I did notice a few continuing and developing trends throughout the show that will have an impact on the luxury home cinema market. And, unlike some of the premium video solutions on the horizon, these are all things that can be implemented in a home theater immediately!

HIGHER CHANNEL COUNT

While immersive surround systems such as Dolby Atmos, DTS:X, and Auro-3D are pretty much the de facto standard in newly installed luxury home cinemas, we need to remember that these formats have been available in the home market for only about five years, and until fairly recently the channel count for most of these systems maxed out at 12 in a 7.1.4 configuration (seven ear-level speakers, a subwoofer, and four overhead speakers).

 

But there has been an explosion of systems that support up to 16 channels in a 9.1.6 array, which adds front width speakers at ear level and an additional pair of overhead speakers. While having 15 (or more) speakers in a room might seem excessive, creating a seamless and truly immersive experience in large rooms that have multiple rows of seating requires additional channels to create cohesion between speakers as objects travel around the surround mix.


Companies offering new 16-channel AV receivers and preamp/processors include JBL Synthesis, Arcam, Acurus, Bryston, Emotiva, and Monoprice. Some companies are even pushing the boundaries beyond 16, including StormAudio, Steinway Lyngdorf, Trinnov, JBL Synthesis, and Datasat.

 

 

BETTER BASS IN EVERY SEAT

Three home theater masters—Theo Kalomirakis, Joel Silver, and Anthony Grimani—presented a full-day training course titled “Home Cinema Design Masterclass,” where they discussed best practices in home theater design. Grimani, president of Grimani Systems and someone who has worked on more than 1,000 rooms over his 34-year career, stated that 30% of what people like about an audio system happens between 20 and 100Hz—the bass region. In short, if a system’s bass response and performance aren’t good, the whole system suffers.

 

But low frequencies are difficult to pull off correctly, especially across multiple seating positions, which is the ultimate goal in a luxury cinema. Good bass is possible for multiple listeners, but multiple subwoofers are always needed. Two subs are better than one, three subs are better than two, and four subs are better than three. (But Grimani stated that adding more than four subs yields diminishing returns.)

 

All the best home cinemas feature multiple subwoofers, not for louder bass, as one might think, but for more even bass at every seat. The best theaters deliver slam and impact at the low-end, but are also quick and free of bloat, which is what multiple good subs can deliver.

 

 

ROOM CALIBRATION

In that same master class, Tony Grimani also claimed that achieving good bass performance almost always requires the correct use of equalization. Virtually every home theater receiver or processor sold today incorporates some form of room-correction software, either proprietary, like Yamaha’s YPAO or Anthem’s ARC, or a third-party solution like Audyssey. At its simplest, this software employs a microphone to measure tones emitted by the speakers, which are used to calculate the distance from each speaker to the listener as well as to set channel levels. The more advanced systems employ equalization and other types of filters in an attempt to optimize how the room interacts with the signal.

 

Three of the most revered and powerful room-correction systems all hail from Europe: Trinnov Audio (France), Dirac (Sweden), and Steinway Lyngdorf’s RoomPerfect (Denmark). These systems offer more adjustments, filters, and flexibility than less expensive, more mass-market offerings in order to make any room sound its absolute best. (For more on the importance of room correction, read this post by Dennis Burger.)

 

One of the big developments in room correction featured at the CEDIA Expo was Dirac’s new Live Bass Management module. An add-on to the existing Dirac Live correction system, it will aggregate measurement and location data from multiple subwoofers in a system to determine how best to distribute bass evenly across a room. It will also correct low-frequency sound waves produced by the main speaker pair so they’re in sync with the rest of the system.

 

But just having access to the best room-correction devices isn’t enough, as the best luxury rooms are calibrated by professionals who have been trained in acoustics to the Nth degree. This small group of top-tier calibrators travels the world with kits costing tens of thousands of dollars in order to measure, sample, adjust, and tweak the parameters of every speaker and subwoofer in your theater to wring out the very last drop of performance.

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing for such publications as Residential Systems and Sound & Vision. Follow him on Twitter at @SciaccaTweets and at johnsciacca.com.

A Guide to Luxury Amps & Preamps


As promised in our last Cineluxe Basics post, which covered the things you should consider when picking source components for your luxury home-entertainment system, this time we’ll be turning our attention to one of the most important—but also one of the most overlooked—components required to make such systems work. It’s such an esoteric piece of gear that you may not fully understand what it does.

 

But hopefully by the end of this discussion you’ll not only have a lot more respect for the lowly preamplifier; you’ll also be better able to make a more informed decision about which one is right for your system.

 

Everyone understands that source components like disc players, satellite boxes, movie servers, and video streamers deliver the movies and TV shows you watch on a regular basis, either from a silver platter, the airwaves, or a hard drive somewhere.

It’s positively axiomatic that your TV or projector is responsible for delivering those images to your eyes, and your speakers transmit sound through the air to your ears.

 

The preamp, though? It’s the box that sits in the middle, functioning as a sort of air-traffic control for your entertainment system. It sends the video from your sources to your display. It decodes the digital audio stream from your source components and sends it to your amps and speakers in analog form.

 

And you may be thinking to yourself, “That sounds an awful lot like an AV receiver!” It’s true: a preamp/amplifier combination serves the same function in a luxury home-entertainment system as an AV receiver does. It’s simply that a receiver combines all of the preamplification and amplification in one box, whereas going the preamp/amplifier route gives you a lot more flexibility in terms of perfectly matching your amplification needs to your speakers and your room.

 

As a result, it’s not inaccurate to say that a preamp/amp combo will generally give you better performance than a receiver, especially in a larger room. A more accurate explanation would also be a much more complicated one, but if you’re itching for a geeky discussion about the topic, I wrote one a few years back for Home Theater Review.

 

At any rate, these days all of the above is only part of the equation when it comes to selecting the right preamp. Another important function that has arisen in the past few years is digital room correction. Broadly speaking, “digital room correction” is a catch-all term that covers a number of different technologies, but all of them ostensibly serve the same purpose: To use a combination of equalization and other filtering to reverse the deleterious acoustic effects your room itself has on the sound leaving your speakers.

 

These effects come in two forms: Those caused by the shape of your room and those caused by the surfaces in your room. The former affects the clarity and evenness of bass in the room, as the low-frequency sounds coming from your subwoofers and other speakers bounce off the walls and ceilings and either cancel each other out or reinforce one another.

 

Bass frequencies below 250 Hz or so (the highest note you can play on a double bass) have a really long wavelength, between five and 60 feet, so it takes a really big, flat surface to reflect them. So, it doesn’t really matter if your room is decorated with wood paneling or acoustic fabric; your subwoofer is going to sound overwhelming in one part of the room and wimpy in another. All good room-correction systems will listen to a microphone placed in and around the seats in your entertainment space and tweak the sounds coming from your subs and speakers so the bass has impact and authority without sounding boomy or sloppy.
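If you’re curious where those wavelength figures come from, the arithmetic is simply the speed of sound divided by the frequency. A quick sketch of my own, in Python:

```python
SPEED_OF_SOUND_FT_S = 1125  # approximate speed of sound in air at room temperature

# Wavelength = speed of sound / frequency
for freq_hz in (20, 40, 100, 250):
    wavelength_ft = SPEED_OF_SOUND_FT_S / freq_hz
    print(f"{freq_hz} Hz -> wavelength of roughly {wavelength_ft:.1f} feet")
```

A 20 Hz tone stretches more than 56 feet, while 250 Hz comes in around four and a half, which is why only the biggest surfaces in a room, the walls, floor, and ceiling, meaningfully reflect deep bass.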

 

A great example of a room-correction system that positively excels in this respect is Anthem Room Correction, which you’ll find, appropriately enough, on preamps made by Anthem, like the AVM 60 (shown at the top of the page). If you have a dedicated home cinema space with acoustically treated walls, Anthem Room Correction is likely all you need to whip your bass into shape and make your subwoofers sound like a million bucks.

If, on the other hand, you have a multi-use home-entertainment space in a living room or family room, your installer may recommend a more sophisticated—and indeed more expensive—preamplifier with a more advanced room-correction solution. That’s because it takes a lot more processing power and a lot more calculations to digitally correct problems that arise from hard or uneven surfaces in the room—like mirrors, windows, cabinets, hardwood floors, etc.—or even standard decorations like vases, coffee tables, and columns along the wall. Since these surfaces are smaller than, say, the entire back wall of your room, they affect smaller wavelengths of sound—hence, higher frequencies.

 

You can attempt to correct for such problems with almost any room-correction system, but the cheaper ones—like you’ll find on most mass-market AV receivers—don’t do a very good job of it, leaving you with a sound system that’s lifeless, dull, and uninspiring.

 

Better, more sophisticated room-correction solutions, though, can go a long way toward erasing the harsh audible effects of such surfaces from the sound that reaches your ears, without making it sound like you’ve thrown a blanket over your head. Examples of such systems include RoomPerfect, which you can find on Lyngdorf’s MP-50 and MP-60 preamplifiers, as well as Trinnov’s Speaker/Room Optimizer, found on the company’s Altitude line of preamps. Your installer may also recommend preamps that rely on Dirac Live room correction, an excellent mid-priced solution.

 

As for amplifiers? Your best bet here is simply to listen to the advice of your installer. You will, of course, need one channel of amplification for every speaker in your system (except perhaps for the subwoofers, which often contain their own amplification), so if you’re installing a 7.2.6-channel system (that’s seven ear-level speakers, two subwoofers, and six overhead speakers), you’ll need at least 13 channels of amplification. That may come in the form of two seven-channel amps, seven stereo amps, or even 13 standalone “monoblock” amplifiers, with each configuration having its own relative pluses and minuses. But again, chances are good your installer is intimately familiar with the speakers going into your system, and knows what amplification will work best.
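That channel-counting arithmetic is easy to sketch. Here’s a small, hypothetical Python helper (the function name and the x.y.z parsing are my own illustration, not any industry tool):

```python
def amp_channels(config: str, powered_subs: bool = True) -> int:
    """Count the amplifier channels needed for an x.y.z speaker layout.

    config is a string like "7.2.6": ear-level speakers, subwoofers,
    and overhead speakers. Subwoofers are skipped when they carry
    their own built-in amplification.
    """
    ear, subs, overhead = (int(n) for n in config.split("."))
    return ear + overhead + (0 if powered_subs else subs)

print(amp_channels("7.2.6"))  # prints 13
print(amp_channels("9.1.6"))  # prints 15
```

Run against a 7.1.4 layout with a passive subwoofer, it also lands on the 12-channel figure that was long the ceiling for immersive audio systems.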

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

The Need for High-End Audio


For me, high-end audio is all about the emotion.

 

Hold that thought for a moment.

 

In a recent column, my friend and colleague Adrienne Maxwell asked, “Do we really need high-end audio?” She outlined many valid reasons as to why the answer may not be “yes.” Certainly, high-end audio would not be at the bedrock of Maslow’s hierarchy of needs. And the path to high-end nirvana can have many challenges.

 

As a consumer, there’s the expense (though one can assemble a wonderfully musical system without spending outrageous sums of money, as Adrienne pointed out), the concerns of system and room matching, the need for proper setup, and the possibility that after investing all that time and money your particular combination of room and gear just might not work well together. (The advice of an expert can be invaluable in avoiding this pitfall.)

As a salesperson or dealer, you have a responsibility to provide your customer with what they want. It goes without saying that this requires skill and insight, not just a desire to earn a big spiff.

 

As a high-end manufacturer, you have to balance the sometimes opposing factors of price, performance, aesthetics, manufacturability, business costs, and market demand. If you’re going all-out on a product that strives for ultimate quality, it will almost certainly carry a high price tag, and the law of diminishing returns will be staring you in the face.

 

And, yes, sometimes a large speaker might cost $30,000 or $100,000 or more. But consider their multiple top-quality drivers, complex-geometry cabinets with expensive woods and finishes, elaborate crossovers, premium parts, and so on. These don’t come cheap, and manufacturers and dealers have to make a profit. And such speakers can outperform other designs, sometimes dramatically so, especially in presence, scale, dynamics and bass extension.

 

As a reviewer, I can attest that properly reviewing high-end audio gear is demanding. Let’s say you’re doing a speaker review. You need to listen using different amps, cables, source components, and even rooms in order to try to factor out what the speaker is doing from what the other equipment is doing.

 

Then there’s the psychological pressure. You have a responsibility to get it right because the stakes with a high-end review are high. Because this gear can be so expensive to produce, a negative review can financially harm a manufacturer, especially a smaller one.

 

So why get involved in high-end audio at all? And, as Adrienne pointed out, what the heck is it even, anyway?

 

There have been many definitions of “high-end audio” over the decades, most defining it as the ability for components or systems to more accurately or convincingly reproduce the sound of music than typical products. Harry Pearson, founder of The Absolute Sound, characterized high-end as the ability to reproduce the sound of real music—the absolute sound—in real space. Certainly, when most think of high-end they think of expensive prices.

 

But, like I said, for me—and for so many others—it’s all about the emotion.

 

A high-end system is one that crosses the line from a mere (even if high-quality) reproducer of sound to one that conveys the emotional impact of music.

 

It’s a system that draws you in and engages you. It makes you forget that you’re listening to reproduced sound and makes a direct connection to your feelings on a primal, soul-deep level.

 

This is an elusive quality. Just ask an audiophile dedicated to the pursuit, or anyone who’s spent hours or days setting up a system at an audio show or a dealer or a customer’s home. A system might sound good, or it might even sound bad, and after painstakingly adjusting speaker placement, cartridge alignment, vibration-isolating feet, room treatment, or what-have-you, there’s ideally a moment when everything comes together and the sound becomes right, locked-in, and, at the best of times, magical.

 

I fervently believe that high-end audio is worth defending, preserving, and encouraging. (Disclaimer: I’m in the high-end audio industry. And let’s set aside considerations of possible overpricing, marketing hype, accusations of “snake oil,” and other frown-inducing aspects for the moment.) High-end audio reflects not only a constant striving for excellence but a noble (if also commercial) effort to bring listeners ever-closer to the music.

 

And when you get that closeness, it’s one of the most wonderful feelings in the world.

Frank Doris

Frank Doris is the chief cook & bottle washer for Frank Doris/Public Relations and works with a number of audio & music industry clients. He’s a professional guitarist and a vinyl enthusiast with multiple turntables and thousands of records.