Tech

Why Aren’t Enthusiasts Honest (With Themselves)?

I would like to say AV enthusiasts are a weird bunch, but the truth is all enthusiasts of any genre are weird—present company included. One of the things I find most peculiar about enthusiasts of any persuasion is their—ahem, our—incessant need to “lie” to ourselves. What I mean by that is simple: We often lie to ourselves, or convince ourselves, that we require, need, or even have more than we actually do.

 

I have been running a YouTube channel aimed squarely at AV enthusiasts since 2013 (I think), and in all that time one “truth” has remained constant: Everyone claims to need or have more, when in reality they often make do with less. For example, if I talk about or review an AV receiver, one of the common responses I get is, “Does it have [insert some insane request here]?” When I inevitably reply, “No,” the response quickly turns to, “Well, I would’ve bought it, but . . .”

 

Yeah, right.

 

What’s more interesting is that the data YouTube and other services provide creators like me shows just how far from cutting-edge enthusiasts actually are, however cutting-edge they may think they are. More often than not, enthusiasts shop solely on price and not on the features or performance they so dearly covet. Depending on what types of links within my videos they click on, I can quite literally see how they shop for AV gear. And I have to tell you, it’s never how they claim to.

 

More often than not, when enthusiasts click on my links to shop for AV gear, they start by going to the product I talked about. But from there, they go on an exploration of other equipment that I would classify as comparable, but which is almost always less expensive.

They only really buy what I’ve reviewed when it truly is their cheapest option—for example, Crown Audio’s XLS DriveCore 2 amplifiers (shown at right). These amplifiers cost a few hundred dollars each, but put out Krell-like power ratings. It doesn’t hurt that the Crown amplifiers also sound good, but you get my point.

 

All of this data flies in the face of enthusiasts’ public statements that products must offer the Earth, moon, and stars for them to consider purchasing, and that their purchasing decision is always about performance—absolutely.

 

I just don’t understand why we do this to ourselves. There’s no shame in having a $300 AV receiver if a $300 AV receiver gets the job done. There’s no shame in only having a 50-inch TV. I get the desire to keep up with the Joneses, but the reality is the Joneses don’t even have what you think they do, for we’re all the Joneses.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial
and branding projects all over the US. He has served as a managing editor and
freelance journalist in the AV space for nearly 20 years, writing technical articles,
product reviews, and guest speaking on behalf of several notable brands at functions
around the world.

Does a Luxury Cinema Really Need a Projector?


Here’s a pop quiz to start your day with: How big is the TV you see in the image above? If you’re familiar with this specific model (LG’s C9 OLED), the proportions of its pedestal may give you some idea. The rest of you probably think this is an unfair question. You’re trying to look for other clues that could give it away: How tall are those ceilings? How wide is that wall? More importantly, how far away from the screen was the camera when this photo was taken?

 

That’s actually exactly my point. For the record, the image is of a 77-inch display. But if I had told you it was 55, or 65, or even 88 inches, would you have balked? Probably not, because you intuitively understand that a display’s screen size isn’t the beginning and end of the conversation when it comes to how large it actually appears to your eyes. It’s the relationship between the display size and the distance from seat to screen that determines the degree to which an image fills your field of view.

 

Not to pick on my colleague and friend John Sciacca here, but in his recent piece “Rediscovering My Joy for Home Theater,” he says, “Watching movies on a 115-inch screen is incredibly more involving than a 65-inch one.” What John is leaving unsaid there, though, is, “. . . from the same seating distance.” That last bit, that unspoken relationship between seat and screen, was taken for granted in John’s story, because to him it’s obvious. But that fact often gets tossed out the window completely when the gatekeepers of home cinema attempt to discredit the “lowly” TV as a legitimate screen for a proper home entertainment system.

 

I think this outdated perception of projectors as the only valid screens for home cinema systems is probably rooted in the equally outdated notion that commercial cinemas are the gold standard against which the home movie-watching experience should be judged. As I’ve argued in the past, that ship has sailed. 

These days, a few rare and special exceptions aside, commercial cinemas are simply a way for most people to check out the latest Avengers or Star Wars flick before someone else ruins the plot for them. Or maybe they just want to view those big event movies with a few more subwoofers than their home AV systems can accommodate. But I guarantee you that almost none of the people who opt to go to their local movie theater to see the latest blockbusters would tell you that the allure of seeing an image bounced off a big sheet of perforated vinyl was what drew them out of the comforts of their own homes.

 

And mind you, I’m not claiming there aren’t plenty of valid reasons to install a projector at home. In his own media room, John sits roughly 12 feet from his screen, by his own estimation. He also has two kids at home, so movie-watching is often a whole-family experience. For his needs and his lifestyle, yeah, a projector is absolutely the right screen.

 

I, on the other hand, only have to worry about my wife and me. The only other permanent resident is Bruno, our 75-pound pit bull, and more often than not he either leaves the room when we watch movies or curls up in my lap and goes to sleep. We also only sit about six and a half feet from the screen in the main media room. The smallest high-performance home cinema projection screen I’m aware of is an 80-incher that would frankly be too much at that seating distance. A 75-inch display is pretty much perfect for this room, as it takes up a healthy 45.5 degrees of our field of view—a little more than THX’s recommended 36 degrees, but so be it. We’d rather have a bit too much screen than a bit too little. But we don’t want The Last Jedi turning into a tennis match, either.

 

Interestingly enough, John’s 115-inch projection screen, when viewed from 12 feet away, takes up roughly 38.5 degrees of his field of view. In other words, my 75-inch screen looks bigger to me and my wife than his 115-inch projection screen looks to him and his family.

 

Am I bashing John’s choice of screens? Of course not. What works for him works for him, and what works for me works for me. And I’m sure he would agree. Different rooms. Different families. Different viewing habits. Different solutions. Without a doubt, we’re both enjoying a better movie-watching experience than we would at the local cineplex, and his system gives him one big advantage over mine: He gets to watch ultra-widescreen 2.4:1 aspect-ratio films without any letterboxing.

How to Determine Your Viewing Distance

If you want to figure out your screen size based on viewing distance, or vice versa, but without having to wade through technical specs or do any heavy math, click this link.
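
If you’d rather see the math behind that link, it’s just a bit of trigonometry. Here’s a minimal Python sketch using my own numbers from above (the 75-inch screen and the roughly six-and-a-half-foot seating distance); the helper names are purely for illustration:

import math

def screen_width(diagonal_in, aspect=16/9):
    # Width of a screen, in inches, from its diagonal and aspect ratio
    height = diagonal_in / math.sqrt(aspect ** 2 + 1)
    return height * aspect

def viewing_angle(diagonal_in, distance_ft, aspect=16/9):
    # Horizontal field of view, in degrees, from the main seat
    width = screen_width(diagonal_in, aspect)
    distance_in = distance_ft * 12  # feet to inches
    return math.degrees(2 * math.atan((width / 2) / distance_in))

# My media room: a 75-inch 16:9 TV viewed from about 6.5 feet
print(round(viewing_angle(75, 6.5), 1))  # ~45.5 degrees

Swap in your own diagonal, aspect ratio, and seating distance and you can sanity-check any screen-and-seat combination before you buy anything.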

 

In addition to the larger perceptual screen real estate, though, my TV also gives me better black levels, better dynamic range, better peak brightness, and better color uniformity than any two-piece projection system could. And if for whatever reason we ever decided to watch a movie with the lights on, we wouldn’t have to worry about the screen washing out. (Not that we would, mind you. My wife and I prefer to keep any and all distractions to a minimum when watching movies, going so far as to put our mobile phones away or turn them off entirely. I’m just saying that we could leave a light on if we wanted to.)

 

And yet, the naysayers and gatekeepers would have you believe that for whatever reason my viewing experience is subpar. That I would somehow be better served by weaker black levels, middling contrast, lower peak brightness, and worse screen uniformity, simply because that would be a more faithful facsimile of the local cineplex.

 

To which I say this: The New Vision Theatres Chantilly 13 across town isn’t the yardstick by which I judge my movie-watching experience at home anymore. My home cinema system looks better and sounds better, and quite frankly has a better selection of films from which to choose. Granted, if we had a much larger room, or typically invited large groups of friends over to watch movies, a projection screen would likely be a superior alternative to our 75-inch TV on the balance sheet. If we had two or three rows of seating? No question about it—we would need a projector.

 

The beauty of current AV gear, though, is that you don’t have to change your lifestyle or viewing habits to have a better-than-movie-theater experience at home. You can assemble a reference-quality home cinema that conforms to your lifestyle, not the other way around. And if, like me, that means employing a gigantic TV as your screen of choice, you shouldn’t pay much attention to anyone telling you you’re doing it wrong, or that your system doesn’t count as “luxury.” Chances are, they’re trying to sell you something.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast
who somehow also manages to find time for technological passions including high-
end audio, home automation, and video gaming. He lives in the armpit of 
Alabama with
his wife Bethany and their four-legged child Bruno, a 75-pound 
American Staffordshire
Terrier who thinks he’s a Pomeranian.

Rediscovering My Joy for Home Theater


I’d already planned to write a wrap-up post on my journey to get a new projector for my personal home theater, but Andrew Robinson’s recent “4K is for Fanboys” makes the timing of this post even more relevant.

 

As I mentioned in “It’s Time to Update My Theater,” technology had passed my previous Marantz projector by, and it had been quite some time since we had used it. Instead, we just watched our 65-inch TV screen full time. (I know, a first-world problem for sure.) Sure, it was still enjoyable, but it actually curtailed the number of movies we watched. When the projector was in action, we would generally watch two to three movies per week, making an evening of it: dropping the lights and focusing on the big screen. But with the projector out of action, we went to watching two to three movies per month.

 

After the new projector arrived, I couldn’t wait to see it in action. Instead of waiting until I could get some help to properly install the JVC by retrofitting the new cabling required (sending 4K HDR signals upwards of 50 feet is beyond the limits of my old HDMI cable, and I’ve gone to an HDMI-over-fiber solution from FIBBR) and mounting the JVC, I just set it on its box on top of our kitchen counter, strung the FIBBR cable across the floor, did a quick-and-dirty alignment and focus, and settled in to watch a movie on the big screen.

 

And from the opening scene, I was ecstatic with my new purchase. The blacks were deep and cinematic, colors were bright and punchy, edges were sharp and defined, and, blown up to nearly 10 feet, the projector’s 4K image had incredible resolution and detail. For me, this is what true theater-at-home is all about.

 

Watching movies on a 115-inch screen is incredibly more involving than a 65-inch one. And with the projector, it is an active viewing experience, with the lights down and distractions minimized. In the short time I’ve had the new projector—less than two weeks—we’ve already watched seven films with it, and each time I’m giddy that this is something I’m actually able to enjoy in my own home.

 

Coupled with my 7.2.6-channel audio system, movies look and sound as good as virtually any commercial theater.

I’m not a filmmaker as Andrew is, and I’m not a student of film as site editor Mike Gaughn is. I don’t watch movies to dissect framing, composition, or lighting. And I’m sure there are many subtleties, references, and hat tips in films that I’m completely oblivious to. But, the fact is, most times when I go to watch a movie, it’s to relax and enjoy myself. And I’d imagine that’s what most people are looking to do with their home entertainment systems. I’m not looking for Ready Player One to change my world view, or for Alita: Battle Angel to offer a commentary on anything, or for John Wick to teach me any lessons, well, except for maybe on the benefits of rapid mag changes. 

 

I’m looking to sit back with a martini and be entertained for a couple of hours.

 

At the end of the day, unless you are a filmmaker evaluating your work, or a professional film critic getting paid to review the work of others, all of this “home theater stuff” is really just a hobby designed to be fun and enjoyable. And any technology improvement that can help people achieve a better experience—be it 4K, HDR, Dolby Atmos, 3D, or something else—is an improvement in my book.

 

To my eye, 4K HDR films look better, especially when blown up to large sizes. And, to my ear, Dolby Atmos (or DTS:X) soundtracks are more exciting and involving. And if I’m electing to spend my precious time watching something—be it Survivor on broadcast cable, Jack Ryan streaming on Amazon, the latest Star Wars, Avengers, or Pixar entry, or just some new release from the Kaleidescape Store—then I’d like to do so in the highest quality possible.

 

And if that makes me a 4K Fanboy as Andrew suggests, then sign me right up!

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is
co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing
for such publications as
 Residential Systems and Sound & Vision. Follow him on Twitter at

@SciaccaTweets and at johnsciacca.com.

“Apollo 11” Goes 4K

If you’ve read my review of the original HD release of Todd Douglas Miller’s documentary film Apollo 11 from earlier this year, you may recall that it was a bit more of a rant than a proper critique. Not about the film, mind you. Apollo 11 still stands as one of the year’s best cinematic efforts, especially in the more straightforward, less editorial approach it takes in capturing this one monumental moment in history.

 

The rant was instead about the film’s home video release, which was originally HD only, with no mention of a UHD/HDR followup. As I said in that original review, this was doubly troubling because Apollo 11 is among a small handful of films released recently to actually be sourced from a 4K digital intermediate. In fact, its original film elements were scanned at resolutions between 8K and 16K. Given that most modern films, especially Hollywood tentpoles, are finished in 2K digital intermediates and upsampled to 4K for cinematic and home video release, the lack of a UHD option for Apollo 11 was as infuriating as it was puzzling.

 

Thankfully, that mistake has been rectified. Apollo 11 is now available in UHD with HDR on most major video platforms, including disc and Kaleidescape, with the latter being my viewing platform of choice. I know I mentioned purchasing the film in HD via Vudu in my original review, but that purchase doesn’t offer any sort of upgrade path for UHD, the way Kaleidescape does.

 

At any rate, I did a lot of speculation in that first review about the sort of differences I thought UHD would make for this title. And having now viewed it, most of those predictions turned out to be true. UHD does, indeed, reveal a lot of detail that was obscured in the HD release. That makes sense given that the source of so much of this film’s visuals existed in the form of 65mm/70mm archival footage.

 

One of the biggest differences you see when comparing the HD and UHD releases is in the textures of the Saturn V rocket. Ribbing in the first three stages of the rocket that dwindles to nothing in HD is clear and distinct in UHD. The little flag on the side of the rocket is also noticeably crisper, and the stars in its blue field stand out more as individual points of whiteness, rather than fuzzy variations in the value scale.

 

As predicted, the launch of Apollo 11 also massively benefits from HDR grading. The plume of exhaust that billows forth from the rocket shines with such stunning brightness that you almost—almost—want to squint.

 

One thing I didn’t predict, though—which ends up being my favorite aspect of this new HDR grade—is how much warmer and more lifelike the imagery is. In the standard dynamic range color grade of the HD version of the film, there’s an undeniably cooler, bluer cast to the colors that never really bothered me until I saw the warmer HDR version. Indeed, the HDR grade evokes the comforting warmth of the old Kodak stock on which the film was captured in a way the SDR grade simply doesn’t.

 

It’s true that the new UHD presentation does make the grain more pronounced in the middle passage of the film—where 65mm film stock gives way to 35mm and even 16mm footage. That honestly has more to do with the enhanced contrast of this presentation than it does the extra resolution. HD is quite sufficient to capture all the nuances and detail of this lower-quality film. But the boost in contrast does mean that grain pops a little more starkly.

 

This does nothing to detract from the quality of the presentation, though, at least not for me. And even if you do find this lush and organic grain somewhat distracting, I think you’ll agree it’s a small price to pay for the significantly crisper, more detailed, more faithful presentation of the first and third acts.

 

If you haven’t picked up Apollo 11 yet, congratulations—you get to enjoy your first viewing as it should have been presented to begin with. If you already bought the film in HD, I can’t recommend the upgrade to UHD highly enough. Thankfully, for Kaleidescape owners, that upgrade doesn’t mean purchasing the film all over again.

 

It is a shame Universal, the film’s home video distributor, has for whatever reason decided to hold back bonus features. The featurette included with the UHD Blu-ray release, which covers the discovery of the 65mm archival footage, is missing here—although it’s widely available on YouTube at this point (and is embedded above). And only Apple TV owners get access to an exclusive audio commentary. Then again, given how badly the studio fumbled the original home video release, it’s no real surprise that they’ve dropped the ball on making the bonus features widely available.

 

Don’t let that turn you off of the film, though. This is one that belongs in every movie collection, especially now that it’s available in UHD.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast
who somehow also manages to find time for technological passions including high-
end audio, home automation, and video gaming. He lives in the armpit of 
Alabama with
his wife Bethany and their four-legged child Bruno, a 75-pound 
American Staffordshire
Terrier who thinks he’s a Pomeranian.

4K is for Fanboys


I feel as if I might have a reputation around these parts, a heel of sorts. Why a heel and not a hero? Because I find that my opinions are often in opposition to that of my contemporaries. Not because they are wrong, but just because I think their focus is continually on things, topics, and ideas that play to a base that, well, is dying.

 

Dennis Burger wrote a terrific piece on why 4K isn’t always 4K. It is a truly good piece of writing and one that gets 99 percent of the argument absolutely correct. As someone who has literally filmed a feature-length film for a motion-picture studio in true 4K only to have it shown in theaters in 2K, I can attest to the article’s validity. But Dennis, like me some years ago, missed the boat by even framing the argument around resolution at all.

 

You see, I thought people/viewers cared about things like resolution. Back in 2008, when I filmed my movie, the original RED ONE cinema camera had just come out, and as a result the “whole world” was clamoring for 4K—or so it seemed. I had the choice of filming in 4K via the RED ONE or going with a more known entity by filming in 2K via cameras from Sony’s CineAlta line. Ultimately I chose option C and went with a true dark-horse contender in Dalsa, which, up to that point, had no cinema pedigree—unless you count designing the sensor tech for the Mars Rover as a cinematic endeavor. But I digress.

 

We didn’t use the RED ONE because it was buggier than a roadside motel mattress, and I didn’t choose to side with Sony because they were HD, and HD was yesterday’s news. Filming in 4K via the Dalsa back in 2008 was an absolute pain in the ass. (That’s me with the Dalsa Origin II in the photo at the right.) Spoiler alert: Not much has changed in 2019. 4K continues to be a bit of a pain; it’s just more accessible, which makes everyone think they need it. More on that in a moment.

 

What is upsetting is that I do know the monetary difference my need to satiate the consumer-electronics fanboys cost me and my film—a quarter of a million dollars. While $250,000 isn’t much in Hollywood terms, it represented over a quarter of my film’s total budget. The cost of filming in HD, you ask? Less than $30,000. Oh, and post production would’ve taken half the time—thus lowering costs further. All of that headache, backache, and money only to have the film bow in 2K via 4K digital projectors from—wait for it—Sony!

 

Now, I will sort of agree with the assertion that capturing visuals at a higher resolution or quality and downscaling to a lesser format—say, HD—will result in a clearer or better picture—but honestly, only if you’re told ahead of time that that’s what you’re watching. Which brings me to my point: All of this HD vs. 4K talk is for fanboys who insist on watching pixels and specs rather than watching the damn movie. Not one person or journalist (apart from me) wrote about my film in the context of its being the first feature-length film ever filmed entirely in 4K. They didn’t ask about it, nor care, because it doesn’t matter.

 

It never mattered.

 

What digital has done is remove the magic from cinema and replace it with a bunch of numbers that bored middle-aged dudes (yes, dudes) can masturbate over in an attempt to differentiate their lot from the rest. None of it has any bearing on the story, enjoyment, or skill. It’s an arms race, one we all fall prey to, and one we continually perpetuate, because, well, it sells. We’ve gotten away from cinema theory, history, and storytelling in recent years and instead become infatuated with bit-rates, color spaces, and codecs. And yet, in the same breath, so many of us bitch about why there are no good films being made anymore. It’s because the only thing audiences will pay for is what they think is going to look great on their brand new UltraHD TV.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial
and branding projects all over the US. He has served as a managing editor and
freelance journalist in the AV space for nearly 20 years, writing technical articles,
product reviews, and guest speaking on behalf of several notable brands at functions
around the world.

Choosing My New Projector


Following up on my last post, “It’s Time to Update My Theater,” I’m going to delve into the thought process that caused me to splurge and finally upgrade my projector.

 

As I mentioned, my existing projector was about 11 years old, and, while it still produced watchable pictures from Blu-ray and DVD discs, it wasn’t compatible with many of the new 4K HDR sources in my system, so we had just stopped using it. I was toying around with ditching both the projector and my current 65-inch Sony flat panel and upgrading to a new 85-inch flat panel.

 

Why 85 inches? Well, that is about the current size limit before you start getting into ridiculously expensive pricing. For under $4,500, you can get a Sony XBR-85X950G flat-panel that has been universally reviewed as a fantastic display. This would provide a large screen image for viewing all the time, not just at night with the lights down. It would also handle HDR signals (and Dolby Vision) far better than a projector at any price could.

 

As this was a significantly cheaper upgrade option, I really considered it, but ultimately decided I would miss the truly large-screen experience of my 115-inch, 2.35 aspect screen.

 

We use the projector almost exclusively for movie watching, and having nearly double the screen real estate makes a massive difference, and is far more engaging than a direct-view set, even one at 85 inches. (Now, had the 98-inch Sony Z-series TV been a tenth of its price—selling for $7,000 instead of $70,000—that probably would have been my pick.)

 

So, having made the decision to stick with front projection, I had to settle on a model. I had a few criteria going in that helped narrow the search.

 

First, I wanted it to have true, native 4K resolution on the imager, not using any pixel shifting or “wobulation” to “achieve 4K resolution on screen.” This ruled out the pixel-shifting models from companies like Epson and Optoma. Nothing against them, I just wanted native 4K.

 

Second, it had to have a throw distance that worked with my current mounting location. Actually, this isn’t much of a concern anymore, as most modern projectors have an incredibly generous adjustment range on their lenses.

 

Third, I needed a model that offered lens memory so it would work with my multi-aspect screen (92 inches when masked down to 16:9, and 115 inches when opened to full 2.35:1.) This allows the projector to zoom, shift, and focus for a variety of screen sizes at the push of a single button, and is crucial for multi-aspect viewing.

 

Fourth, it needed to integrate with my Control4 automation system. Sure, I could cobble together a driver, but it would never offer integration as tight as one that was meant to work with that particular model.

 

Finally, it had to fit my $10,000 budget. Unfortunately, this ruled out brands like Barco and DPI. I was super impressed with Barco’s Bragi projector, but, alas, it doesn’t fit in my tax bracket.

 

Basically, with these criteria, my search was narrowed to two companies: JVC and Sony. And primarily to two projectors: The JVC DLA-NX7 (shown at the top of the page) and the Sony VPL-VW695ES. (Were my budget higher, I would have added the JVC DLA-NX9 to that list, which has the primary advantage of a much higher quality, all-glass lens, but it was more than double the price. And while the less expensive JVC DLA-NX5 also met all my criteria, the step up NX7 offers more bang for just a little more buck.)

 

So, I did what a lot of people do prior to making a big technology purchase: Research. I read a ton of forum posts, read all of the reviews on both models, and watched video comparisons. I also reached out to a couple of professional reviewers and calibrators who had actually had hands-on time with both models.

 

The CEDIA Expo is a place where manufacturers often launch new projectors, so this past month’s show coincided perfectly with my hunt. Since both companies had models that had been launched at CEDIA 2018, I was eager to see what announcements they might have regarding replacements or upgrades. Alas, there were no model changes, which, in a way, can be a good thing, since it means both models are now proven, have had any early bugs worked out with firmware updates, and  are readily available and shipping.

 

I really hoped to check out both projectors at the show, but, unfortunately, no one was exhibiting either. (Apparently, CEDIA is not the place to show your sub-$10,000 models.)

 

Ultimately, two announcements at the show swayed me to pull the trigger on the JVC. First, the product manager I spoke with said the price was going up by $1,000 on October 1, so buying sooner than later would actually save me money. But more importantly, JVC introduced new firmware at CEDIA that would add a Frame Adapt HDR function that will dynamically analyze HDR10 picture levels frame by frame, automatically adjusting the brightness and color to optimize HDR performance for each frame.

 

Projectors historically have a difficult time handling HDR signals, and this firmware is designed to produce the best HDR images from every frame. This used to be achieved by using a high-end outboard video processor such as a Lumagen Radiance Pro, but that would add thousands of dollars to the system. When I saw this new technology demonstrated in JVC’s booth, I was all in.
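
To be clear about what “frame by frame” means in practice: rather than assuming the title’s mastering peak for the whole movie, a dynamic tone mapper looks at how bright each frame actually gets and only compresses what it must. The sketch below is not JVC’s actual Frame Adapt HDR algorithm (or Lumagen’s), just a generic, simplified illustration of the idea; the 150-nit peak and the knee point are arbitrary example values:

def tone_map_frame(pixel_nits, display_peak=150.0, knee_ratio=0.75):
    # Per-frame tone mapping: levels below the knee pass through untouched,
    # and the rest of the frame's range is compressed into the headroom
    # between the knee and the display's peak brightness.
    frame_peak = max(pixel_nits)
    if frame_peak <= display_peak:
        return list(pixel_nits)  # dark frame: nothing needs to be compressed
    knee = knee_ratio * display_peak
    mapped = []
    for level in pixel_nits:
        if level <= knee:
            mapped.append(level)
        else:
            mapped.append(knee + (level - knee) * (display_peak - knee) / (frame_peak - knee))
    return mapped

# A dim interior scene is left alone; a frame with a 1,000-nit highlight is
# rolled off so it still fits within the projector's capability.
print(tone_map_frame([0.05, 12, 48, 110]))
print(tone_map_frame([0.05, 60, 450, 1000]))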

 

In my next post, I’ll let you know if the purchase was worth it. (Spoiler: It totally was!)

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is
co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing
for such publications as
 Residential Systems and Sound & Vision. Follow him on Twitter at

@SciaccaTweets and at johnsciacca.com.

It’s Time to Update My Theater

Some views of my home theater space, pre-upgrades

photos by Jim Raycroft

The first home theater component I ever purchased was a subwoofer back in 1995. It was a big 15-inch black cube Definitive Technology model that I drove into San Francisco to buy after researching everything I could find for weeks in all the enthusiast magazines at the time. From there, I bought a Yamaha digital surround decoder and Dolby Digital RF demodulator for a laserdisc player, connected it all to some speakers and a 25-inch Proton tube TV, and voila! I had my first home theater system.

 

It didn’t have a lot of style or elegance, and it certainly wasn’t luxury, but I was on the cutting edge of 5.1-channel technology, and it sounded better than anything my friends had.

 

And I was hooked.

 

Over the years, my system has seen a lot of upgrades, most frequently in the preamp/processor section, as I chase the technology dragon of trying to stay current with surround formats, channel counts, and HDMI processing. (For the record, the 13.1-channel Marantz AV8805 is currently serving processing duties in my rack, and doing a very fine job of it, thank you.)

 

Speakers get upgraded the least often, as a good speaker rarely stops sounding good, and, if cared for, rarely breaks. Sources come and go as technology improves. Gone are the VCR and the LaserDisc and DVD players. Currently in use are a Kaleidescape Strato and M500 player, Samsung UHD Blu-ray player, Apple TV 4K, Dish Hopper 3, and Microsoft Xbox One.

 

Lying in the upgrade middle ground is my system’s display. Long gone is the 25-inch Proton, having been replaced by a 35-inch Mitsubishi, then a 61-inch Samsung DLP, then a 60-inch Pioneer Elite Plasma. Currently, my primary display is a Sony XBR-65X930D, a 65-inch 4K LED. However, it’s a D-generation model, and Sony is now on its G models, so it might be due for replacement next year.

 

One device in my system that has never been upgraded is my video projector.

 

I always wanted a truly big-screen, cinematic experience, and this meant a projector and screen. So I purchased the best projector Marantz made (the VP-11S2, shown below) back in 2008, along with a Panamorph anamorphic lens and motorized sled system. This setup fires onto a Draper MultiView screen that has masking to show either a 92-inch 16:9 image or a 115-inch 2.35:1 Cinemascope image.
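
If you’re wondering how a single screen pulls double duty: the two configurations work out to essentially the same image height, roughly 45 inches, so the masking only changes the visible width. Here’s a quick back-of-the-envelope sketch of that geometry, using the screen sizes above:

import math

def width_and_height(diagonal_in, aspect):
    # Image width and height, in inches, from diagonal and aspect ratio
    height = diagonal_in / math.sqrt(aspect ** 2 + 1)
    return height * aspect, height

print(width_and_height(92, 16 / 9))  # ~ (80.2, 45.1) -- masked down to 16:9
print(width_and_height(115, 2.35))   # ~ (105.8, 45.0) -- full 2.35:1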

 

The first time we dropped the lights, powered on the projector, and lowered the screen, I was ecstatic. I couldn’t believe how lucky I was to have this amazing system in my own home, and we essentially stopped going out to the movies.

 

I continued to feel that way about my projection system for years. It provided an amazing, truly cinematic experience that made me happy literally every time we used it. And use it we did, generally watching two to three movies per week on the big screen.

 

But then, technology moved on.

 

Principally, HDMI went from 1.4 to 2.0, resolution went from 1080p to 4K, and video went from SDR to HDR.

 

While the Marantz still worked, it was now by far the weakest link in my theater chain, and it no longer supported any of the sources we wanted to watch. In fact, just watching a Blu-ray on the system via our Kaleidescape meant going into the Kaleidescape’s Web setup utility and telling the system to “dumb itself down” to output HDMI 1.4 signals. A huge hassle.

 

So, a couple of years ago, we basically stopped using the projector at all.

 

But, some things changed in the projector world at the recent CEDIA Expo in Denver that inspired me to finally make the upgrade plunge, and that’s what I’ll dive into in my next post!

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is
co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing
for such publications as
 Residential Systems and Sound & Vision. Follow him on Twitter at

@SciaccaTweets and at johnsciacca.com.

How to Become an Expert Listener


Recently, I helped my friend Ed set up two audio systems. During the process of dialing them in, I had to walk him through what to listen for in order to hear the improvements because he didn’t know what to focus on in evaluating the sound. It occurred to me that most people don’t.

 

A luxury stereo system or home theater should deliver exceptional sound, of course. But what exactly should you listen for in evaluating, choosing, setting up, and enjoying a high-performance system?

 

(Note: I’m not going to dig deeply here into how to set up various aspects of a system to achieve peak performance, but rather what to listen for.)

 

First of all: A system will only sound as good as its source material. It’s essential to use good demo tracks. Don’t go with a low-bit-rate MP3 file for music listening, for example. Use an audiophile CD or LP, or a high-res download or streaming service.

 

For stereo music evaluation, you can’t go wrong with that stone classic, Pink Floyd’s The Dark Side of the Moon. It’s one of the best recordings ever made, thanks to the brilliant talent of Grammy-winning engineer Alan Parsons. Listing the strengths of this album is like outlining a mini-course in what to listen for:

 

—Deep, articulate bass, a rich midrange, and extended highs

—Accurate timbre of vocals and instruments (except when deliberately processed)

—An expansive sound field

—Wide dynamics, from almost subliminally soft to powerfully loud

—A remarkably clean sonic character.

 

(I’ll expand on each of these various areas below.)

 

A system should have a coherent tonal balance from top to bottom, without any particular frequency range sticking out. You don’t want it to sound too bright in the midrange (roughly the area between 200Hz and 5kHz, where most of the frequencies of the human voice reside) or have weak, recessed bass. With a solo piano recording like Robert Silverman’s superb Chopin’s Last Waltz, listen for the transitions between the low, middle, and high notes, which should be smooth and seamless.

 

Listen for a clear, “transparent” sound with a lot of fine musical detail. The sound should be pure, without any “grain,” hardness, or roughness in texture. (For example, a flute should sound clean and natural, not buzzy or strident or distorted.) Bass should be articulate, not indistinct. The midrange should have plenty of presence, since that’s where most of the music “lives.” Highs should be airy and extended.

 

Subtleties like the “ting” of the triangle in the Fritz Reiner/Chicago Symphony recording of Scheherazade (an example of the upper range) or the reverb on Shelby Lynne’s voice on Just A Little Lovin’ (an example of the midrange) should be clearly audible. Although it’s not all that realistic in terms of spatial positioning of the instruments, Miles Davis’ jazz classic Kind of Blue is excellent for evaluating timbre, resolution, and overall naturalness of sound.

For stereo setups, listen for a coherent sound field without a “hole in the middle” (from your speakers being too far apart or not angled in properly) or a lack of imaging and spaciousness (speakers too close together). Depending on the recording, vocals and instruments can be precisely defined in space, left to right and front to back, and the sound field can seem to extend beyond the speakers and maybe even the room. (For some tips on speaker placement, check out these articles from Lifewire and Dynaudio.)

 

However, be aware that on some recordings, especially those from the late 1950s through early 1970s, vocals and instruments can be placed too far off to the left or right. Also, you won’t hear laser-focused pinpoint imaging on a properly miked orchestral recording—because that’s not what things sound like in real life. And keep in mind that changing your listening position will have a significant impact on the sound.

 

I once visited the Harman listening lab in Northridge, California, where they used Tracy Chapman’s “Fast Car” to help determine the differences between speakers. That’s because it’s one of the easiest cuts for people to use in picking out sonic differences.

 

When listening to multichannel movies or music, the sound literally expands, thanks to the addition of center and surround speakers, one or more subwoofers, and, in some installations, height speakers (for example, in a Dolby Atmos system). In fact, Cineluxe has some excellent recommendations for home theater demo material.

 

Listen for a good balance between all the speakers. The surround speakers and subwoofers shouldn’t overly call attention to themselves except when the audio mix warrants it. You should hear a seamless, immersive 360-degree bubble of sound.

 

Dialogue clarity is critical for movies and TV! As such, the performance of the center-channel speaker in a multichannel setup is crucial. (Center-channel volume can be set independently—a very important aspect of home theater system tuning.)

How to Listen—The App

 

I have a confession to make.

 

Instead of writing this post,  I could have been lazy and just told you to check out the Harman: How to Listen app. It’s a training course that teaches you how to become a better listener by pointing out various sonic aspects to focus on, such as specific frequency ranges, spatial balances, and other attributes. Check out this post by Harman’s Dr. Sean Olive for more details.

–F.D.

On another note, it’s a good idea to use material you’re familiar with when evaluating a system, even if it’s not “demo quality,” so you can instantly hear the improvements a luxury system can make. I can’t tell you how many times I’ve sat someone in front of my high-end setup, asked them to pick a favorite piece of music, and then heard them say things like, “I can’t believe the difference! I never knew it could sound like that! It sounds like a different recording!”

 

The best advice I can give is to constantly school yourself to become a better listener.

 

Go out and listen to live unamplified music, whether at Carnegie Hall or a friend strumming an acoustic guitar. Get familiar with the sonic nuances of various instruments. Listen to as many audio and home theater systems as possible, at stores, friends’ houses, and audio shows. Listen to the sounds around you—birds, wind, city streets.

 

Good listeners are made, not born.

Frank Doris

Frank Doris is the chief cook & bottle washer for Frank Doris/Public Relations and works with a
number of audio & music industry clients. He’s a professional guitarist and a vinyl enthusiast with
multiple turntables and thousands of records.

4K is Sometimes Actually 2K–But That’s OK


From time to time in our reviews of 4K/HDR home video releases, you may have stumbled across a phrase that seems downright perplexing: “Taken from a 2K digital intermediate.” It stands to reason, after all, that a video file that has spent some portion of its life at 2K resolution can’t really be considered 4K. Or can it?

 

This can be doubly confusing when the sentence before or after makes note of the film being shot “on ARRIRAW at 6.5K resolution” or something to that effect. That’s a whole lot of different Ks for a film that’s ostensibly being released in 4K (or, more accurately “Ultra HD”) for home video. So, what exactly does all of this mean? And should you really care?

 

To get to the bottom of these questions, we need to back up and discuss how movies are shot, produced, and distributed. To keep the discussion as simple as possible, we’ll ignore films that are still captured on actual film stock and just focus on digital cinema, since that’s the way most movies (and TV shows) are shot.

 

Depending on the model of camera used, as well as other technical considerations, the resolution captured by these cameras generally ranges between 2K (2,048 x 858 or 2,048 x 1,152) and 6.5K (6,560 x 3,102), with a few other resolutions in between—like 2.8K (2,880 x 1,620) and 3.4K (3,424 x 2,202)—also commonly used. The “K” is short for “thousand,” and the resulting abbreviation is simply a rough approximation of the horizontal resolution of the resulting file.

 

At any rate, no matter what resolution a film is shot in, the footage has to be reformatted to standard digital cinema projector resolutions, either 2K (2,048 × 1,080) or 4K (4,096 × 2,160), before being distributed to commercial movie theaters. But a lot more than that happens to most films before they’re released. They have to be edited and color timed, and with most blockbusters, special effects have to be rendered and composited into the footage that was shot on-set.

 

This work is time-consuming and expensive, and the higher the resolution at which the work is done, the costlier and more time-consuming it is. As such, due to budget constraints, release schedules, or in some cases simply preference, this work is usually done at 2K (2,048 × 1,080) resolution, the result of which is what we refer to as a 2K digital intermediate. This is the last step in the post-production process for most films, before their conversion to Digital Cinema Distribution Master (DCDM) and Digital Cinema Package (DCP), the latter being the compressed version of the final film sent to movie theaters for public consumption.

 

Sometimes, budget and time allowing, films are finished in a 4K digital intermediate—Black Panther, for example, just to name one recent Hollywood blockbuster. But by and large, the vast majority of effects-driven tentpole films go through the 2K bottleneck during postproduction.

 

Which may lead you to ask why they don’t just shoot the movies in 2K to begin with, if they’re going to be downsampled to 2K. It’s a good question. And the answer isn’t a simple one.

 

But, to simplify it as much as possible, shooting in 6.5K or 3.4K or even 2.8K, then downsampling to 2K, will often result in an image that’s crisper, clearer, and more detailed than an image shot natively in 2K resolution. Ironically, you’ll also find some filmmakers who admit to shooting closeups of actors through filters of one form or another because the enhanced clarity of shooting in 6.5K or 3.4K or whatever can be somewhat less than flattering, even once the footage is downsampled to 2K. Nevertheless, there are technical advantages to shooting at such high resolutions, even if you and I will never see the original full-resolution footage.

 

Of course, there’s one other obvious question you may be asking: If all of this imagery has been shrunk down to 2K resolution, and all of the special effects have been rendered in 2K, why not just be honest about it and release the film in 2K? Why make the bogus claim that these home video releases are in 4K?

 

The cheeky answer is that we don’t have a 2K home video format. Digital cinema resolutions and home video resolutions simply don’t match up for historical reasons that I won’t delve into here. The older high-definition home video format, with its 1,920 x 1,080 pixels, is pretty close to 2K, but it’s still about six percent fewer pixels.


The Oscar-winning Spider-Man: Into the Spider-Verse, which many feel is one of the most visually stunning recent films and a reference-quality 4K HDR release, was created solely in the 2K domain and then upsampled to 4K for distribution.

When you get right down to it, though, pixel count is actually one of the least important contributors to perceived image quality, once you get above a certain resolution. High dynamic range (HDR) video and wide color gamut actually play a much greater role in our perception of the quality of the picture. And HD video formats, such as Blu-ray or 1080p downloads and streams, simply don’t support the larger color gamut and higher dynamic range that modern video displays support.

 

For that, we have to step up to Ultra HD, which is colloquially called “4K” by many in our industry, if only because “Ultra HD” is a mouthful. The thing is, most UHD home video displays have a resolution of 3,840 x 2,160—a little less than the digital cinema standard 4K resolution of 4,096 × 2,160. But still, close enough.

 

And here’s the important thing to consider, if you take nothing else away from this long and rambling screed: If you want to enjoy the best that home video has to offer these days, you’re going to be watching your movies (and TV shows) in Ultra HD on an Ultra HD display. Would it be technically possible for Hollywood to release those movies and shows in something closer to 2K resolution, while also delivering HDR and wide color gamut? Sure. It may be contrary to home video format standards, but nothing about that would violate the laws of physics.

 

But why would they? Your display (or your player, or maybe even your AV receiver or preamp) is going to upsample any incoming video to match the resolution of your screen anyway. One way or another, you’re going to be viewing 3,840 x 2,160 pixels. As such, why wouldn’t you want the studios to use their vastly more sophisticated professional video scalers to upsample the resolution before it’s delivered to you via disc, download, or streaming? Those video processors don’t work in real-time, the way the processors built into your player, receiver, or display do. They’re slow, methodical, and do a much better job.

 

So even if the movie you’re enjoying this evening technically passed through a 2K-resolution digital intermediate at some point, that doesn’t mean you’re being duped when you’re sold a “4K/UHD” home video release. You’re still enjoying the most important technical advantages of the Ultra HD format—namely the increased dynamic range and color gamut.

 

Mind you, for David Attenborough nature documentaries and other footage that doesn’t require the addition of special effects, I want a genuine Ultra HD video master, with every possible pixel kept intact. But for big Hollywood blockbusters? I honestly think this whole “Fake 4K” discussion has gotten way out of hand.

 

I’ll leave you with one last thought to consider. This summer’s biggest film, Avengers: Endgame, reportedly had a budget of more than $350 million before marketing costs were factored in. Of that $350-ish million, roughly $100 million went to the visuals, including special effects. Had the film been finished in a 4K digital intermediate instead of a 2K one, you can bet that budget would have been significantly higher. (Remember, the jump from 2K to 4K isn’t a doubling, but rather a quadrupling of pixels, since both the horizontal and vertical resolutions are doubled, and rendering four times as many pixels simply costs a heck of a lot more money and time.)
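
If you want to check that quadrupling for yourself, the arithmetic is simple using the DCI container resolutions mentioned earlier:

# Pixel counts for the standard digital cinema containers
dci_2k = 2048 * 1080  # 2,211,840 pixels per frame
dci_4k = 4096 * 2160  # 8,847,360 pixels per frame

print(dci_4k / dci_2k)  # 4.0 -- four times as many pixels to render, per frame

Every effects shot has to be rendered at that higher pixel count, which is where the extra time and money go.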

 

Would it have been worth it? Well, consider this: The original John Wick film was shot in 2.8K and finished in a 4K digital intermediate, whereas the latest release in the franchise, John Wick 3, was shot in 3.2K and finished in a 2K digital intermediate. I haven’t seen any of these films, but every review I’ve read seems to indicate that the UHD home video release of the third looks noticeably better than the first.

 

If 2K digital intermediates were truly the bane of the home cinephile’s existence, this simply wouldn’t be the case. So, when we mention in reviews that an Ultra HD release came from a 2K digital intermediate, we’re not implying that you’re somehow being cheated out of pixels you thought you were paying for when you bought that big new “4K” display. We’re just video geeks being video geeks and pointing out the most pedantic of details. In the few rare cases where it makes a legitimate difference, we’ll point that out explicitly.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast
who somehow also manages to find time for technological passions including high-
end audio, home automation, and video gaming. He lives in the armpit of 
Alabama with
his wife Bethany and their four-legged child Bruno, a 75-pound 
American Staffordshire
Terrier who thinks he’s a Pomeranian.

The Current State of the Luxury Audio Art


Steinway Lyngdorf’s P200 surround processor

In my previous post, I talked about the intriguing video trends I came across at the recent custom integrators CEDIA Expo in Denver. While there weren’t as many new developments on the audio side, I did notice a few continuing and developing trends throughout the show that will have an impact on the luxury home cinema market. And, unlike some of the premium video solutions on the horizon, these are all things that can be implemented in a home theater immediately!

HIGHER CHANNEL COUNT

While immersive surround systems such as Dolby Atmos, DTS:X, and Auro3D are pretty much de facto in newly installed luxury home cinemas, we need to remember that these formats have been available in the home market for only about five years, and until fairly recently the channel count for most of these systems maxed out at 12 in a 7.1.4 configuration (seven ear-level speakers, a subwoofer, and four overhead speakers).

 

But there has been an explosion of systems that support up to 16 channels in a 9.1.6 array, which adds front width speakers at ear level and an additional pair of overhead speakers. While having 15 (or more) speakers in a room might seem excessive, creating a seamless and truly immersive experience in large rooms that have multiple rows of seating requires additional channels to create cohesion between speakers as objects travel around the surround mix.


Companies offering new 16-channel AV receivers and preamp/processors include JBL Synthesis, Arcam, Acurus, Bryston, Emotiva, and Monoprice. Some companies are even pushing the boundaries beyond 16, including StormAudio, Steinway Lyngdorf, Trinnov, JBL Synthesis, and Datasat.

 

 

BETTER BASS IN EVERY SEAT

Three home theater masters—Theo Kalomirakis, Joel Silver, and Anthony Grimani—presented a full-day training course titled “Home Cinema Design Masterclass,” where they discussed best practices in home theater design. Grimani, president of Grimani Systems and someone who has worked on more than 1,000 rooms over his 34-year career, stated that 30% of what people like about an audio system happens between 20 and 100Hz—the bass region. In short, if a system’s bass response and performance aren’t good, the whole system suffers.

 

But low frequencies are difficult to pull off correctly, especially across multiple seating positions, which is the ultimate goal in a luxury cinema. Good bass is possible for multiple listeners, but multiple subwoofers are always needed. Two subs are better than one, three subs are better than two, and four subs are better than three. (But Grimani stated that adding more than four subs actually has diminishing returns.)

 

All the best home cinemas feature multiple subwoofers, not for louder bass, as one might think, but for more even bass at every seat. The best theaters deliver slam and impact at the low-end, but are also quick and free of bloat, which is what multiple good subs can deliver.

 

 

ROOM CALIBRATION

In that same master class, Tony Grimani also claimed that achieving good bass performance almost always requires the correct use of equalization. Virtually every home theater receiver or processor sold today incorporates some form of room-correction software, either proprietary, like Yamaha’s YPAO or Anthem’s ARC, or a third-party solution like Audyssey. At its simplest, these software systems employ a microphone to measure tones emitted by the speakers, which are used to calculate the distance from each speaker to the listener as well as to set channel levels. The more advanced systems employ equalization and other types of filters in an attempt to optimize how the room interacts with the signal.
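
As a rough illustration of that simplest level (turning a measured tone delay into a speaker distance, and a measured playback level into a channel trim), here’s a minimal sketch. The speed-of-sound figure is standard physics, but the example measurements and the 75dB target are placeholders of my own; real systems like YPAO, ARC, Audyssey, Dirac, Trinnov, and RoomPerfect go much further, layering measurement-derived EQ filters on top:

SPEED_OF_SOUND_FT_PER_MS = 1.13  # roughly 1,130 feet per second at room temperature

def speaker_distance_ft(arrival_delay_ms):
    # Estimate the seat-to-speaker distance from how long the test tone
    # took to reach the calibration microphone
    return arrival_delay_ms * SPEED_OF_SOUND_FT_PER_MS

def channel_trim_db(measured_spl_db, target_spl_db=75.0):
    # Level adjustment needed to bring this channel up (or down) to the
    # common reference level at the listening position
    return target_spl_db - measured_spl_db

# Hypothetical measurements for a front left speaker
print(round(speaker_distance_ft(9.7), 1))  # ~11.0 feet
print(round(channel_trim_db(72.5), 1))     # +2.5 dB of trim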

 

Three of the most revered and powerful room-correction systems all hail from Europe: Trinnov Audio (France), Dirac (Sweden), and Steinway Lyngdorf’s RoomPerfect (Denmark). These systems offer more adjustments, filters, and flexibility than less expensive, more mass-market offerings in order to make any room sound its absolute best. (For more on the importance of room correction, read this post by Dennis Burger.)

 

One of the big developments in room correction featured at the CEDIA Expo was Dirac’s new Live Bass Management module. An add-on to the existing Dirac Live correction system, it will aggregate measurement and location data from multiple subwoofers in a system to determine how best to distribute bass evenly across a room. It will also correct low-frequency sound waves produced by the main speaker pair so they’re in sync with the rest of the system.

 

But just having access to the best room-correction devices isn’t enough, as the best luxury rooms are calibrated by professionals who have been trained in acoustics to the Nth degree. This small group of top-tier calibrators travels the world with kits costing tens of thousands of dollars in order to measure, sample, adjust, and tweak the parameters of every speaker and subwoofer in your theater to wring out the very last drop of performance.

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is
co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing
for such publications as
 Residential Systems and Sound & Vision. Follow him on Twitter at

@SciaccaTweets and at johnsciacca.com.