4K is for Fanboys

I feel as if I might have a reputation around these parts, a heel of sorts. Why a heel and not a hero? Because I find that my opinions are often in opposition to those of my contemporaries. Not because they are wrong, but because I think their focus is continually on things, topics, and ideas that play to a base that, well, is dying.

 

Dennis Burger wrote a terrific piece on why 4K isn’t always 4K. It is a truly good piece of writing and one that gets 99 percent of the argument absolutely correct. As someone who has literally filmed a feature-length film for a motion-picture studio in true 4K only to have it shown in theaters in 2K, I can attest to the article’s validity. But Dennis, like me some years ago, missed the boat by even framing the argument around resolution at all.

 

You see, I thought people/viewers cared about things like resolution. Back in 2008, when I filmed my movie, the original RED ONE cinema camera had just come out, and as a result the “whole world” was clamoring for 4K—or so it seemed. I had the choice of whether to film in 4K via the RED ONE or go with a more known entity by filming in 2K via cameras from Sony’s CineAlta line. Ultimately I chose option C and went with a true dark-horse contender in Dalsa, who up to that point had no cinema pedigree—unless you count designing the sensor tech for the Mars Rover a cinematic endeavor. But I digress.

 

We didn’t use the RED ONE because it was buggier than a roadside motel mattress, and I didn’t side with Sony because they were HD, and HD was yesterday’s news. Filming in 4K via the Dalsa back in 2008 was an absolute pain in the ass. (That’s me with the Dalsa Origin II in the photo at the right.) Spoiler alert: Not much has changed in 2019, as 4K continues to be a bit of a pain; it’s just more accessible, which makes everyone think they need it—more on that in a moment.

 

What is upsetting is that I do know what my need to satiate the consumer-electronics fanboys cost me and my film—a quarter of a million dollars. While $250,000 isn’t much in Hollywood terms, it represented over a quarter of my film’s total budget. The cost of filming in HD, you ask? Less than $30,000. Oh, and post production would’ve taken half the time—thus lowering costs further. All of that headache, backache, and money only to have the film bow in 2K via 4K digital projectors from—wait for it—Sony!

 

Now, I will sort of agree with the assertion that capturing visuals at a higher resolution or quality and downscaling to a lesser format—say, HD—will result in a clearer or better picture—but honestly, only if you’re told ahead of time that that’s what you’re watching. Which brings me to my point: All of this HD vs. 4K talk is for fanboys who insist on watching pixels and specs rather than watching the damn movie. Not one person or journalist (apart from me) wrote about my film in the context of its being the first feature-length film ever filmed entirely in 4K. They didn’t ask about it, nor care, because it doesn’t matter.

 

It never mattered.

 

What digital has done is remove the magic from cinema and replace it with a bunch of numbers that bored middle-aged dudes (yes, dudes) can masturbate over in an attempt to differentiate their lot from the rest. None of it has any bearing on the story, enjoyment, or skill. It’s an arms race, one we all fall prey to, and one we continually perpetuate, because, well, it sells. We’ve gotten away from cinema theory, history, and storytelling in recent years and instead become infatuated with bit-rates, color spaces, and codecs. And yet, in the same breath, so many of us bitch about why there are no good films being made anymore. It’s because the only thing audiences will pay for is what they think is going to look great on their brand new UltraHD TV.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial and branding projects all over the US. He has served as a managing editor and freelance journalist in the AV space for nearly 20 years, writing technical articles, product reviews, and guest speaking on behalf of several notable brands at functions around the world.

Choosing My New Projector

Following up on my last post, “It’s Time to Update My Theater,” I’m going to delve into the thought process that caused me to splurge and finally upgrade my projector.

 

As I mentioned, my existing projector was about 11 years old, and, while it still produced watchable pictures from Blu-ray and DVD discs, it wasn’t compatible with many of the new 4K HDR sources in my system, so we had just stopped using it. I was toying around with ditching both the projector and my current 65-inch Sony flat panel and upgrading to a new 85-inch flat panel.

 

Why 85 inches? Well, that is about the current size limit before you start getting into ridiculously expensive pricing. For under $4,500, you can get a Sony XBR-85X950G flat panel that has been universally reviewed as a fantastic display. This would provide a large-screen image for viewing all the time, not just at night with the lights down. It would also handle HDR signals (and Dolby Vision) far better than a projector at any price could.

 

As this was a significantly cheaper upgrade option, I really considered it, but ultimately decided I would miss the truly large-screen experience of my 115-inch, 2.35:1 screen.

 

We use the projector almost exclusively for movie watching, and having nearly double the screen real estate makes a massive difference and is far more engaging than a direct-view set, even one at 85 inches. (Now, had the 98-inch Sony Z-series TV been a tenth of its price—selling for $7,000 instead of $70,000—that probably would have been my pick.)

 

So, having made the decision to stick with front projection, I had to settle on a model. I had a few criteria going in that helped narrow the search.

 

First, I wanted it to be true, native 4K resolution on the imager, not using any pixel shifting or “wobulation” to “achieve 4K resolution on screen.” This ruled out many of the pixel-shifting models from companies like Epson and Optoma. Nothing against them; I just wanted native 4K.

 

Second, it had to have a throw distance that worked with my current mounting location. This turned out not to be much of a concern, as most modern projectors have an incredibly generous adjustment range on their lenses.

 

Third, I needed a model that offered lens memory so it would work with my multi-aspect screen (92 inches when masked down to 16:9, and 115 inches when opened to full 2.35:1). This allows the projector to zoom, shift, and focus for a variety of screen sizes at the push of a single button, and is crucial for multi-aspect viewing.
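Incidentally, those two diagonals describe a constant-image-height setup, which is exactly the situation lens memory automates. A quick geometry sketch (my own arithmetic, not from the article) shows both aspect ratios share roughly the same image height:

```python
import math

def dimensions(diagonal, aspect):
    """Width and height of a screen, given its diagonal and aspect ratio."""
    height = diagonal / math.sqrt(1 + aspect ** 2)
    return aspect * height, height

w169, h169 = dimensions(92, 16 / 9)    # 16:9 area, masked down
w235, h235 = dimensions(115, 2.35)     # full 2.35:1 area

print(f"16:9   image: {w169:.1f} x {h169:.1f} inches")
print(f"2.35:1 image: {w235:.1f} x {h235:.1f} inches")
# Both heights land at roughly 45 inches, so switching formats only
# requires zooming wider -- the move a lens-memory preset performs.
```

Run it and the heights come out within a tenth of an inch of each other, which is why one button press can handle the whole transition.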

 

Fourth, it needed to integrate with my Control4 automation system. Sure, I could cobble together a driver, but it would never offer integration as tight as one that was meant to work with that particular model.

 

Finally, it had to fit my $10,000 budget. Unfortunately, this ruled out brands like Barco and DPI. I was super impressed with Barco’s Bragi projector, but, alas, it doesn’t fit in my tax bracket.

 

Basically, with these criteria, my search was narrowed to two companies: JVC and Sony. And primarily to two projectors: the JVC DLA-NX7 (shown at the top of the page) and the Sony VPL-VW695ES. (Were my budget higher, I would have added the JVC DLA-NX9 to that list, which has the primary advantage of a much higher-quality, all-glass lens, but it was more than double the price. And while the less expensive JVC DLA-NX5 also met all my criteria, the step-up NX7 offers more bang for just a little more buck.)

 

So, I did what a lot of people do prior to making a big technology purchase: Research. I read a ton of forum posts, read all of the reviews on both models, and watched video comparisons. I also reached out to a couple of professional reviewers and calibrators who had actually had hands-on time with both models.

 

The CEDIA Expo is a place where manufacturers often launch new projectors, so this past month’s show coincided perfectly with my hunt. Since both companies had models that had been launched at CEDIA 2018, I was eager to see what announcements they might have regarding replacements or upgrades. Alas, there were no model changes, which, in a way, can be a good thing, since it means both models are now proven, have had any early bugs worked out with firmware updates, and are readily available and shipping.

 

I really hoped to check out both projectors at the show, but, unfortunately, no one was exhibiting either. (Apparently, CEDIA is not the place to show your sub-$10,000 models.)

 

Ultimately, two announcements at the show swayed me to pull the trigger on the JVC. First, the product manager I spoke with said the price was going up by $1,000 on October 1, so buying sooner rather than later would actually save me money. But more importantly, JVC introduced new firmware at CEDIA that adds a Frame Adapt HDR function, which dynamically analyzes HDR10 picture levels frame by frame and automatically adjusts brightness and color to optimize HDR performance.

 

Projectors historically have a difficult time handling HDR signals, and this firmware is designed to produce the best HDR images from every frame. This used to be achieved by using a high-end outboard video processor such as a Lumagen Radiance Pro, but that would add thousands of dollars to the system. When I saw this new technology demonstrated in JVC’s booth, I was all in.

 

In my next post, I’ll let you know if the purchase was worth it. (Spoiler: It totally was!)

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing for such publications as Residential Systems and Sound & Vision. Follow him on Twitter at @SciaccaTweets and at johnsciacca.com.

It’s Time to Update My Theater

Some views of my home theater space, pre-upgrade

photos by Jim Raycroft

The first home theater component I ever purchased was a subwoofer back in 1995. It was a big 15-inch black cube Definitive Technology model that I drove into San Francisco to buy after researching everything I could find for weeks in all the enthusiast magazines at the time. From there, I bought a Yamaha digital surround decoder and Dolby Digital RF demodulator for a laserdisc player, connected it all to some speakers and a 25-inch Proton tube TV, and voila! I had my first home theater system.

 

It didn’t have a lot of style or elegance, and it certainly wasn’t luxury, but I was on the cutting edge of 5.1-channel technology, and it sounded better than anything my friends had.

 

And I was hooked.

 

Over the years, my system has seen a lot of upgrades, most frequently in the preamp/processor section, as I chase the technology dragon of trying to stay current with surround formats, channel counts, and HDMI processing. (For the record, the 13.1-channel Marantz AV8805 is currently serving processing duties in my rack, and doing a very fine job of it, thank you.)

 

Speakers get upgraded the least often, as a good speaker rarely stops sounding good and, if cared for, rarely breaks. Sources come and go as technology improves. Gone are the VCR and the LaserDisc and DVD players. Currently in use are a Kaleidescape Strato and M500 player, a Samsung UHD Blu-ray player, an Apple TV 4K, a Dish Hopper 3, and a Microsoft Xbox One.

 

Lying in the upgrade middle ground is my system display. Long gone is the 25-inch Proton, having been replaced by a 35-inch Mitsubishi, then a 61-inch Samsung DLP, then a 60-inch Pioneer Elite Plasma. Currently, my primary display is a Sony XBR-65X930D, a 65-inch 4K LED. However, it’s a D-generation model, and Sony is now on G models, so it might be due for replacement next year.

 

One device in my system that has never been upgraded is my video projector.

 

I always wanted a truly big-screen, cinematic experience, and this meant a projector and screen. So I purchased the best projector Marantz made (the VP-11S2, shown below) back in 2008, along with a Panamorph anamorphic lens and motorized sled system. This setup fires onto a Draper MultiView screen that has masking to show either a 92-inch 16:9 image or a 115-inch 2.35:1 Cinemascope image.

 

The first time we dropped the lights, powered on the projector, and lowered the screen, I was ecstatic. I couldn’t believe how lucky I was to have this amazing system in my own home, and we essentially stopped going out to the movies.

 

I continued to feel that way about my projection system for years. It provided an amazing, truly cinematic experience that made me happy literally every time we used it. And use it we did, generally watching two to three movies per week on the big screen.

 

But then, technology moved on.

 

Principally, HDMI went from 1.4 to 2.0, resolution went from 1080p to 4K, and video went from SDR to HDR.

 

While the Marantz still worked, it was now by far the weakest link in my theater chain, and it no longer supported any of the sources we wanted to watch. In fact, just watching a Blu-ray on the system via our Kaleidescape meant going into the Kaleidescape’s Web setup utility and telling the system to “dumb itself down” to output HDMI 1.4 signals. A huge hassle.

 

So, a couple of years ago, we basically stopped using the projector at all.

 

But, some things changed in the projector world at the recent CEDIA Expo in Denver that inspired me to finally make the upgrade plunge, and that’s what I’ll dive into in my next post!

John Sciacca


How to Become an Expert Listener

Recently, I helped my friend Ed set up two audio systems. During the process of dialing them in, I had to walk him through what to listen for in order to hear the improvements because he didn’t know what to focus on in evaluating the sound. It occurred to me that most people don’t.

 

A luxury stereo system or home theater should deliver exceptional sound, of course. But what exactly should you listen for in evaluating, choosing, setting up, and enjoying a high-performance system?

 

(Note: I’m not going to dig deeply here into how to set up various aspects of a system to achieve peak performance, but rather what to listen for.)

 

First of all: A system will only sound as good as its source material. It’s essential to use good demo tracks. Don’t go with a low-bit-rate MP3 file for music listening, for example. Use an audiophile CD or LP, or a high-res download or streaming service.

 

For stereo music evaluation, you can’t go wrong with that stone classic, Pink Floyd’s The Dark Side of the Moon. It’s one of the best recordings ever made, thanks to the brilliant talent of Grammy-winning engineer Alan Parsons. Listing the strengths of this album is like outlining a mini-course in what to listen for:

 

—Deep, articulate bass, a rich midrange, and extended highs

—Accurate timbre of vocals and instruments (except when deliberately processed)

—An expansive sound field

—Wide dynamics, from almost subliminally soft to powerfully loud

—A remarkably clean sonic character.

 

(I’ll expand on each of these various areas below.)

 

A system should have a coherent tonal balance from top to bottom, without any particular frequency range sticking out. You don’t want it to sound too bright in the midrange (roughly the area between 200Hz and 5kHz, where most of the frequencies of the human voice reside) or have weak, recessed bass. With a solo piano recording like Robert Silverman’s superb Chopin’s Last Waltz, listen for the transitions between the low, middle, and high notes, which should be smooth and seamless.

 

Listen for a clear, “transparent” sound with a lot of fine musical detail. The sound should be pure, without any “grain,” hardness, or roughness in texture. (For example, a flute should sound clean and natural, not buzzy or strident or distorted.) Bass should be articulate, not indistinct. The midrange should have plenty of presence, since that’s where most of the music “lives.” Highs should be airy and extended.

 

Subtleties like the “ting” of the triangle in the Fritz Reiner/Chicago Symphony recording of Scheherazade (an example of the upper range) or the reverb on Shelby Lynne’s voice on Just A Little Lovin’ (an example of the midrange) should be clearly audible. Although it’s not all that realistic in terms of spatial positioning of the instruments, Miles Davis’ jazz classic Kind of Blue is excellent for evaluating timbre, resolution, and overall naturalness of sound.

 

For stereo setups, listen for a coherent sound field without a “hole in the middle” (from your speakers being too far apart or not angled in properly) or a lack of imaging and spaciousness (speakers too close together). Depending on the recording, vocals and instruments can be precisely defined in space, left to right and front to back, and the sound field can seem to extend beyond the speakers and maybe even the room. (For some tips on speaker placement, check out these articles from Lifewire and Dynaudio.)

 

However, be aware that on some recordings, especially those from the late 1950s through early 1970s, vocals and instruments can be placed too far off to the left or right. Also, you won’t hear laser-focused pinpoint imaging on a properly miked orchestral recording—because that’s not what things sound like in real life. And keep in mind that changing your listening position will have a significant impact on the sound.

 

I once visited the Harman listening lab in Northridge, California, where they used Tracy Chapman’s “Fast Car” to help determine the differences between speakers. That’s because it’s one of the easiest cuts for people to use in picking out sonic differences.

 

When listening to multichannel movies or music, the sound literally expands, thanks to the addition of center and surround speakers, one or more subwoofers, and, in some installations, height speakers (for example, in a Dolby Atmos system). In fact, Cineluxe has some excellent recommendations for home theater demo material.

 

Listen for a good balance between all the speakers. The surround speakers and subwoofers shouldn’t overly call attention to themselves except when the audio mix warrants it. You should hear a seamless, immersive 360-degree bubble of sound.

 

Dialogue clarity is critical for movies and TV! As such, the performance of the center-channel speaker in a multichannel setup is crucial. (Center-channel volume can be set independently—a very important aspect of home theater system tuning.)

How to Listen—The App

 

I have a confession to make.

 

Instead of writing this post, I could have been lazy and just told you to check out the Harman: How to Listen app. It’s a training course that teaches you how to become a better listener by pointing out various sonic aspects to focus on, such as specific frequency ranges, spatial balances, and other attributes. Check out this post by Harman’s Dr. Sean Olive for more details.

–F.D.

On another note, it’s a good idea to use material you’re familiar with when evaluating a system, even if it’s not “demo quality,” so you can instantly hear the improvements a luxury system can make. I can’t tell you how many times I’ve sat someone in front of my high-end setup, asked them to pick a favorite piece of music, and then heard them say things like, “I can’t believe the difference! I never knew it could sound like that! It sounds like a different recording!”

 

The best advice I can give is to constantly school yourself to become a better listener.

 

Go out and listen to live unamplified music, whether at Carnegie Hall or a friend strumming an acoustic guitar. Get familiar with the sonic nuances of various instruments. Listen to as many audio and home theater systems as possible, at stores, friends’ houses, and audio shows. Listen to the sounds around you—birds, wind, city streets.

 

Good listeners are made, not born.

Frank Doris

Frank Doris is the chief cook & bottle washer for Frank Doris/Public Relations and works with a number of audio & music industry clients. He’s a professional guitarist and a vinyl enthusiast with multiple turntables and thousands of records.

4K is Sometimes Actually 2K–But That’s OK

From time to time in our reviews of 4K/HDR home video releases, you may have stumbled across a phrase that seems downright perplexing: “Taken from a 2K digital intermediate.” It stands to reason, after all, that a video file that has spent some portion of its life at 2K resolution can’t really be considered 4K. Or can it?

 

This can be doubly confusing when the sentence before or after makes note of the film being shot “on ARRIRAW at 6.5K resolution” or something to that effect. That’s a whole lot of different Ks for a film that’s ostensibly being released in 4K (or, more accurately, “Ultra HD”) for home video. So, what exactly does all of this mean? And should you really care?

 

To get to the bottom of these questions, we need to back up and discuss how movies are shot, produced, and distributed. To keep the discussion as simple as possible, we’ll ignore films that are still captured on actual film stock and just focus on digital cinema, since that’s the way most movies (and TV shows) are shot.

 

Depending on the model of camera used, as well as other technical considerations, the resolution captured by these cameras generally ranges between 2K (2,048 x 858 or 2,048 x 1,152) and 6.5K (6,560 x 3,102), with a few other resolutions in between—like 2.8K (2,880 x 1,620) and 3.4K (3,424 x 2,202)—also commonly used. The “K” is short for “thousand,” and the resulting abbreviation is simply a rough approximation of the horizontal resolution of the resulting file.
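As a rough illustration of that naming convention, the labels quoted above can be reproduced by truncating the horizontal pixel count to tenths of a thousand. (These labels are informal industry shorthand, so this is an approximation of the convention, not an official formula.)

```python
import math

def k_label(width):
    """Approximate the 'K' shorthand: horizontal pixels in thousands,
    truncated to one decimal place, with a trailing .0 dropped."""
    k = math.floor(width / 100) / 10
    return f"{k:.1f}K".replace(".0K", "K")

for width in (2048, 2880, 3424, 4096, 6560):
    print(f"{width} px wide -> {k_label(width)}")
# 2048 -> 2K, 2880 -> 2.8K, 3424 -> 3.4K, 4096 -> 4K, 6560 -> 6.5K
```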

 

At any rate, no matter what resolution a film is shot in, the footage has to be reformatted to standard digital cinema projector resolutions, either 2K (2,048 × 1,080) or 4K (4,096 × 2,160), before being distributed to commercial movie theaters. But a lot more than that happens to most films before they’re released. They have to be edited and color timed, and with most blockbusters, special effects have to be rendered and composited into the footage that was shot on-set.

 

This work is time-consuming and expensive, and the higher the resolution at which the work is done, the costlier and more time-consuming it is. As such, due to budget constraints, release schedules, or in some cases simply preference, this work is usually done at 2K (2,048 × 1,080) resolution, the result of which is what we refer to as a 2K digital intermediate. This is the last step in the post-production process for most films, before their conversion to Digital Cinema Distribution Master (DCDM) and Digital Cinema Package (DCP), the latter being the compressed version of the final film sent to movie theaters for public consumption.

 

Sometimes, budget and time allowing, films are finished in a 4K digital intermediate—Black Panther, for example, just to name one recent Hollywood blockbuster. But by and large, the vast majority of effects-driven tentpole films go through the 2K bottleneck during postproduction.

 

Which may lead you to ask why they don’t just shoot the movies in 2K to begin with, if they’re going to be downsampled to 2K anyway. It’s a good question. And the answer isn’t a simple one.

 

But, to simplify it as much as possible, shooting in 6.5K or 3.4K or even 2.8K, then downsampling to 2K, will often result in an image that’s crisper, clearer, and more detailed than an image shot natively in 2K resolution. Ironically, you’ll also find some filmmakers who admit to shooting closeups of actors through filters of one form or another, because the enhanced clarity of shooting in 6.5K or 3.4K or whatever can be somewhat less than flattering, even once the footage is downsampled to 2K. Nevertheless, there are technical advantages to shooting at such high resolutions, even if you and I will never see the original full-resolution footage.
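The cleanup effect of oversampling is easy to see in miniature. Here’s a toy box-filter downscale (my own illustration; real mastering pipelines use far more sophisticated filters) showing how averaging a block of noisy captured samples pulls the result back toward the true value:

```python
def box_downsample(img, factor=2):
    """Downscale a 2-D list of pixel values by averaging each
    factor x factor block -- the simplest possible downscale filter."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Four noisy samples of what should be a flat gray value of 100:
noisy_patch = [[104, 97],
               [99, 101]]
print(box_downsample(noisy_patch))  # [[100.25]] -- the noise averages out
```

Each output pixel in the 2K frame is built from several captured pixels, which is one reason supersampled footage tends to look cleaner than a native 2K capture.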

 

Of course, there’s one other obvious question you may be asking: If all of this imagery has been shrunk down to 2K resolution, and all of the special effects have been rendered in 2K, why not just be honest about it and release the film in 2K? Why make the bogus claim that these home video releases are in 4K?

 

The cheeky answer is that we don’t have a 2K home video format. Digital cinema resolutions and home video resolutions simply don’t match up, for historical reasons that I won’t delve into here. The older high-definition home video format, with its 1,920 x 1,080 pixels, is pretty close to 2K, but it still has about six percent fewer pixels.
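That six percent figure is straightforward to verify from the pixel counts above (my arithmetic, for the curious):

```python
hd = 1920 * 1080       # high-definition home video frame
dci_2k = 2048 * 1080   # digital cinema 2K frame

shortfall = (dci_2k - hd) / dci_2k
print(f"HD has {shortfall * 100:.2f}% fewer pixels than DCI 2K")
# 128 of every 2,048 columns are missing: 6.25 percent
```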

The Oscar-winning Spider-Man: Into the Spider-Verse, which many feel is one of the most visually stunning recent films and a reference-quality 4K HDR release, was created solely in the 2K domain and then upsampled to 4K for distribution

When you get right down to it, though, pixel count is actually one of the least important contributors to perceived image quality, once you get above a certain resolution. High dynamic range (HDR) video and wide color gamut actually play a much greater role in our perception of the quality of the picture. And HD video formats, such as Blu-ray or 1080p downloads and streams, simply don’t support the larger color gamut and higher dynamic range that modern video displays support.

 

For that, we have to step up to Ultra HD, which is colloquially called “4K” by many in our industry, if only because “Ultra HD” is a mouthful. The thing is, most UHD home video displays have a resolution of 3,840 x 2,160—a little less than the digital cinema standard 4K resolution of 4,096 × 2,160. But still, close enough.

 

And here’s the important thing to consider, if you take nothing else away from this long and rambling screed: If you want to enjoy the best that home video has to offer these days, you’re going to be watching your movies (and TV shows) in Ultra HD on an Ultra HD display. Would it be technically possible for Hollywood to release those movies and shows in something closer to 2K resolution, while also delivering HDR and wide color gamut? Sure. It may be contrary to home video format standards, but nothing about that would violate the laws of physics.

 

But why would they? Your display (or your player, or maybe even your AV receiver or preamp) is going to upsample any incoming video to match the resolution of your screen anyway. One way or another, you’re going to be viewing 3,840 x 2,160 pixels. As such, why wouldn’t you want the studios to use their vastly more sophisticated professional video scalers to upsample the resolution before it’s delivered to you via disc, download, or streaming? Those video processors don’t work in real-time, the way the processors built into your player, receiver, or display do. They’re slow, methodical, and do a much better job.

 

So even if the movie you’re enjoying this evening technically passed through a 2K-resolution digital intermediate at some point, that doesn’t mean you’re being duped when you’re sold a “4K/UHD” home video release. You’re still enjoying the most important technical advantages of the Ultra HD format—namely the increased dynamic range and color gamut.

 

Mind you, for David Attenborough nature documentaries and other footage that doesn’t require the addition of special effects, I want a genuine Ultra HD video master, with every possible pixel kept intact. But for big Hollywood blockbusters? I honestly think this whole “Fake 4K” discussion has gotten way out of hand.

 

I’ll leave you with one last thought to consider. This summer’s biggest film, Avengers: Endgame, reportedly had a budget of more than $350 million before marketing costs were factored in. Of that $350-ish million, roughly $100 million went to the visuals, including special effects. Had the film been finished in a 4K digital intermediate instead of a 2K one, you can bet that budget would have been significantly higher. (Remember, the jump from 2K to 4K isn’t a doubling but a quadrupling of pixels, since both the horizontal and vertical resolutions are doubled, and rendering four times as many pixels simply costs a heck of a lot more money and time.)
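The quadrupling is simple arithmetic on the DCI resolutions quoted earlier:

```python
pixels_2k = 2048 * 1080   # DCI 2K frame
pixels_4k = 4096 * 2160   # DCI 4K frame: both axes doubled

print(pixels_2k, pixels_4k, pixels_4k / pixels_2k)
# 2,211,840 vs. 8,847,360 pixels per frame -- exactly 4x the rendering work
```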

 

Would it have been worth it? Well, consider this: The original John Wick film was shot in 2.8K and finished in a 4K digital intermediate, whereas the latest release in the franchise, John Wick 3, was shot in 3.2K and finished in a 2K digital intermediate. I haven’t seen any of these films, but every review I’ve read seems to indicate that the UHD home video release of the third looks noticeably better than that of the first.

 

If 2K digital intermediates were truly the bane of the home cinephile’s existence, this simply wouldn’t be the case. So, when we mention in reviews that an Ultra HD release came from a 2K digital intermediate, we’re not implying that you’re somehow being cheated out of pixels you thought you were paying for when you bought that big new “4K” display. We’re just video geeks being video geeks and pointing out the most pedantic of details. In the few rare cases where it makes a legitimate difference, we’ll point that out explicitly.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

The Current State of the Luxury Audio Art

Steinway Lyngdorf’s P200 surround processor

In my previous post, I talked about the intriguing video trends I came across at the recent custom integrators CEDIA Expo in Denver. While there weren’t as many new developments on the audio side, I did notice a few continuing and developing trends throughout the show that will have an impact on the luxury home cinema market. And, unlike some of the premium video solutions on the horizon, these are all things that can be implemented in a home theater immediately!

HIGHER CHANNEL COUNT

While immersive surround formats such as Dolby Atmos, DTS:X, and Auro-3D are pretty much the de facto standard in newly installed luxury home cinemas, we need to remember that these formats have been available in the home market for only about five years, and until fairly recently the channel count for most of these systems maxed out at 12 in a 7.1.4 configuration (seven ear-level speakers, a subwoofer, and four overhead speakers).

 

But there has been an explosion of systems that support up to 16 channels in a 9.1.6 array, which adds front width speakers at ear level and an additional pair of overhead speakers. While having 15 (or more) speakers in a room might seem excessive, creating a seamless and truly immersive experience in large rooms that have multiple rows of seating requires additional channels to create cohesion between speakers as objects travel around the surround mix.


Companies offering new 16-channel AV receivers and preamp/processors include JBL Synthesis, Arcam, Acurus, Bryston, Emotiva, and Monoprice. Some companies are even pushing the boundaries beyond 16, including StormAudio, Steinway Lyngdorf, Trinnov, JBL Synthesis, and Datasat.

 

 

BETTER BASS IN EVERY SEAT

Three home theater masters—Theo Kalomirakis, Joel Silver, and Anthony Grimani—presented a full-day training course titled “Home Cinema Design Masterclass,” where they discussed best practices in home theater design. Grimani, president of Grimani Systems and someone who has worked on more than 1,000 rooms over his 34-year career, stated that 30% of what people like about an audio system happens between 20 and 100Hz—the bass region. In short, if a system’s bass response and performance aren’t good, the whole system suffers.

 

But low frequencies are difficult to pull off correctly, especially across multiple seating positions, which is the ultimate goal in a luxury cinema. Good bass is possible for multiple listeners, but multiple subwoofers are always needed. Two subs are better than one, three subs are better than two, and four subs are better than three. (But Grimani stated that adding more than four subs brings diminishing returns.)

 

All the best home cinemas feature multiple subwoofers, not for louder bass, as one might think, but for more even bass at every seat. The best theaters deliver slam and impact at the low end, but are also quick and free of bloat, which is what multiple good subs can deliver.

 

 

ROOM CALIBRATION

In that same master class, Tony Grimani also claimed that achieving good bass performance almost always requires the correct use of equalization. Virtually every home theater receiver or processor sold today incorporates some form of room-correction software, either proprietary (like Yamaha’s YPAO or Anthem’s ARC) or a third-party solution like Audyssey. At its simplest, this software employs a microphone to measure tones emitted by the speakers, which are used to calculate the distance from each speaker to the listener as well as to set channel levels. The more advanced systems employ equalization and other types of filters in an attempt to optimize how the room interacts with the signal.
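The distance calculation is simpler than it sounds: sound travels at roughly 343 meters per second at room temperature, so the delay between a test tone leaving the speaker and reaching the microphone translates directly into distance. A minimal sketch, with a function name of my own invention:

```python
SPEED_OF_SOUND = 343.0  # meters per second, at roughly room temperature

def speaker_distance_m(delay_seconds: float) -> float:
    """Estimate speaker-to-microphone distance from the measured arrival delay."""
    return delay_seconds * SPEED_OF_SOUND

# A tone arriving 10 milliseconds after it was emitted traveled about 3.4 meters:
print(round(speaker_distance_m(0.010), 2))  # 3.43
```

Real calibration systems then invert this, applying a delay to the closer speakers so every channel's sound arrives at the listening position at the same instant.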

 

Three of the most revered and powerful room-correction systems all hail from Europe: Trinnov Audio (France), Dirac (Sweden), and Steinway Lyngdorf’s RoomPerfect (Denmark). These systems offer more adjustments, filters, and flexibility than less expensive, more mass-market offerings in order to make any room sound its absolute best. (For more on the importance of room correction, read this post by Dennis Burger.)

 

One of the big developments in room correction featured at the CEDIA Expo was Dirac’s new Live Bass Management module. An add-on to the existing Dirac Live correction system, it will aggregate measurement and location data from multiple subwoofers in a system to determine how best to distribute bass evenly across a room. It will also correct low-frequency sound waves produced by the main speaker pair so they’re in sync with the rest of the system.

 

But just having access to the best room-correction devices isn’t enough, as the best luxury rooms are calibrated by professionals who have been trained in acoustics to the nth degree. This small group of top-tier calibrators travels the world with kits costing tens of thousands of dollars in order to measure, sample, adjust, and tweak the parameters of every speaker and subwoofer in your theater to wring out the very last drop of performance.

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is
co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing
for such publications as
 Residential Systems and Sound & Vision. Follow him on Twitter at

@SciaccaTweets and at johnsciacca.com.

The Need for High-End Audio

For me, high-end audio is all about the emotion.

 

Hold that thought for a moment.

 

In a recent column, my friend and colleague Adrienne Maxwell asked, “Do we really need high-end audio?” She outlined many valid reasons as to why the answer may not be “yes.” Certainly, high-end audio would not be at the bedrock of Maslow’s hierarchy of needs. And the path to high-end nirvana can have many challenges.

 

As a consumer, there’s the expense (though one can assemble a wonderfully musical system without spending outrageous sums of money, as Adrienne pointed out), the concerns of system and room matching, the need for proper setup, and the possibility that after investing all that time and money your particular combination of room and gear just might not work well together. (The advice of an expert can be invaluable in avoiding this pitfall.)

As a salesperson or dealer, you have a responsibility to provide your customer with what they want. It goes without saying that this requires skill and insight, not just a desire to earn a big spiff.

 

As a high-end manufacturer, you have to balance the sometimes opposing factors of price, performance, aesthetics, manufacturability, business costs, and market demand. If you’re going all-out on a product that strives for ultimate quality, it will almost certainly carry a high price tag, and the law of diminishing returns will be staring you in the face.

 

And, yes, sometimes a large speaker might cost $30,000 or $100,000 or more. But consider what goes into such speakers: multiple top-quality drivers, complex-geometry cabinets with expensive woods and finishes, elaborate crossovers, premium parts, and so on. These don’t come cheap, and manufacturers and dealers have to make a profit. And such speakers can outperform other designs, sometimes dramatically so, especially in presence, scale, dynamics, and bass extension.

 

As a reviewer, I can attest that properly reviewing high-end audio gear is demanding. Let’s say you’re doing a speaker review. You need to listen using different amps, cables, source components, and even rooms in order to try to factor out what the speaker is doing from what the other equipment is doing.

 

Then there’s the psychological pressure. You have a responsibility to get it right because the stakes with a high-end review are high. Because this gear can be so expensive to produce, a negative review can financially harm a manufacturer, especially a smaller one.

 

So why get involved in high-end audio at all? And, as Adrienne pointed out, what the heck is it even, anyway?

 

There have been many definitions of “high-end audio” over the decades, most defining it as the ability for components or systems to more accurately or convincingly reproduce the sound of music than typical products. Harry Pearson, founder of The Absolute Sound, characterized high-end as the ability to reproduce the sound of real music—the absolute sound—in real space. Certainly, when most think of high-end they think of expensive prices.

 

But, like I said, for me—and for so many others—it’s all about the emotion.

 

A high-end system is one that crosses the line from a mere (even if high-quality) reproducer of sound to one that conveys the emotional impact of music.

 

It’s a system that draws you in and engages you. It makes you forget that you’re listening to reproduced sound and makes a direct connection to your feelings on a primal, soul-deep level.

 

This is an elusive quality. Just ask an audiophile dedicated to the pursuit, or anyone who’s spent hours or days setting up a system at an audio show or a dealer or a customer’s home. A system might sound good, or it might even sound bad, and after painstakingly adjusting speaker placement, cartridge alignment, vibration-isolating feet, room treatment, or what-have-you, there’s ideally a moment when everything comes together and the sound becomes right, locked-in, and, at the best of times, magical.

 

I fervently believe that high-end audio is worth defending, preserving, and encouraging. (Disclaimer: I’m in the high-end audio industry. And let’s set aside considerations of possible overpricing, marketing hype, accusations of “snake oil,” and other frown-inducing aspects for the moment.) High-end audio reflects not only a constant striving for excellence but a noble (if also commercial) effort to bring listeners ever-closer to the music.

 

And when you get that closeness, it’s one of the most wonderful feelings in the world.

Frank Doris

Frank Doris is the chief cook & bottle washer for Frank Doris/Public Relations and works with a
number of audio & music industry clients. He’s a professional guitarist and a vinyl enthusiast with
multiple turntables and thousands of records.

The 4 Hottest Trends in Luxury Video

Samsung’s The Wall Luxury microLED TV

I had a chance to get a bead on the latest trends in luxury video this past week at the annual CEDIA Expo, the custom integrators' trade show, in Denver. It was great to see that some of the most intriguing products announced at January’s Consumer Electronics Show (CES) are finally becoming real.

 

microLED DISPLAYS

Perhaps the most exciting technology on display was large-screen microLED video panels, which can come in sizes up to 65 feet diagonal. The images on these screens are incredibly bright, have no loss of black level due to ambient lighting, offer incredible contrast, support a wider color gamut, offer superior off-angle viewing, and handle HDR signals far better than front-projection systems.

 

MicroLED systems use small LED tiles, usually little larger than a brick, that snap into a larger matrix to form the full panel. You can later add more tiles to form an even larger screen, and tiles can be replaced as needed. (Most displays ship with extra tiles that have been matched to ensure color uniformity in the picture and facilitate in-field replacement.)

 

MicroLED panels increase resolution by shrinking the pixel pitch, the distance between pixels as measured from the center of one pixel to the center of the next. Reducing that distance makes individual pixels invisible at typical seating distances. Many companies offer panels with a pixel pitch of less than 8mm.
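Whether a given pitch is "invisible" depends on how far away you sit. Assuming the common rule of thumb that the eye resolves about one arcminute of detail, you can estimate the distance beyond which adjacent pixels blend together. This sketch is my own back-of-the-envelope math, not from any manufacturer's literature:

```python
import math

def invisible_distance_m(pitch_mm: float, acuity_arcmin: float = 1.0) -> float:
    """Distance (in meters) beyond which pixels spaced pitch_mm apart can no
    longer be resolved by an eye with the given acuity in arcminutes."""
    angle_rad = math.radians(acuity_arcmin / 60.0)
    return (pitch_mm / 1000.0) / math.tan(angle_rad)

# An 8mm pitch needs a very deep room before the pixels disappear,
# while finer pitches work at living-room distances:
print(round(invisible_distance_m(8.0), 1))  # 27.5
print(round(invisible_distance_m(1.5), 1))  # 5.2
```

That's why the fine-pitch panels carry such a premium: the closer you sit, the smaller the pitch has to be.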

 

The downside? This technology is massively expensive. How massive? Samsung’s 146-inch-diagonal The Wall Luxury (shown above) will retail for $400,000. Need bigger? Sony has you covered with its Crystal LED Display System—previously given the awkward nickname CLEDIS—with a 16 x 9-foot panel (219-inch diagonal) that offers full 4K resolution and 1,000,000:1 contrast, supports high frame rates up to 120 fps, and sells for $877,000. Other manufacturers I spoke with—such as Planar, Barco, and Digital Projection—all offer panels of varying sizes with similar pricing.

 

For the luxury market, this is truly the ultimate solution; but it seems unlikely that microLED will ever reach mainstream pricing.

 

 

LARGE-SCREEN PROJECTION

If you want a screen larger than 90 inches for your luxury theater or media room but don’t want to pay the exorbitant prices commanded by microLED displays, front-projection systems remain the best way to go. Due to limitations in light output, projectors often struggle with HDR signals, which are typically mastered for LED displays capable of producing far brighter images. Improving HDR handling is something projector companies continue working on, and both Sony and JVC rolled out new firmware specifically to address how their projectors process HDR images.

 

JVC’s new Frame Adapt HDR analyzes the peak brightness of each frame using a proprietary algorithm and adjusts dynamic range to provide the best possible HDR image. Frame Adapt HDR works with any HDR10 content, meaning all HDR sources—Kaleidescape Strato, Ultra HD Blu-ray players, Apple TV 4K, Xbox One, etc.—can be enjoyed with greater dynamic range and image quality.

 

Barco displayed a very cool projection solution by using a mirror system and its projection-warping technology to place the projector way off center and hidden out of the way—actually turned sideways in a soffit and firing from the back corner of the room—while still offering a fantastic large-screen image.

 

 

ULTRA-SHORT-THROW PROJECTION

Ultra-short-throw projectors can sit very close to the screen wall—often just inches away—tucked low and out of sight, and can even be completely concealed in cabinetry. Paired with ambient-light-rejecting screens, these projectors produce bright and contrasty images in a typically lit room, meaning they can serve as a TV replacement, giving you 100 to 120 inches of screen that can be enjoyed all the time.

 

Short-throws have typically been priced for the upper end of the market. But at least four companies—LG, Epson, Optoma, and Hisense—now offer 4K laser projectors, usually paired with an appropriate screen and featuring a basic audio system, for under $6,000. This makes them a more attractive option for secondary rooms, like a den or bedroom. 

 

 

8K VIDEO

It seems silly to be talking about 8K video when we aren’t even at a point where broadcast TV—either off-the-air, cable, or satellite—can regularly deliver 4K images, but progress never stops in technology land. Sony, LG, and Samsung all demonstrated 8K displays at the show.

 

Beyond the added pixels—of which there are over 33 million in a 7,680 x 4,320 array—these sets also feature flagship video processing and higher brightness. And it’s these other features that have far more impact on the image than all the extra pixels.
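The pixel math checks out, and it's worth seeing just how lopsided it is: an 8K panel carries exactly four times the pixels of a 4K one. A quick verification:

```python
# UHD pixel grids: width x height
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320

print(pixels_4k)               # 8294400
print(pixels_8k)               # 33177600 ("over 33 million")
print(pixels_8k // pixels_4k)  # 4
```

Quadrupling the pixel count is exactly why native 8K content is so scarce: every frame carries four times the data of 4K before compression.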

 

Of the sets on display, one of the most impressive was LG’s new 88-inch 8K OLED, which delivered truly lifelike images, with amazing color detail and the ultra-deep black levels for which OLED is known. I’m sure they were feeding the set true 8K images, as they had stunning clarity and depth. At $30,000, this set is a true luxury item, but for the viewer who wants the best of the best, this 8K OLED panel won’t fail to impress.

John Sciacca

Probably the most experienced writer on custom installation in the industry, John Sciacca is
co-owner of Custom Theater & Audio in Murrells Inlet, South Carolina, & is known for his writing
for such publications as
 Residential Systems and Sound & Vision. Follow him on Twitter at

@SciaccaTweets and at johnsciacca.com.

Do We Really Need High-End Audio?

In the roughly 17 years that I’ve been an AV reviewer, I’ve covered pretty much every product category. I’ve reviewed video displays, speakers, remote controls, disc players, AV receivers—you name it. And while the products I reviewed covered a wide price range, there was always one category I tried to avoid: High-end audio. Now, I can’t give you an exact price or spec that represented the cutoff where I would pass an audio review opportunity on to someone else. The best way I can quantify “high-end audio” is to say that you know it when you see it. And perhaps that’s part of my concern with it.

 

Eventually my focus moved into the realm of display reviews, and one reason I’m quite comfortable there is that, generally speaking, there are clear, quantifiable steps that distinguish one performance class from another. You can measure black level and contrast, color accuracy, and now HDR peak brightness and accuracy. You can say to someone, “If you really value [this], then you should buy [that].” “If you mostly use your TV to do [this], then you should save your money and get [that].” Of course you’ll run into products that straddle the fence between budget and mid-level, or between mid-level and high-end, which may make it harder to render a final verdict, but those are more the exception than the rule.

 

That wasn’t always the case, though. I first started reviewing displays in the early days of high-definition. There were virtually no budget HDTVs, but there was certainly a high-end realm, inhabited by brands like Mitsubishi, JVC, and Pioneer Elite. Sitting at the very top of the food chain was Runco, maker of the ultimate high-end TVs and projectors. It wasn’t necessarily that Runco displays performed significantly better than other lower-priced options, but they were sold exclusively through dealers that were trained to provide a level of service and support to justify the products’ high-end prices. And that model worked for them. It’s fair to say that Runco owned the luxury market.

 

But then a funny thing happened. Samsung and Vizio came along and proved that you could sell TVs that performed really well for a lot less money. JVC and Epson did the same thing with front projectors. High-definition displays became less of a luxury and more of a commodity, and the brands that couldn’t adapt to this new reality died. One by one the high-end display products just sort of fell away. Even Runco was ultimately purchased by commercial-display company Planar, which tried for a while to keep a presence in the luxury home market but eventually gave up.

 

Sure, names like SIM2 and B&O still exist, but they cater to a very niche market of loyalists. For the most part, the era of the truly exorbitantly priced home video product is dead.

 

That’s not the case in the audio market, at least not to the same extent. This market has faced similar challenges over the past 10 years, as companies like GoldenEar, SVS, and ELAC on the speaker side and Emotiva on the electronics side have proven that you can deliver high-performance audio products for a lot less money.

 

It has certainly been disruptive, forcing some brands out of business and others into the hands of private-equity companies. But big-name audiophile brands like Paradigm, Focal, MartinLogan, Revel, NAD, Anthem, and Marantz are still alive and kicking—and producing great gear at lower price points than ever before.

 

But it raises the question: As the mid-level offerings from these companies get better and better, how can they continue to justify the existence of higher-end lines, especially in the speaker market? How do you quantify the improvement? That has always been my struggle.

 

Sure, you can measure a speaker’s frequency response and sensitivity. You can measure an amp’s power and distortion. There are some performance benchmarks by which to judge a product. But measurements don’t tell the whole story in audio.

 

Personal preference is certainly a valid benchmark. Some people prefer a little fuller bass, a little more prominent midrange, or a more emphasized treble. That’s true of any audio product, no matter the price. (Hey, it’s true in video, too. Some people prefer a less accurate, more exaggerated picture. But unlike with a TV, you can’t offer multiple performance modes in a pair of speakers that will significantly alter the sound profile to appeal to different tastes.)

 

As you move into the truly high-end audio realm, the performance conversation moves away from those basic sonic characteristics that are easily defined and more toward elusive qualities like space, texture, and liquidity—words that often make the more technically minded audio fan bristle. What exactly are we describing there? I’m not even sure what liquidity sounds like.

 

Certainly, build quality and design help to distinguish many high-end products. The use of higher-quality parts. A product that has been hand-assembled, or at least individually inspected and approved. Real-wood cabinets. Automotive-grade custom paint finishes. 

 

But even here you reach a point of diminishing returns on your investment. Some of the most eye-catching speakers I’ve seen at recent trade shows include the Focal Kanta No. 2 ($10,000/pair), the Paradigm Persona 5F ($17,000/pair), and the Revel Performa F228Be ($10,000/pair). For me, these seem like the pinnacle of performance and luxury, so when I see the existence of $65,000/pair or $100,000/pair speakers, my response is: Why? I’ve yet to hear a satisfying answer to this question, which is why high-end audio is still a category I shy away from as a reviewer. I just don’t get it.

 

I also wonder how much longer it can last. The high-end audio market has proven itself more resilient (or maybe just more stubborn) than the high-end video market, but is the end nigh? One audio reviewer I know has mentioned that the trend at many audiophile shows these days is to create products where exoticism, rather than sound quality, is the apparent goal. He sometimes derides these products as “wacky.” Like, if you can’t convince people to buy something expensive, convince them to buy something “unique” instead. This trend might be even worse, but that’s a topic for another day.

Adrienne Maxwell

Adrienne Maxwell has been writing about the home theater industry for longer than she’s
willing to admit. She is currently the 
AV editor at Wirecutter (but her opinions here do not
represent those of Wirecutter or its parent company, The New York Times). Adrienne lives in
Colorado, where she spends far too much time looking at the Rockies and not nearly enough
time being in them.

Why Filmmaker Mode Matters

This week, at an event in Los Angeles, movie director Rian Johnson (Brick, Looper, The Last Jedi) introduced a new feature called Filmmaker Mode, which will appear on select TVs beginning in 2020. This might sound strikingly similar to picture modes you already have on your TV, which go by names like “Cinema” or “Movie.” So what makes this different? “If you like movies,” Johnson said, “then Filmmaker Mode will make movies not look like poo poo.” Those are awfully big words. But, as it turns out, this new mode is actually a very simple enhancement.

 

Every TV already comes with all kinds of modes that have an impact—sometimes negative—on the picture. You might remember that at the end of 2018, Tom Cruise took to Twitter to post a video about the evils of motion smoothing, sometimes referred to as “The Soap Opera Effect.” This technology, which is also known as “motion interpolation” or “motion-compensated frame interpolation,” has been around for years, although it’s usually labeled with some slick marketing term on your TV such as “Auto Motion Plus,” “Clear Motion Rate,” “Action Smoothing,” “Smooth Motion Effect,” “MotionFlow,” “ClearScan,” or “TruMotion.” All of these terms really refer to the same thing: the process of artificially creating frames of video and inserting them in between existing frames in your favorite movies or TV shows in order to reduce motion blur.

 

Reviewers, directors, cinematographers, editors, and cinephiles alike all urge the viewing public to turn off motion smoothing—which is often on by default—and other extra processing layered on by display manufacturers, and to instead set their displays to a basic set of standards meant to reproduce a movie as accurately as possible. But telling people how to defeat the various modes can be difficult and confusing given the kind of inconsistent jargon described above. Even as someone who reviews TVs, I would have to look up “ClearScan” and what it does to know whether I want it on or off. It sounds more like a TSA screening machine than a kind of picture processing.

 

Creatives and enthusiasts have been pushing to keep extra processing out of watching movies at home for as long as there’s been extra processing. But Filmmaker Mode is different, because all of the various forces—the movie creators, the studios, and the display manufacturers—are pushing together.

 

Simply put, Filmmaker Mode preserves the aspect ratio, frame rate, and color of the movie or TV show you’re watching so they match what was seen on the reference monitors used for post production as closely as possible. To do this, it sets the correct color temperature on your display, turns off motion smoothing and other processing like sharpness and noise reduction, and makes sure the image isn’t stretched out.

 

It’s not yet clear how this will be implemented for the user, but, based on what was said at the UHD Alliance event, it will likely be either a dedicated button on the display’s remote or—and this would be ideal—included in the metadata of a disc, stream, or download, so the display would turn on Filmmaker Mode (in other words, turn off all the extra junk) automatically.

 

Filmmaker Mode has been endorsed by Warner Bros., NBCUniversal, Amazon Prime, Vizio, Panasonic, LG, and dozens of household-name movie directors, including Martin Scorsese and Christopher Nolan. Vizio has announced that its 2020 line of smart TVs will include the new mode, and there are rumors that manufacturers are looking into adding it to existing displays through firmware updates. If you’re wondering how that’s possible, most modern UHD/HDR TVs can already do all of the things Filmmaker Mode does—but only if you’re willing to dig through all the menus and dial in a dozen or more settings.

 

The biggest issue facing Filmmaker Mode won’t be getting manufacturers to include it with their products. Similar modes already exist, such as the Netflix Calibrated Mode on Sony displays. The challenge will be educating the public about why they should care enough to push this button (which is why the idea of it being included in metadata is so enticing to me). Or maybe Maverick can post some more videos on Twitter about the Filmmaker Mode button to let people know it’s there.

John Higgins

John Higgins lives a life surrounded by audio. When he’s not writing for Cineluxe, IGN,
or 
Wirecutter, he’s a professional musician and sound editor for TV/film. During his down
time, he’s watching Star Wars or learning from his toddler son, Neil.