
Ep. 9: New Frontiers in Content & Compression


Episode 9 opens with hosts Michael Gaughn & Dennis Burger talking about Dennis’s piece
on the surprisingly high quality of 4K streaming when watched using the right device.

 

At 6:18, Cineluxe contributor Andrew Robinson joins Mike & Dennis to discuss how Netflix
might be a threat to both the TV networks & the movie studios, and why the really innovative
programming isn’t happening on Netflix but on YouTube.

 

At 33:22, Cineluxe contributor John Higgins joins Andrew, Dennis & Mike to discuss the
controversy set off by the all but unwatchable Game of Thrones “Long Night” episode
and whether we can expect to see compression problems disappear any time soon.

 

The episode concludes at 59:20 with everyone (except Mike) talking about the most
interesting things they’ve watched, listened to, or experienced in the past two weeks.

CLICK HERE TO CHECK OUT MORE EPISODES OF THE CINELUXE HOUR


Andrew Robinson is a photographer and videographer by trade, working on commercial and branding projects all over the US. He has served as a managing editor and freelance journalist in the AV space for nearly 20 years, writing technical articles, product reviews, and guest speaking on behalf of several notable brands at functions around the world.

John Higgins lives a life surrounded by audio. When he’s not writing for Cineluxe, IGN, or Wirecutter, he’s a professional musician and sound editor for TV/film. During his down time, he’s watching Star Wars or learning from his toddler son, Neil.

What Did We Learn from the “GoT” Debacle?

The impenetrable darkness of “The Long Night”

It was a simpler time before April 28, 2019. The Khaleesi was going to be the savior of Westeros, Disney was on their way to owning all of us, and Joe’s Pizza in the Village had the best slice. While two of those things might still be true, they don’t matter anymore because we now live in a post-“The Long Night” world, a world where terms like H.264 and megabits per second are no longer muttered about only on tech blogs but discussed out in the open around water coolers (is that still a thing?). Now that the dust has settled a bit from the Game of Thrones kerfuffle, what are some of the things that came to light out of the darkness of that long night?

 

 

Lesson 1:  Public Enemy No. 1—Compression

If you haven’t realized it from the discussion here at Cineluxe over the past month, compression has become a hot-button issue—for good reason. GoT fans were confronted with Lego-like picture artifacts for the duration of the 82-minute “Long Night” episode, and they’re not happy about it.

 

While the video quality of home viewing has increased dramatically over the past few years with 4K UHD becoming more mainstream and the latest TVs allowing for great-looking HDR and far more vibrant colors, compression hasn’t always kept up. For years, H.264 (also called AVC) was king, and really, still is. It can compress video all the way up to 8K resolution, and has been tweaked to include support for wide color gamut and HDR, and to produce smaller file sizes. But it just can’t create files small enough for efficient delivery through the current pipelines without leading to the kinds of problems that were amply on display in “The Long Night.”

 

You probably read Andrew Robinson’s take on H.265 (aka HEVC) as the next step forward. With H.265, a 1080p signal only requires a 3 Mbps bitrate as opposed to H.264’s 6 Mbps. And a 4K signal needs less than half of H.264’s bitrate—15 vs. 32 Mbps. But, as Andrew mentioned, not everything is currently equipped to handle and decode H.265-compressed video. In addition to needing significantly fewer bits per second, H.265 does a better job with motion compensation.
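
To put those bitrates in everyday terms, here’s a quick back-of-the-envelope sketch (the 3/6/15/32 Mbps figures are the ones cited above; the per-hour conversions are simple arithmetic and only approximate):

```python
# Rough data consumed per hour at the bitrates discussed above.
# 1 megabit per second (Mbps) = 1/8 megabyte per second.

BITRATES_MBPS = {
    "1080p via H.265": 3,
    "1080p via H.264": 6,
    "4K via H.265": 15,
    "4K via H.264": 32,
}

SECONDS_PER_HOUR = 3600

for label, mbps in BITRATES_MBPS.items():
    gb_per_hour = mbps * SECONDS_PER_HOUR / 8 / 1000  # megabits -> gigabytes
    print(f"{label}: ~{gb_per_hour:.1f} GB per hour")

# Prints, among others:
#   4K via H.264: ~14.4 GB per hour
#   4K via H.265: ~6.8 GB per hour
```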

 

I should stress that the Mbps numbers listed above are truly bare minimums, and at those rates you’ll likely see image issues. Netflix, which uses H.265 for all of its 4K content, recommends a minimum 25 Mbps connection for streaming.

 

Speaking of Netflix, they’re at the forefront of experimenting with new, better codecs for 4K streaming. As a result, you can expect to hear some new acronyms like VP9 and AV1 in the coming years. AV1 in particular promises to deliver HEVC-level quality while using even fewer bits.

 

 

Lesson 2:  It’s (probably) not your TV

The cinematographer for “The Long Night,” Fabian Wagner, found himself on the defensive after the uproar and, in addition to (rightly) blaming HBO’s compression, also blamed viewers and their TVs. “A lot of the problem is that a lot of people don’t know how to tune their TVs properly,” he told Wired UK.

 

Technically, that is correct. The vast majority of people don’t know how to tune their TVs properly. Luckily, they don’t really need to. Most TVs from the past couple of years priced above $500 come out of the factory looking really good and don’t necessarily need to be calibrated. (But I would still recommend calibration for any mid-to-high-end TV, to make sure you’re getting the absolute most out of it.)

 

One thing Mr. Wagner brought up that has some merit is people’s tendency to watch TV with their lights on. Even minimal lighting can have an impact on your ability to see shadow detail in a darkly filmed scene, especially if you have an older LCD TV with mediocre black levels. So one quick fix for a murky picture might be to just turn off any extra light in your room.

 

If you want to make sure your TV is in the best viewing mode—and you haven’t had it calibrated—don’t, for the love of Werner Herzog, ever put it in Vivid (aka “Torch”) mode. Go for Cinema, or Calibrated, or Movie. These will generally have the best color accuracy and contrast/backlight/dimming-zone settings, and won’t include the bane of video reviewers everywhere—the “soap opera effect.”

 

 

Lesson 3:  The apps you use (and the device they’re on) matter

You can expect the quality and user experience to differ from one app to the next, since they’re all made by different companies that generally aren’t keen on sharing development secrets. But there can even be performance issues with the same app on different platforms—as Dennis Burger recently described in his article about the Netflix app. I have to admit, that revelation was a bit of a shock to me. Some variation between platforms is to be expected, but I would have thought the differences would be more academic than a heap of extra artifacts in one version of an app and not another. If you’re seeing artifacts, trying the Netflix app on a different platform could help clear them up.

 

But this piece is really about how HBO screwed up. And if you’re watching HBO through your cable or satellite service, you’re dependent on the hardware they provide, which might not offer state-of-the-art resolution support. For instance, if you haven’t replaced your DirecTV HD DVR in the past couple of years, it might still top out at 1080i resolution. Signing into the HBO GO app (or the HBO NOW app, if you’re streaming only) should guarantee 1080p support.

 

 

Lesson 4:  Choose your viewing window wisely

“The Long Night” had 17.8 million viewers when it initially aired across all delivery media, including cable, satellite, HBO NOW, and HBO GO. That was a new record for HBO, so congratulations are in order, I suppose. But with so many viewers drawing on the servers at once, the quality of the stream suffered. This severely exacerbated the already present compression artifacts, to the point of making the show unwatchable—hence the Twitter eruption that night and the next day. I watched portions of the episode a few more times that week after the viewing tide subsided to see if there was any improvement, and while the artifacts weren’t gone, they were much less obvious.

Lesson 5:  Aesthetic choices matter too

Why did the Internet hordes descend on Fabian Wagner? It’s rare that a cinematographer needs to come out from behind the camera to defend himself, but that episode was dark—intentionally so. It was his conception (in collaboration with the director) that was on the screen, after all, and people were upset they couldn’t see it. A hugely anticipated battle scene where you can’t see anything? Preposterous. In contrast, take a look at another famous nighttime battle—the Battle of Helm’s Deep from The Lord of the Rings: The Two Towers. That place was lit up like a Christmas tree—or, more accurately, bathed in a huge amount of blue light that gave the feel of moonlight. The whole sequence was masterfully shot.

 

That doesn’t mean “The Long Night” was shot wrong, just different. In fact, the move toward really dark images seems to be a bit of a recent trend. In the spring of 2018, a little movie called Solo: A Star Wars Story was released. The cinematographer, Bradford Young, used a low-light approach much like Fabian Wagner’s to accentuate the shadows and grime of Han Solo’s earlier years. Complaints on the Internet were everywhere (for a Star Wars movie, go figure . . .) because many theaters, even in major markets, weren’t properly calibrated, which led to a lack of shadow detail. Happily, I didn’t run into that issue here in Los Angeles, and I now regularly use Solo as a test disc for the gritty sabacc scenes and the darkness of the Falcon flying through the Maw.

Solo: A Star Wars Story—into the darkness of the Maw

 

 

What’s next?

Now that “The Long Night” has brought the conversation out into the open, everything is solved and we don’t need to worry about encountering these problems ever again, right? Nope. Not by a long shot. It’s wonderful that we’re talking about what went wrong, but it’s going to take a while for the technology and the people who implement it to catch up.

 

Even though the first version of H.264 was completed in 2003, it didn’t really achieve widespread adoption until a decade later. The HEVC standard was ratified in 2013, and Netflix implemented it for 4K delivery in 2016, but it’s only recently begun to catch on elsewhere. If all of that is any indication, AV1 (which was released last year) won’t be in wide use for at least a couple of years.

 

And low-light cinematography isn’t going away, nor should it. But for HBO and their use of H.264, it does mean that grayscale banding in dark scenes will continue to be apparent. (We’ve already seen it again at the end of Episode 2 of HBO’s Chernobyl.)

 

The most we can do is make sure our TVs aren’t in Vivid mode, the lights are all turned off, and we’re using the best version of our streaming app we can.

John Higgins

John Higgins lives a life surrounded by audio. When he’s not writing for Cineluxe, IGN, or Wirecutter, he’s a professional musician and sound editor for TV/film. During his down time, he’s watching Star Wars or learning from his toddler son, Neil.

If Your Stream Looks Bad, Don’t Blame Netflix

We all take for granted that buying a better video display will result in a better home cinema experience. Ditto speakers and sound processors and amps and control systems and so on. But for some reason, even in an era where streaming has pretty much taken over as the dominant source of AV entertainment, we talk about services like Netflix as if the hardware delivering them doesn’t really matter.

 

This realization has been at the forefront of my mind recently, as I’ve had discussions with videophiles on Facebook and in the comments section of Home Theater Review about the quality of the streamed video experience. Even folks with roughly the same internet speeds as mine, similar-quality home networks, and comparable displays seem to be watching a wholly different Netflix than the one I enjoy.

 

This absolutely baffled me for the longest time. My first inclination was to write it off as pure bias. Or maybe even ignorance. But then I started asking about a variable we videophiles rarely discuss when we talk about streaming: “How, exactly, are you accessing Netflix?” (By the way, I’m using “Netflix” a bit like “Kleenex” here, as a synecdoche for high-performance streaming video across the board. You could just as easily plug in your high-quality streaming service of choice, be it Vudu or Amazon or what have you. But none of this necessarily applies to lower-quality streaming apps like CBS All Access, etc.)

 

What I found is that almost none of the commenters who bemoan the quality of Netflix watch it the same way I do, via Roku Ultra. Some use cable or satellite boxes. Some rely on the smart apps built into their TVs. Some even have their laptops plugged into their TVs via HDMI.

 

This makes a difference. Way more than you would think. Way more than I would have ever imagined until I actually sat down for some exhaustive comparisons between the exact same Netflix programming streamed to the exact same display.

 

The first thing I discovered is just how substantially different loading times are between devices. I did all of this testing on my 75-inch UHD TV, installed just above my credenza, which houses my Roku Ultra, Dish Network satellite receiver, Kaleidescape Strato, and my other AV components. All are plugged into the same enterprise-grade, gigabit Cisco network switch, and as such have access to the exact same level of connectivity. If you’re a numbers nerd, you can check the “Netflix by the Numbers” sidebar below for a breakdown of exactly how long it took each device to load the Netflix app (after a hardware reboot), begin playing a title, and reach full UHD resolution and full bandwidth.

None of the above is even slightly shocking. What was shocking, though, is just how different Netflix looked via these different devices. Cueing up my recent favorite, Our Planet, I couldn’t help but notice that via the app built into my smart TV, this gorgeous nature doc looked a bit less gorgeous. A bit smeared. A bit noisy. A good bit less refined. A closer inspection of the screen revealed the cause: Numerous video compression artifacts, pretty much right in line with what all of the streaming detractors have been hollering at me about on Facebook.

Our Planet via the smart TV’s built-in Netflix app

Switching inputs to the Roku Ultra—again, via the same network connection—I was a little staggered to discover a complete lack of compression artifacts. Ignore, by the way, the subtle swirling bands of brightness fluctuation in the image below. Those are a result of moiré, a misalignment of pixels between my TV and the digital sensor in my cell phone.

 

Ignore too the slight softness in the upper row of spots. This frame is from about half a second later than the one above, and as such the cheetah is moving a little faster, so there’s some motion blur. Also, don’t focus on differences in color—my smart TV’s integrated Netflix app is delivering the program in Dolby Vision, whereas my Roku Ultra only supports HDR10, but the camera in my smartphone can’t capture the gamut of either format. This image was also taken a few inches away from the TV, so what you’re seeing is a tiny fraction of the screen, blown up way larger than life-size.

Our Planet via the Roku Ultra

But I think what’s clear here is that via the Roku Ultra, Our Planet’s image is virtually artifact-free. (As I mentioned in my review of the program, the only compression artifact I could find in the series’ entire run, at least from any reasonable seating distance, was about a second-and-a-half of very minor, almost imperceptible color banding in one early episode.)

 

I sent a series of images to colleague Andrew Robinson, since he and I have been discussing the geeky particulars of compression a lot recently. He immediately started poking holes in my methodology, at my request.

 

“Are you using the same picture profiles?”

 

Yup.

 

“Are you letting the smart TV buffer up to full resolution?”

 

Uh huh.

 

“Is your Roku running through the video processing of your AV preamp?”

 

Nope. I bypassed my preamp and ran the Roku straight into HDMI 1 on my TV.

 

I’ve done my darnedest to think of any reason why the same UHD/HDR program would look so rough via one streaming device and so flawless via another connected to the exact same network switch in the same room, running the same streaming service from the same account. The only thing I can come up with is something Andrew touched on in his most recent piece about compression: HEVC (aka H.265), the video codec Netflix uses to deliver UHD/HDR, is very processor intensive. The cost of shoving such high-quality video through such a small pipe is that it makes the device on the playback end do a lot of heavy number crunching. And if those numbers can’t be crunched quickly enough, the results look a lot like the top screen shot above.

 

My guess here is that my Roku Ultra has the horsepower to deliver Netflix practically flawlessly, whereas my smart TV simply doesn’t. (And as gorgeous as the TV is with native 4K video, its middling performance in upscaling lower-resolution video to 4K is further evidence of this. That’s why I use my AV preamp to upscale video.)

 

And look, none of this is intended to be an advertisement for Roku. It may be my streaming player of choice because it consistently delivers the best performance for the streaming apps I use most. But I haven’t tested every single media streamer on the market to compare their video quality. (As our own John Sciacca has reported, though, even the highly lauded Apple TV 4K sometimes struggles on the audio front, and Andrew reported anecdotally in our most recent conversation that he noticed a significant improvement in video quality when he switched to Roku.) Nor do I have a representative sample of smart TVs to confirm that all of their built-in Netflix apps render such poor video performance.

NETFLIX BY THE NUMBERS

A nuts & bolts comparison of different streaming devices

 

I started with a simple load-time test, to see how long it would take for Netflix to launch to the user-select screen via devices that had just been powered up. All of these numbers are, of course, influenced by the speed of my internet connection (500 Mbps) and the quality of my home network.

 

Roku Ultra  3.05 seconds on average from the time I selected the Netflix app until it loaded to the user-select screen

 

Dish Network Hopper DVR  4.41 seconds on average

 

Smart TV  22.38 seconds on average

 

I then selected three different Netflix programs (Our Planet, Love, Death + Robots, and Test Patterns) and ran numerous tests to find the average time it took each device to start playing the program after it was selected.

 

Roku Ultra  3.20 seconds on average, from the time I pressed Select until the program started playing

 

Dish Network Hopper  9.64 seconds on average

 

Smart TV  13.15 seconds on average

 

Lastly, I cued up Test Patterns again, specifically the pattern labeled “YCbCr 10-bit Linearity Chart: 3840×2160, 23.976fps.” This test gives you a bitrate meter at the top of the screen, and also displays playback resolution, which let me gauge how long it would take each device to reach full bandwidth (16 Mbps) and full resolution/color bit-depth.

 

Roku Ultra  Played at UHD 10-bit immediately, although it did start at 12 megabits per second and took 4.15 seconds on average to report full 16 Mbps bandwidth

 

Dish Network Hopper DVR  Switched from 1920 x 1080 resolution to full 3840 x 2160 resolution after 15.62 seconds on average, and took an average of 46.26 seconds to reach full 16 Mbps bandwidth

 

Smart TV  Took 47.18 seconds on average to switch from HD to UHD resolution, and didn’t reach full 16 Mbps bandwidth until an average of 142.54 seconds into the stream
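
If you’d like to crunch these numbers yourself, here’s a minimal sketch of how the averages above could be tabulated and compared (the figures are the ones I reported; the structure and labels are just one way to organize them):

```python
# Average measured times, in seconds, from the informal tests above:
# (Netflix app load, playback start, time to full 16 Mbps bandwidth)
RESULTS = {
    "Roku Ultra":          (3.05, 3.20, 4.15),
    "Dish Network Hopper": (4.41, 9.64, 46.26),
    "Smart TV":            (22.38, 13.15, 142.54),
}

roku_full_bw = RESULTS["Roku Ultra"][2]

for device, (load, start, full_bw) in RESULTS.items():
    ratio = full_bw / roku_full_bw  # how many times slower than the Roku
    print(f"{device}: app load {load:.2f} s, playback {start:.2f} s, "
          f"full bandwidth {full_bw:.2f} s ({ratio:.1f}x the Roku)")
```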

All I can say for certain is that the device you use to access Netflix and all of the other streaming services you subscribe to does matter. And it matters way more than I would have predicted just a week ago. Simply put, if you’re streaming Netflix in your luxury entertainment system and notice that the picture isn’t up to snuff, don’t blame Netflix. Start pointing your finger at the device you’re using to access the app.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

Compression Revisited

an example of the compression artifact “banding”

If my last post made it seem like I hate compression in all its forms, you’ll have to forgive me. The simple fact is, without compression, there would be no digital video. All video is compressed. Period. Usually at the point of capture, then for exhibition at movie theaters, then again for home video. Even movies or TV shows shot on film are later transferred to digital for post production and compressed. There is no way to have a moving digital image of any kind without some form of compression.

 

For years, most popular digital video formats and capture devices have used H.264 compression. You don’t need to know exactly what H.264 is or how it works. Suffice to say, it’s been with us for a long while now, and its time in the spotlight is running out. Why? Because we’re moving ever faster towards needing video that maintains H.264’s quality, but with much better efficiency.

 

Enter H.265 (aka HEVC). While the similarity in naming to H.264 suggests there’s not a big difference, H.265 is an entirely different beast and the next frontier in compression.

 

So why are we, or am I, suddenly talking about the AV industry’s most boring topic? Well, because of Game of Thrones naturally. What, did you think I was going to ramble on about a Starbucks cup? No, compression is a big deal now because winter came and for a lot of folks it didn’t come with a very spectacular view! Suddenly the whole world cares about compression, even if HBO and the show’s creators would rather blame it on our lack of calibration. (Don’t get me started.)

 

You see, compression not only allows for digital video to exist in the first place but also allows for so many of us to enjoy it all at the same time. So when a lot of people all decided they wanted to see some dragon porn at precisely 8 p.m. on the same Sunday night, it took a fair amount of compression to make that happen. Why? Because digital video files are huge—not to mention complicated. Not like, “Oh, you attached a big file to that last email,” but rather, “Damn, you know I don’t have unlimited data on my cellular plan!” They’re actually even larger than that. In many ways, we’ve long since taken digital video for granted, because prior to the Battle of Winterfell, the only people who really griped about compression were AV nerds like me.


For what it’s worth, even most AV nerds misrepresent compression. To give you an idea of what I mean, here’s a comparison of the amount of data it takes to deliver a 4K HDR stream via Netflix (or similar services) with the amount of data that UHD Blu-ray discs and your local cineplex deliver.

 

Most nerds will tell you (ignorantly) that the line between unacceptable garbage and perfect-quality video falls somewhere between the bottom line and the middle one. That argument looks sort of silly, though, when you compare all of the above with truly uncompressed 4K video (see the chart below). The difference between the most and least compressed digital video you as a consumer can access is minuscule by comparison.
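
For a sense of just how minuscule, here’s a rough sketch of the math. A caveat: the delivery bitrates below are ballpark figures I’m supplying purely for illustration (roughly 16 Mbps for a Netflix 4K stream, around 100 Mbps for UHD Blu-ray video, up to 250 Mbps for digital cinema), not values taken from the charts:

```python
# Back-of-the-envelope: uncompressed 4K vs. typical delivery bitrates.
# Assumes 3840 x 2160 resolution, 24 frames per second, 10-bit RGB.

WIDTH, HEIGHT = 3840, 2160
FPS = 24
BITS_PER_PIXEL = 10 * 3  # 10 bits per channel, 3 channels

uncompressed_mbps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1_000_000
print(f"Uncompressed 4K: ~{uncompressed_mbps:,.0f} Mbps")  # ~5,972 Mbps

# Ballpark delivery bitrates (illustrative, not from the charts)
DELIVERY_MBPS = {
    "Netflix 4K stream": 16,
    "UHD Blu-ray": 100,
    "Digital cinema": 250,
}

for label, mbps in DELIVERY_MBPS.items():
    print(f"{label}: {mbps} Mbps "
          f"({mbps / uncompressed_mbps:.2%} of uncompressed)")
```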


Again, this isn’t a conversation most people are having. But when everyone’s favorite cousin-f’ing dating show suddenly looks like The Lego Movie, well, people notice.

 

Mind you, as I indicated in my last post, I’m not saying there’s no such thing as too much compression. As we saw with Game of Thrones, you can reach the breaking point of any codec. But it’s not anyone’s fault. You see, we’ve only had digital video in a meaningful way for a very short time. While digital video has existed since the ‘80s and ‘90s, it didn’t really become the standard until the early 2000s—which means we’ve covered a hell of a lot of technological ground in a very short time.

 

H.264 has been a godsend for digital video both at the capture and exhibition levels. But it does have limitations—not in quality, mind you. Believe it or not, H.264 is robust enough to handle even 8K-resolution files. No, H.264’s limitation is that, for as compressed as it is, it actually doesn’t compress enough, so one of two things has to happen: either you compress the files right up to their limit so more people can watch them on demand—thus the GoT debacle—or you adopt a new compression scheme. That’s where H.265 comes into play.

 

H.265 doesn’t really promise to do anything better than its predecessor, except to retain the same or better quality at a quarter of the file size. That is all great news. But to get the same horsepower from an engine one quarter the size, you need to do some tweaking—or in this case, some fairly substantial computing.

 

As a result, not everything in today’s AV ecosystem is H.265-equipped or even H.265-compatible. Moreover, not every modern camera has H.265 capabilities, despite being so-called state of the art.

 

In other words, we find ourselves in a bit of an in-between state, a mixed bag of both H.264 and H.265 content and capability. That’s why, at the moment, Netflix can even rival silly spinning discs when it comes to picture quality, whereas other streaming providers, like HBO GO or HBO NOW, can end up looking awful while eating up the same amount of your internet data.

 

The good news is that we’re marching ever forward toward the full-scale adoption of H.265—which, in theory, should make something like that disastrous Thrones episode a thing of the past. But until the day comes when we’re all on the same page, more and more of us may have to come to grips with compression and why it is both the lifeblood of digital video and its Achilles heel.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial and branding projects all over the US. He has served as a managing editor and freelance journalist in the AV space for nearly 20 years, writing technical articles, product reviews, and guest speaking on behalf of several notable brands at functions around the world.

“Game of Thrones” Sheds Darkness on the Real Issue

Hey, did you rage tweet after Episode 3 of Game of Thrones because, well, you couldn’t see it? Did you blame the filmmakers and HBO for an experience that was tantamount to trying to watch porn at 3 a.m. through lines of static like when you were a kid? Did you?

 

We’ve all come to the same conclusion in the weeks that have followed, and that is that compression is the villain here, not HBO, not TV manufacturers, and, of course, not us the viewers. It’s compression’s fault. To which I say good. I’m glad this happened because maybe now we can have an honest conversation about the issue of compression.

 

I feel like I’ve been stuck on an island these past 15 or so years, droning on about compression while the rest of the AV world ran full steam ahead into HD, then 3D, and now 4K and 8K. HD, 4K, and 8K all sound sexy, and like the exterior of a car they’re marketed to get your ass into the showroom. So, if 4K is the body, compression is the engine, and, well, she’s a two-cylinder with some rather old horses under the hood.

 

Nothing makes or breaks a digital video presentation more than compression. Before those physical-media stalwarts start typing See, I told you so, may I remind them that their precious silver coasters are compressed to shit just like the rest of today’s digital video feeds. Now, I can hear them saying, Yeah, but discs are less compressed. True, but the argument is weak, for discs can vary wildly in their levels of compression (just like streaming). Moreover, no one wants your silly discs, so it’s all moot.

 

Getting back to the topic at hand: compression and streaming (i.e., the video format that will ultimately “win”). Presently, most video is compressed using the H.264 format, which back in the day was fine—hell, it was great!

 

But when H.264 revolutionized digital video, it mostly had to contend with SD content and all that it entailed. Now, that same compression scheme is being pressed into service in a radically different world. It is because of compression that the promise of 4K—hell, even HD—has been curbed over the years. Did you know the HD spec encompassed 10-bit color and a larger color space too? These are not 4K-exclusive selling points, but rather bits of information and performance left on the AV battlefield because compression and our collective digital ecosystem couldn’t handle the demands of more.

 

So, what did we do?

Naturally, we gave poor old H.264 more to choke on, because no one understands compression, only what it looks like. People don’t want to accept why it’s happening; they just want to be mad at it. Thankfully, H.265 is here and is slowly being adopted, only it’s very hardware/processor intensive, which makes it expensive to implement.

 

H.265 promises higher quality at lower file sizes. For example, if 1 hour of content using H.264 comes to 4 GB, then H.265 should give you equal or better quality in a file of only 1 GB. These are not exact figures, but rather an illustration I hope is easy enough for everyone to understand. With smaller file sizes, the hope is that it’s easier for feeds to stream faster, further, and with more consistency, resulting in a better viewing experience. Of course, this is all predicated upon the notion that the hardware at either end can do some of the heavy lifting itself, as H.265 is more complex than H.264. Thankfully we’re getting there, and will ultimately get there in the end. It just takes time.
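
To see what those round numbers imply about bandwidth, here’s a quick sketch (the 4 GB and 1 GB per hour figures are the illustrative ones above; the conversion is simple arithmetic using decimal gigabytes):

```python
# Convert file size per hour into the average bitrate it implies.

def avg_bitrate_mbps(gigabytes_per_hour: float) -> float:
    bits = gigabytes_per_hour * 8_000_000_000  # decimal GB -> bits
    return bits / 3600 / 1_000_000             # bits per second -> Mbps

print(f"H.264 at 4 GB/hour: ~{avg_bitrate_mbps(4):.1f} Mbps")  # ~8.9 Mbps
print(f"H.265 at 1 GB/hour: ~{avg_bitrate_mbps(1):.1f} Mbps")  # ~2.2 Mbps
```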

 

So the next time you turn on Netflix or HBO GO and watch whatever drama turns into The Lego Movie, don’t get upset. Know that it’s happening because, once again, we demanded to run before we learned to walk.

—Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial and branding projects all over the US. He has served as a managing editor and freelance journalist in the AV space for nearly 20 years, writing technical articles, product reviews, and guest speaking on behalf of several notable brands at functions around the world.