
If Your Stream Looks Bad, Don’t Blame Netflix


We all take for granted that buying a better video display will result in a better home cinema experience. Ditto speakers and sound processors and amps and control systems and so on. But for some reason, even in an era where streaming has pretty much taken over as the dominant source of AV entertainment, we talk about services like Netflix as if the hardware delivering them doesn’t really matter.

 

This realization has been at the forefront of my mind recently, as I’ve had discussions with videophiles on Facebook and in the comments section of Home Theater Review about the quality of the streamed video experience. Even folks with roughly the same internet speeds as me, similar quality home networks, and comparable displays seem to be watching a wholly different Netflix than the one I enjoy.

 

This absolutely baffled me for the longest time. My first inclination was to write it off as pure bias. Or maybe even ignorance. But then I started asking about a variable we videophiles rarely discuss when we talk about streaming: “How, exactly, are you accessing Netflix?” (By the way, I’m using “Netflix” a bit like “Kleenex” here, as a synecdoche for high-performance streaming video across the board. You could just as easily plug in your high-quality streaming service of choice, be it Vudu or Amazon or what have you. But none of this necessarily applies to lower-quality streaming apps like CBS All Access, etc.)

 

What I found is that almost none of the commenters who bemoan the quality of Netflix watch it the same way I do, via Roku Ultra. Some use cable or satellite boxes. Some rely on the smart apps built into their TVs. Some even have their laptops plugged into their TVs via HDMI.

 

This makes a difference. Way more than you would think. Way more than I would have ever imagined until I actually sat down for some exhaustive comparisons between the exact same Netflix programming streamed to the exact same display.

 

The first thing I discovered is just how substantially different loading times are between devices. I did all of this testing on my 75-inch UHD TV, installed just above my credenza, which houses my Roku Ultra, Dish Network satellite receiver, Kaleidescape Strato, and my other AV components. All are plugged into the same enterprise-grade, gigabit Cisco network switch, and as such have access to the exact same level of connectivity. If you’re a numbers nerd, you can check the “Netflix by the Numbers” sidebar below for a breakdown of exactly how long it took each device to load the Netflix app (after a hardware reboot), begin playing a title, and reach full UHD resolution and full bandwidth.

None of the above is even slightly shocking. What was shocking, though, is just how different Netflix looked via these different devices. Cueing up my recent favorite, Our Planet, I couldn’t help but notice that via the app built into my smart TV, this gorgeous nature doc looked a bit less gorgeous. A bit smeared. A bit noisy. A good bit less refined. A closer inspection of the screen revealed the cause: Numerous video compression artifacts, pretty much right in line with what all of the streaming detractors have been hollering at me about on Facebook.

[Screenshot: Our Planet via the smart TV’s built-in Netflix app, showing compression artifacts]

Switching inputs to the Roku Ultra—again, via the same network connection—I was a little staggered to discover a complete lack of compression artifacts. Ignore, by the way, the subtle swirling bands of brightness fluctuation in the image below. Those are a result of moiré, a misalignment of pixels between my TV and the digital sensor in my cell phone.

 

Ignore too the slight softness in the upper row of the cheetah’s spots. This frame is from about half a second later than the one above, and as such the cheetah is moving a little faster, so there’s some motion blur. Also, don’t focus on differences in color—my smart TV’s integrated Netflix app is delivering the program in Dolby Vision, whereas my Roku Ultra only supports HDR10, but the camera in my smartphone can’t capture the gamut of either format. This image was also taken a few inches away from the TV, so what you’re seeing is a tiny fraction of the screen, blown up way larger than life-size.

[Screenshot: the same scene via the Roku Ultra, virtually artifact-free]

But I think what’s clear here is that via the Roku Ultra, Our Planet’s image is virtually artifact-free. (As I mentioned in my review of the program, the only compression artifact I could find in the series’ entire run, at least from any reasonable seating distance, was about a second-and-a-half of very minor, almost imperceptible color banding in one early episode.)

 

I sent a series of images to colleague Andrew Robinson, since he and I have been discussing the geeky particulars of compression a lot recently. He immediately started poking holes in my methodology, at my request.

 

“Are you using the same picture profiles?”

 

Yup.

 

“Are you letting the smart TV buffer up to full resolution?”

 

Uh huh.

 

“Is your Roku running through the video processing of your AV preamp?”

 

Nope. I bypassed my preamp and ran the Roku straight into HDMI 1 on my TV.

 

I’ve done my darnedest to think of any reason why the same UHD/HDR program would look so rough via one streaming device and so flawless via another connected to the exact same network switch in the same room, running the same streaming service from the same account. The only thing I can come up with is something Andrew touched on in his most recent piece about compression: HEVC (aka H.265), the video codec Netflix uses to deliver UHD/HDR, is very processor intensive. The cost of shoving such high-quality video through such a small pipe is that it makes the device on the playback end do a lot of heavy number crunching. And if those numbers can’t be crunched quickly enough, the results look a lot like the top screen shot above.
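The arithmetic behind that bottleneck is simple: at 24 frames per second, a decoder has roughly 42 milliseconds to fully reconstruct each frame before the next one is due. Here’s a toy model of that budget—purely illustrative, with hypothetical decode times, and not a description of how any actual Netflix client behaves:

```python
# Toy model of real-time decode: a device keeps up with playback only if it
# can decode each frame faster than the frame interval. The decode times
# below are hypothetical, for illustration only.

def frame_budget_ms(fps: float) -> float:
    """Time available to decode one frame, in milliseconds."""
    return 1000.0 / fps

def keeps_up(decode_ms_per_frame: float, fps: float = 23.976) -> bool:
    """True if the device decodes frames at least as fast as they play."""
    return decode_ms_per_frame <= frame_budget_ms(fps)

# A fast SoC that decodes an HEVC frame in 20 ms stays within the ~41.7 ms
# budget at 23.976 fps; a slow one needing 55 ms does not, and the player
# must stall, drop frames, or fall back to a lower-quality stream.
print(keeps_up(20.0))   # fast decoder: True
print(keeps_up(55.0))   # slow decoder: False
```

When the budget is blown, adaptive streaming clients typically request a lower-bitrate (and more artifact-prone) rendition rather than stutter—which would square with what I saw on the smart TV.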

 

My guess here is that my Roku Ultra has the horsepower to deliver Netflix practically flawlessly, whereas my smart TV simply doesn’t. (And as gorgeous as the TV is with native 4K video, its middling performance in upscaling lower-resolution video to 4K is further evidence of this. That’s why I use my AV preamp to upscale video.)

 

And look, none of this is intended to be an advertisement for Roku. It may be my streaming player of choice because it consistently delivers the best performance for the streaming apps I use most. But I haven’t tested every single media streamer on the market to compare their video quality. (As our own John Sciacca has reported, though, even the highly lauded Apple TV 4K sometimes struggles on the audio front, and Andrew reported anecdotally in our most recent conversation that he noticed a significant improvement in video quality when he switched to Roku.) Nor do I have a representative sample of smart TVs to confirm that all of their built-in Netflix apps render such poor video performance.

NETFLIX BY THE NUMBERS

A nuts & bolts comparison of different streaming devices

 

I started with a simple load-time test, to see how long it would take for Netflix to launch to the user-select screen via devices that had just been powered up. All of these numbers are, of course, influenced by the speed of my internet connection (500 Mbps) and the quality of my home network.

 

Roku Ultra  3.05 seconds on average from the time I selected the Netflix app until it loaded to the user-select screen

 

Dish Network Hopper DVR  4.41 seconds on average

 

Smart TV  22.38 seconds on average

 

I then selected three different Netflix programs (Our Planet, Love, Death + Robots, and Test Patterns) and ran numerous tests to find the average time it took each device to start playing the program after it was selected.

 

Roku Ultra  3.20 seconds on average, from the time I pressed Select until the program started playing

 

Dish Network Hopper  9.64 seconds on average

 

Smart TV  13.15 seconds on average

 

Lastly, I cued up Test Patterns again, specifically the pattern labeled “YCbCr 10-bit Linearity Chart: 3840×2160, 23.976fps.” This test gives you a bitrate meter at the top of the screen, and also displays playback resolution, which let me gauge how long it would take each device to reach full bandwidth (16 Mbps) and full resolution/color bit-depth.

 

Roku Ultra  Played at UHD 10-bit immediately, although it did start at 12 Mbps and took 4.15 seconds on average to report full 16 Mbps bandwidth

 

Dish Network Hopper DVR  Switched from 1920 x 1080 resolution to full 3840 x 2160 resolution after 15.62 seconds on average, and took an average of 46.26 seconds to reach full 16 Mbps bandwidth

 

Smart TV  Took 47.18 seconds on average to switch from HD to UHD resolution, and didn’t reach full 16 Mbps bandwidth until an average of 142.54 seconds into the stream
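Averages like the ones above fall straight out of repeated stopwatch trials. A minimal sketch of that methodology (the trial numbers here are hypothetical, not my actual raw data):

```python
# Minimal sketch of the sidebar's methodology: repeated stopwatch readings
# per device, reduced to an average. These trial values are hypothetical.
from statistics import mean, stdev

trials = {
    "Roku Ultra": [3.0, 3.1, 3.05],   # seconds per app-launch trial
    "Smart TV": [21.9, 22.4, 22.8],
}

for device, times in trials.items():
    print(f"{device}: {mean(times):.2f} s average "
          f"(spread {stdev(times):.2f} s over {len(times)} runs)")
```

Reporting the spread alongside the average is worth the extra column: a device whose launch times swing wildly between runs is telling you something a single mean hides.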

All I can say for certain is that the device you use to access Netflix and all of the other streaming services you subscribe to does matter. And it matters way more than I would have predicted just a week ago. Simply put, if you’re streaming Netflix in your luxury entertainment system and notice that the picture isn’t up to snuff, don’t blame Netflix. Start pointing your finger at the device you’re using to access the app.

Dennis Burger

Dennis Burger is an avid Star Wars scholar, Tolkien fanatic, and Corvette enthusiast who somehow also manages to find time for technological passions including high-end audio, home automation, and video gaming. He lives in the armpit of Alabama with his wife Bethany and their four-legged child Bruno, a 75-pound American Staffordshire Terrier who thinks he’s a Pomeranian.

Compression Revisited


an example of the compression artifact “banding”

If my last post made it seem like I hate compression in all its forms, you’ll have to forgive me. The simple fact is, without compression, there would be no digital video. All video is compressed. Period. Usually at the point of capture, then for exhibition at movie theaters, then again for home video. Even movies or TV shows shot on film are later transferred to digital for post production and compressed. There is no way to have a moving digital image of any kind without some form of compression.

 

For years, most popular digital video formats and capture devices have used H.264 compression. You don’t need to know exactly what H.264 is or how it works. Suffice it to say, it’s been with us for a long while now, and its time in the spotlight is running out. Why? Because we’re moving ever faster toward needing video that maintains H.264’s quality, but with much better efficiency.

 

Enter H.265 (aka HEVC). While the similarity in naming to H.264 suggests there’s not a big difference, H.265 is an entirely different beast and the next frontier in compression.

 

So why are we, or am I, suddenly talking about the AV industry’s most boring topic? Well, because of Game of Thrones naturally. What, did you think I was going to ramble on about a Starbucks cup? No, compression is a big deal now because winter came and for a lot of folks it didn’t come with a very spectacular view! Suddenly the whole world cares about compression, even if HBO and the show’s creators would rather blame it on our lack of calibration. (Don’t get me started.)

 

You see, compression not only allows for digital video to exist in the first place but also allows for so many of us to enjoy it all at the same time. So when a lot of people all decided they wanted to see some dragon porn at precisely 8 p.m. on the same Sunday night, it took a fair amount of compression to make that happen. Why? Because digital video files are huge—not to mention complicated. Not like, “Oh, you attached a big file to that last email,” but rather, “Damn, you know I don’t have unlimited data on my cellular plan!” They’re actually even larger than that. In many ways, we’ve long since taken digital video for granted, because prior to the Battle of Winterfell, the only people who really griped about compression were AV nerds like me.

[Chart: data rates of a 4K HDR Netflix stream vs. UHD Blu-ray vs. theatrical delivery]

For what it’s worth, even most AV nerds misrepresent compression. To give you an idea of what I mean, here’s a comparison between the amount of data it takes to deliver a 4K HDR stream via Netflix (or similar services) compared to the amount of data that UHD Blu-ray discs and your local cineplex deliver.

 

Most nerds will tell you (ignorantly) that the line between unacceptable garbage and perfect quality video falls somewhere between the bottom line and the middle one. That argument looks sort of silly, though, when you compare all of the above with truly uncompressed 4K video (see the chart below). The difference between the most and least compressed digital video you as a consumer can access is minuscule by comparison.

[Chart: the same comparison, this time including truly uncompressed 4K video]
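You can run the uncompressed number yourself as a back-of-the-envelope calculation, using the specs of the test pattern from the first article (3840 × 2160, 10-bit, 23.976 fps) and assuming full 4:4:4 color—an assumption on my part, since real delivery pipelines usually subsample to 4:2:0:

```python
# Back-of-the-envelope bitrate for truly uncompressed UHD video, assuming
# 3840 x 2160 resolution, 10 bits per channel, 3 channels (full 4:4:4
# color, no chroma subsampling), at 23.976 frames per second.
width, height = 3840, 2160
bits_per_pixel = 10 * 3          # 10-bit depth x 3 color channels
fps = 23.976

bits_per_second = width * height * bits_per_pixel * fps
print(f"Uncompressed: {bits_per_second / 1e9:.2f} Gbps")        # ~5.97 Gbps
print(f"vs. a 16 Mbps stream: ~{bits_per_second / 16e6:.0f}x")  # ~373x
```

In other words, even the "heavily compressed" 16 Mbps Netflix stream is already squeezing the signal by a factor of several hundred—which is why the gap between streaming and disc is small next to the gap between either of them and uncompressed video.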

Again, this isn’t a conversation most people are having. But when everyone’s favorite cousin-f’ing dating show suddenly looks like The Lego Movie, well, people notice.

 

Mind you, as I indicated in my last post, I’m not saying there’s no such thing as too much compression. As we saw with Game of Thrones, you can reach the breaking point of any codec. But it’s not anyone’s fault. You see, we’ve only had digital video in a meaningful way for a very short time. While digital video has existed since the ‘80s and ‘90s, it didn’t really become the standard until the early 2000s—which means we’ve covered a hell of a lot of technological ground in a very short time.

 

H.264 has been a godsend for digital video both at the capture and exhibition levels. But it does have limitations—not in quality, mind you. Believe it or not, H.264 is robust enough to handle even 8K-resolution files. No, H.264’s limitation is that for as compressed as it is, it actually doesn’t compress enough, so one of two things has to happen. Either you need to compress the files right up to their limit so more people can watch them on demand—thus the GoT debacle—or two, you need a new compression scheme. That’s where H.265 comes into play.

 

H.265 doesn’t really promise to do anything better than its predecessor, except deliver the same or better quality at a quarter of the file size. That is all great news. But to get the same horsepower from an engine one quarter the size, you need to do some tweaking—or in this case, some fairly substantial computing.

 

As a result, not everything in today’s AV ecosystem is H.265-equipped or -compatible. Moreover, not every modern camera has H.265 capabilities, despite being so-called state of the art.

 

In other words, we find ourselves in a bit of an in-between state, a mixed bag of both H.264 and H.265 content and capability. That’s why, at the moment, Netflix can even rival silly spinning discs when it comes to picture quality, whereas other streaming providers, like HBO Go or HBO Now, can end up looking awful while eating up the same amount of your internet data.

 

The good news is that we’re marching ever forward toward the full-scale adoption of H.265—which, in theory, should make something like that disastrous Thrones episode a thing of the past. But until the day comes when we’re all able to get on the same page, more and more of us may have to come to grips with compression and why it is both the lifeblood of digital video and its Achilles’ heel.

Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial
and branding projects all over the US. He has served as a managing editor and
freelance journalist in the AV space for nearly 20 years, writing technical articles,
product reviews, and guest speaking on behalf of several notable brands at functions
around the world.

“Game of Thrones” Sheds Darkness on the Real Issue


Hey, did you rage tweet after Episode 3 of Game of Thrones because, well, you couldn’t see it? Did you blame the filmmakers and HBO for an experience that was tantamount to trying to watch porn at 3 a.m. through lines of static like when you were a kid? Did you?

 

We’ve all come to the same conclusion in the weeks that have followed, and that is that compression is the villain here, not HBO, not TV manufacturers, and, of course, not us the viewers. It’s compression’s fault. To which I say good. I’m glad this happened because maybe now we can have an honest conversation about the issue of compression.

 

I feel like I’ve been stuck on an island these past 15 or so years, droning on about compression while the rest of the AV world ran full steam ahead into HD, then 3D, and now 4K and 8K. HD, 4K, and 8K all sound sexy, and like the exterior of a car, they’re marketed to get your ass into the showroom. So, if 4K is the body, compression is the engine, and, well, she’s a two-cylinder with some rather old horses under the hood.

 

Nothing makes or breaks a digital video presentation more than compression. Before those physical-media stalwarts start typing See, I told you so, may I remind them that their precious silver coasters are compressed to shit just like the rest of today’s digital video feeds. Now, I can hear them saying, Yeah, but discs are less compressed. True, but the argument is weak, for discs can vary wildly in their levels of compression (just like streaming). Moreover, no one wants your silly discs, so it’s all moot.

 

Getting back to the topic at hand: compression and streaming (i.e., the video format that will ultimately “win”). Presently, most video is compressed using the H.264 format, which back in the day was fine—hell, it was great!

 

But when H.264 revolutionized digital video, it mostly had to contend with SD content and all that it entailed. Now, that same compression scheme is being pressed into service in a radically different world. It is because of compression that the promise of 4K—hell, HD—has been curbed over the years. Did you know the HD spec encompassed 10-bit color and a larger color space too? These are not 4K-exclusive selling points, but rather bits of information and performance left on the AV battlefield due to compression and our collective digital ecosystem being unable to handle the demands of more.

 

So, what did we do?

Naturally, we gave poor old H.264 more to choke on, because no one understands compression, only what it looks like. They don’t want to accept why it’s happening; they just want to be mad at it. Thankfully, H.265 is here and is slowly being adopted—only it’s very hardware/processor intensive, which makes it expensive to implement.

 

H.265 promises higher quality at lower file sizes. For example, if 1 hour of content using H.264 comes to 4 GB, then H.265 should give you equal or better quality but with a file of only 1 GB. These are not exact figures, but rather an illustration I hope is easy enough for everyone to understand. With smaller file sizes, the hope is that it’s then easier for feeds to stream faster, further, and with more consistency, thus resulting in (hopefully) a better viewing experience. Of course this is all predicated upon the notion that the hardware at either end can do some of the heavy lifting itself, as H.265 is more complex than H.264. Thankfully we’re getting there, and will ultimately get there in the end. It just takes time.
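Those illustrative file sizes map directly onto average bitrates, since size is just bitrate times running time. A quick sketch of the conversion, using Andrew’s deliberately round numbers (not measured values):

```python
# Converting the illustrative file sizes above into average bitrates.
# size_in_bits = bitrate * seconds, so bitrate = size * 8 / seconds.

def avg_bitrate_mbps(size_gb: float, hours: float) -> float:
    """Average bitrate (Mbps) implied by a file size and running time."""
    return size_gb * 8_000 / (hours * 3600)   # 1 GB = 8,000 megabits

print(f"H.264, 4 GB/hour: {avg_bitrate_mbps(4, 1):.1f} Mbps")  # ~8.9 Mbps
print(f"H.265, 1 GB/hour: {avg_bitrate_mbps(1, 1):.1f} Mbps")  # ~2.2 Mbps
```

Seen that way, the promise is clear: the same hour of video asks for roughly a quarter of the network capacity, which is exactly what an overloaded Sunday-night stream needs.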

 

So the next time you turn on Netflix or HBO Go and watch whatever drama turn into The Lego Movie, don’t get upset. Know that it’s happening because once again, we demanded to run before we learned to walk.

—Andrew Robinson

Andrew Robinson is a photographer and videographer by trade, working on commercial
and branding projects all over the US. He has served as a managing editor and
freelance journalist in the AV space for nearly 20 years, writing technical articles,
product reviews, and guest speaking on behalf of several notable brands at functions
around the world.