2021 New-Gen Apple TV 4K Dolby Atmos Problems

Ara noticed that Drive to Survive was 50Hz. I had to go into my Denon app to confirm, as there is no display of the video frame rate on my Denon AVR or Sony TV. I had been speculating that a busier Atmos track and higher-bandwidth video were factors in the Atmos dropouts, and the reason I have found Drive to Survive to be the most problematic program, but now we have proof.

As further proof, I ran an experiment: I set my ATV4K to 1080p with DV, with both Match Frame Rate and Match Dynamic Range ON. I played an entire episode of Drive to Survive at 1080p 50Hz and the Atmos was flawless. This proves that higher-bandwidth video, independent of frame rate, is the catalyst for the Atmos problem. This isn't much of a surprise, except when you consider that the faster ATV4K Gen 2 is supposed to support 4K video at up to 120Hz thanks to HDMI 2.1, an upgrade over the Gen 1's HDMI 2.0. We also know that DV is not a factor, as Drive to Survive is not available to stream in DV. It is video bandwidth alone, not frame rate or DV.
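To put a rough number on that bandwidth gap (my own back-of-the-envelope sketch, not a measurement from the ATV4K): at the same 50Hz frame rate, a 4K frame carries four times the pixels of a 1080p frame, so the raw video payload the box pushes over HDMI scales by about the same factor. The 12-bit RGB assumption and the omission of blanking intervals are simplifications.

```python
# Back-of-the-envelope uncompressed video payload over HDMI.
# Assumes 12-bit RGB (4:4:4) and counts active pixels only (no blanking),
# so real link rates are somewhat higher; the roughly 4x ratio is the point.

def video_data_rate_gbps(width, height, fps, bits_per_channel=12, channels=3):
    """Approximate uncompressed video data rate in Gbit/s."""
    bits_per_frame = width * height * bits_per_channel * channels
    return bits_per_frame * fps / 1e9

rate_1080p50 = video_data_rate_gbps(1920, 1080, 50)   # ~3.7 Gbit/s
rate_2160p50 = video_data_rate_gbps(3840, 2160, 50)   # ~14.9 Gbit/s

print(f"1080p50: {rate_1080p50:.1f} Gbit/s")
print(f"2160p50: {rate_2160p50:.1f} Gbit/s")
print(f"ratio:   {rate_2160p50 / rate_1080p50:.0f}x at the same frame rate")
```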

The next experiment I will perform is to see whether Drive to Survive plays Atmos perfectly in 4K on my ATV4K Gen 1. I have never had an Atmos issue on the Gen 1, but I have never tested it with 4K at 50Hz.

Knowing the Atmos problem is related to high-bandwidth video doesn't help us solve it, but lowering the resolution to 1080p is a viable workaround for those who prefer Atmos over 4K. We don't know whether the Atmos dropouts are caused by a hardware issue in the Gen 2 units, but based on my tvOS 14.x experience it should be possible to significantly mitigate the problem in software. It is all up to Apple and whether they care to do anything about it. At this point, it doesn't seem they do.
Great work! Thanks. It might be helpful to report this to Apple. The problem is, when we call in we never get to speak directly with an engineer; it's always just a rep. I had my case escalated, but when I asked her about communicating directly with an engineer, she said no. I wonder if it's even possible to email one directly?
 
Just out of curiosity, did you ever hear back on your ticket? I ping them about my ticket every couple of weeks asking for updates. The last update I received was on Jan 20, 2022.

ETA: never received the HDMI cable they were supposed to send, either.
 
Well, she called back twice. She was supposed to check back a month later and never did. It's been several months since we last communicated, so I gave up on it. I feel like there isn't any more I can do. I tried. I might call back in the future; I don't want to rule it out. I doubt they will ever contact me again. I feel like they just want us to go away, and I doubt the problem is getting any attention now. Nobody from Apple has ever publicly acknowledged this, from what I can tell.

It's like they don't really care.
 
I have looked over this thread and can't find much said about the HDMI cables everyone is using, especially cable length. When talking about bandwidth, cabling is an important consideration.

The ATV4K has a "Check HDMI Connection" test it can run that checks each resolution setting, and it works pretty well. Have you tried it?
 
Yes, the cable test was successful. Monoprice 8K cables, 3 ft and 6 ft. A Belkin "Apple" HDMI cable. Various other 8K-tested cables. There are some posts a few pages back where we went through the cable discussion, including speculation about why you can have "too short a cable". Cables have been ruled out.
 
Cables have been ruled out.
I can't figure out why I have had only one Atmos audio dropout while so many others have problems. I'm not really complaining, but I have an obsession to know why things do what they do.

Here's my setup:
4K Dolby Vision 60 (59.94Hz)
Match Dynamic Range: On
Match Frame Rate: On
1 Meter HDMI copper cable to processor.
10 Meter Fiber Optic HDMI cable from processor to display.

Since the fiber-optic HDMI cable has only been connected for a week there aren't many hours of Atmos on it yet, but there are a few, with no issues. In fact, the video is actually cleaner than with the 4.5-meter copper HDMI cable I'd been using for the last couple of years. No more artifacts in tiny graphics, and 4K streams settle much quicker. Some shows, like Drive to Survive, would take 20-30 seconds to settle into true 4K, but now it's just a couple of seconds the first time in a session. And watching all the F1TV stuff over the weekend resulted in perfect video, versus before, when the video would go low-res for a few seconds and then clear up.
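That "settling into true 4K" delay is adaptive bitrate streaming doing its ramp-up: the player starts on a conservative rung of the bitrate ladder and only steps up as it measures sustained throughput. Here is a toy sketch of the idea; the ladder values, the 1.5x headroom rule, and the one-rung-per-segment step are made-up placeholders, not Apple's or any streaming service's actual logic. (Whether the HDMI cable itself can influence that ramp-up is a separate question; the sketch only illustrates why streams start below full 4K.)

```python
# Toy adaptive-bitrate ramp-up: the player starts low and steps up the
# ladder once measured throughput comfortably exceeds the next rung.
# Ladder bitrates (Mbps) and the 1.5x safety margin are illustrative only.

LADDER_MBPS = [3, 6, 10, 16, 25]   # hypothetical rungs; top rung = "true 4K"
SAFETY = 1.5                       # require headroom before stepping up

def next_rung(current, measured_mbps):
    """Step up one rung if throughput covers it with margin, else hold or drop."""
    if current + 1 < len(LADDER_MBPS) and measured_mbps >= LADDER_MBPS[current + 1] * SAFETY:
        return current + 1
    if measured_mbps < LADDER_MBPS[current]:
        return max(0, current - 1)
    return current

# Simulated per-segment throughput measurements (Mbps) at the start of a session:
rung = 0
for throughput in [20, 40, 80, 120, 150, 150]:
    rung = next_rung(rung, throughput)
    print(f"throughput {throughput:>3} Mbps -> playing the {LADDER_MBPS[rung]} Mbps rung")
```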
 
I'm using a $250 Wireworld cable, so I should be covered. It's certified high speed. I tried using other cables too and nothing helped.
 
I've got Match Frame Rate and Match Dynamic Range off, so every show I watch is in Dolby Vision. Maybe that causes issues? From what I've read in this thread, having the match settings turned on causes more problems.
 
Great that you're not having issues. I could be mistaken, but I think there might be another user in this thread with an Emotiva processor who didn't have issues. There was earlier speculation that certain manufacturers shared components in their designs that somehow cause issues with the latest Apple TV software/hardware. That could explain the discrepancy we see among different Apple TV users.
 
Lots of us with Marantz are having issues, but then you read about guys with simple soundbars from different manufacturers having trouble. One guy I read about just had his Apple TV plugged directly into his TV, and the TV had the audio dropouts!
 
"...speculation that certain manufactures shared components in their design..."

Agreed.
The problem lies in the Apple TV. It's hardware, software, or a combo of both.

Agreed. But there must be some common thread among certain users that exposes this problem, since others don't seem affected at all. I don't know how else to account for that discrepancy. My first inclination was that those claiming "no issue" just hadn't experienced it yet, or didn't really have Atmos configured correctly, or...? But the number of seemingly qualified responses claiming no problem really makes me question my assumptions. Do these users have processors that just "recover" from the bug, or are they not affected at all? Or maybe they really aren't configured the way we are. So many questions, so few answers!
 
"...speculation that certain manufactures shared components in their design..."

Agreed.


Agreed. But there must be some common thread amongst certain users that expose this problem as others don't seem affected at all. I don't know how else to account for that discrepancy. My first inclination was that those claiming "no issue" just hadn't experienced it yet, or didn't really have Atmos configured correctly, or...? But, the number of seemingly qualified responses claiming no problem really makes me question assumptions. Do these users have processors that just "recover" from the bug, or are they not affected at all? Or, maybe they indeed aren't configured similarly to us. So many questions, so little answers!
It's frustrating. Even I had stretches of months where I watched it without a single hiccup. I think that from mid-summer into last fall it was because the OS they had out was better in this respect.

Many guys who use it and don't have the problem just blame it on our hardware and brush it off. Apple seems that way too. The number of customers having the problem is growing, though, as more and more people buy it. Eventually they will need to address it or lose a lot of customers.
 
You make a good point about HDMI cables. I have discussed why I have ruled them out, but this is a very long thread. I have only a cheap 3 ft cable between my ATV4K Gen 2 and my Denon AVR. A cable that short should have minimal issues with reflections or other transmission-line effects, and it works perfectly with my Gen 1 ATV4K. I have also swapped the Gen 2 to another HDMI cable and input on my Denon AVR (I have both the Gen 1 and Gen 2 connected to the Denon so I can conveniently test the latest tvOS releases), with no difference in the Atmos problem. Also, I do not believe it makes any sense that an HDMI cable issue could specifically target Atmos audio and leave all other audio formats, and the more error-susceptible video, completely free of artifacts. I have run the "Check HDMI Connection" test numerous times and it always passes. In my case, I believe I can safely rule out the HDMI cable.

As for the Atmos decoder/processor, I have also discussed this. I suggested that different Atmos decoders/processors handle errors differently. Some may simply blank the audio to protect the speakers from potentially damaging pops and static, while others may fill the gap with the last good adjacent audio, similar to what CD players did to conceal errors caused by dirt and scratches. There will also be variation in how far outside the MAT 2.0 Atmos specification a signal can stray before the decoder/processor flags an error. This is all just my speculation.
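To make the two concealment strategies I am picturing concrete, here is a minimal sketch. It is purely illustrative: the block size is arbitrary and this is not based on any documented Atmos or MAT decoder behavior.

```python
# Two hypothetical ways a decoder might conceal a corrupted audio block:
# (a) mute it, which is safe for the speakers but heard as a dropout, or
# (b) repeat the last good block, the way CD players concealed read errors.
# Illustrative only; not how any real Atmos/MAT decoder is documented to work.

import math

BLOCK = 1024   # samples per block; arbitrary for the sketch
RATE = 48000   # sample rate in Hz

def conceal(blocks, bad_flags, strategy="mute"):
    """Return a playable sample stream, replacing blocks flagged as bad."""
    out = []
    last_good = [0.0] * BLOCK
    for block, bad in zip(blocks, bad_flags):
        if not bad:
            last_good = block
            out.extend(block)
        elif strategy == "mute":
            out.extend([0.0] * BLOCK)   # silence: heard as an Atmos dropout
        else:  # "hold"
            out.extend(last_good)       # repeat previous audio: far less audible
    return out

# Example: five blocks of a 440 Hz tone, with the third block corrupted in transit.
tone = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(5 * BLOCK)]
blocks = [tone[i:i + BLOCK] for i in range(0, len(tone), BLOCK)]
flags = [False, False, True, False, False]

muted = conceal(blocks, flags, "mute")   # a gap of silence where block 3 was
held = conceal(blocks, flags, "hold")    # block 2 repeated in place of block 3
print(len(muted), len(held))             # both streams are 5 * BLOCK samples long
```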

As I pointed out in a prior post, Ara Derderian, cohost of the "HT Guys" podcast (1050 weekly episodes and counting!), also never noticed an issue with Atmos until he played Drive to Survive at my suggestion. It is very possible that those who claim no issues with Atmos simply haven't played the more problematic source material. I believe I have proven that the Atmos issue requires very high video bandwidth, so the ISP service and other speed-limiting factors, such as Wi-Fi, may also play a part. No one with less than a 4K TV, or who is feeding the video through an AVR that doesn't support 4K (I'm not sure there are non-4K Atmos AVRs), will ever experience the Atmos problem.
 
As for the Atmos decoder/processor, I have also discussed this. I suggested that different Atmos decoders/processors handle errors differently.
All good points.
The chipset makes a pretty big difference between some brands. My brand, Emotiva, is in the minority with regard to the chipset they chose, and as such is behind in some ways, but this might be one difference between my system and others.

I failed to mention my ISP speed, which is 360 Mbps max; in the year or so it's been that fast, it hasn't measured below 170 Mbps.

I know that some brands of AVPs, like mine, don't report the Apple Dolby MAT audio container the same way most do. Might this play a role?

When I've had issues with video breakup in the last year, it's always been a loose cable somewhere.

I haven't updated the firmware on the ATV4K since 2-3 months ago. I'll check the version tonight, but it might be a couple versions behind. What's the thought about the newest versions?
 
I failed to mention my ISP speed, which is 360 Mbps max; in the year or so it's been that fast, it hasn't measured below 170 Mbps.

Should be okay, but keep in mind that your ISP bandwidth is shared among all your internet devices. I assume you have a wired connection to your ATV4K Gen 2 and are not using Wi-Fi.

I know that some brands of AVPs, like mine, don't report the Apple Dolby MAT audio container the same way most do. Might this play a role?


I doubt it. As long as your equipment is capable of decoding the 5.1 LPCM + MAT 2.0 from the ATV4K, you should be fine. My Denon reports some Netflix Atmos programs as multi-channel audio rather than the advertised Atmos. I don't know what that is about; perhaps Netflix is mislabeling the audio. Cracow Monsters is an example. Perhaps just the native-language track is Atmos; I'll have to check.

I haven't updated the firmware on the ATV4K since 2-3 months ago. I'll check the version tonight, but it might be a couple versions behind. What's the thought about the newest versions?

There were several reports that tvOS 15.4 caused Gen 1 units to exhibit the Atmos problem, so I assume it is also worse on the Gen 2, not that I have noticed any difference. For that reason I have kept my Gen 1 on tvOS 15.3. My Gen 2 is on the latest 15.5 beta, with no improvement.
 
My Denon reports some Netflix Atmos programs as multi-channel audio rather than the advertised Atmos.
I've noticed this with Apple Music Spatial Audio on the Apple TV. Specifically, on some "Made for Spatial Audio" playlists everything gets decoded as Atmos, but some tracks in the playlist (Pink Floyd's A Momentary Lapse of Reason remaster comes to mind) only come in as multi-channel. I, too, assumed it was mislabeling, but maybe something in the content is causing the Apple TV's decoding to choke?

I haven't updated the firmware on the ATV4K since 2-3 months ago. I'll check the version tonight, but it might be a couple versions behind. What's the thought about the newest versions?
I'm on the latest production (non-beta) release. Also no improvement. That said, I'm not sure whether any of the app updates require newer versions, but it was nice to get the native tvOS player in Netflix. There could be other bug fixes I'm not aware of. Honestly, though, I feel like the further we go, the more bugginess I encounter in tvOS.
 
It seems all of us use Ethernet with a fast connection too, and my bufferbloat tests all came back good, a B+ rating (there's a rough sketch of what that test measures at the end of this post).
My audio system experiences both static/popping sounds and silence, so both have happened on mine.

My $60 Amazon Firestick never did this.
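For anyone wondering what that bufferbloat grade reflects: the test compares round-trip latency on an idle line with latency while the connection is saturated, and a large increase means packets are queuing in oversized buffers. Below is a rough sketch of the idea; the ping target, the reliance on the system ping utility (Linux/macOS flags), and the grading thresholds are my own placeholders, not the methodology of any particular test site.

```python
# Rough illustration of a bufferbloat check: measure idle latency, then
# latency while the link is loaded, and grade the increase.
# The host, thresholds, and use of the system `ping` command are placeholder
# assumptions for the sketch.

import re
import statistics
import subprocess

def ping_ms(host="1.1.1.1", count=5):
    """Average round-trip time in ms using the system ping utility (Linux/macOS)."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    times = [float(m) for m in re.findall(r"time=([\d.]+)", out)]
    return statistics.mean(times)

def grade(added_latency_ms):
    """Toy grading scale based on latency added under load."""
    if added_latency_ms < 30:
        return "A"
    if added_latency_ms < 100:
        return "B"
    return "C or worse"

idle = ping_ms()
# ...start a saturating upload/download here, then measure again while it runs...
loaded = ping_ms()
print(f"idle {idle:.0f} ms, loaded {loaded:.0f} ms -> grade {grade(loaded - idle)}")
```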
 
My speed is usually around 550 Mbps, and it's at night when nobody else is using the internet, so the Apple TV has access to all of that.

Has anyone tried using Wi-Fi instead of hardwired Ethernet? I've seen numerous people using only Wi-Fi who had zero problems. Perhaps it's a problem with the Ethernet hardware in the device?

We should probably test it and see. I haven't brought myself to test it yet because it seems idiotic, but now I wonder.

With the Amazon Firestick, I saw reports saying that hardwired Ethernet had big problems. Amazon made a separate part you could buy to add Ethernet, and people warned not to waste your money because Wi-Fi worked better.
 
I was on Wi-Fi when it started; now I'm hardwired in. 1 Gbps service, gigabit Ethernet, no bufferbloat issues. I refuse to accept that it's networking.
 