2021 New-Gen Apple TV 4K Dolby Atmos Problems

MartinLogan Audio Owners Forum

So, how about the other streamers like the Amazon Firestick and Nvidia Shield? Don't they use MAT 2.0, or do they bitstream too?
Robert, all devices that send audio > 2ch LPCM to an AVR use MAT 2.0; however, the format of the payloads is different.
Non-Apple devices send the raw bitstream (DD+ for streamers, Dolby TrueHD for discs/rips), with the bed channels intact (undecoded).

An Apple TV takes the DD+ stream and decodes it to 5.1.x or 7.1.x bed channels in LPCM so it can optionally mix in audio responses from the UI. It then encapsulates these LPCM channels into a MAT format, flagging the bed channels as LPCM, and appends any Atmos extensions (still encoded in DD+ format).
It is likely that this added processing, combined with timing-related issues in the app (e.g. Netflix), leads to the glitch.
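
If it helps to see that flow in code, here is a minimal sketch of the repack step as I understand it. Every name below (DdpStream, MatFrame, the helper functions) is made up for illustration; none of this is a real tvOS or Dolby API.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative model only; none of these names are real tvOS or Dolby APIs.

@dataclass
class DdpStream:
    bed_frames: List[bytes]        # encoded DD+ bed channels (5.1.x / 7.1.x)
    atmos_substream: List[bytes]   # DD+-coded Atmos object extensions

@dataclass
class MatFrame:
    bed_payload: List[List[float]]   # decoded LPCM samples, flagged as LPCM in MAT
    extension_payload: List[bytes]   # Atmos extensions, passed through still DD+-coded

def decode_ddp_beds(stream: DdpStream) -> List[List[float]]:
    """Stand-in for the DD+ decoder: returns one block of LPCM samples per frame."""
    return [[0.0] * 1536 for _ in stream.bed_frames]   # 1536 samples per DD+ frame

def mix_ui_sounds(beds: List[List[float]], ui: List[List[float]]) -> List[List[float]]:
    """Mix UI feedback (Siri responses, clicks) into the decoded LPCM beds."""
    return [[a + b for a, b in zip(frame, u)] for frame, u in zip(beds, ui)]

def repack_for_hdmi(stream: DdpStream, ui: Optional[List[List[float]]] = None) -> MatFrame:
    beds = decode_ddp_beds(stream)      # 1. DD+ -> LPCM so the UI audio can be mixed in
    if ui:
        beds = mix_ui_sounds(beds, ui)  # 2. optional UI mix
    # 3. wrap LPCM beds + untouched DD+ Atmos extensions into one MAT 2.0 frame
    return MatFrame(bed_payload=beds, extension_payload=stream.atmos_substream)
```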

My weekend usage seems to point me (yet again, see my prior post) to the Netflix app as the primary culprit.

As a big F1 fan, I was excited about this weekend's race and the new season of Drive to Survive, so I got up early on Saturday to binge a couple of episodes. Well, 15 minutes in, I started getting audio dropouts on the AVR. I was still on 15.3, so I had the unit update to 15.4.
For good measure, I restarted the splitter and poked at the AVR video settings, as the receiver had reverted 4K HDR video to 'Standard' vs 'Extended'; Extended is required to support Dolby Vision. It seems a full power-down of the rig 4 days ago caused it to lose that setting (or, yet again, boot order is important; see this post).

Post-15.4 and the AVR change, I now have DV, but the Atmos soundtrack still randomly mutes, roughly every 10 minutes. The AVR maintains a lock on 'Atmos', which tells us the MAT 2.0 stream is stable in terms of types and is not re-synching, but that it is either affected by timing locally or, more likely, at the encode stage in the ATV.

On Sunday afternoon, we watched three episodes in a row of Boba Fett (Disney+ 4K DV, Atmos) and did not get a single audio glitch.
I then went back to Drive to Survive, and within 10 minutes got a dropout. Seemingly, the Netflix app has a problem on the ATV.

Since I have a Shield Pro, I'll test that next, but that should work, as it bitstreams the unmolested audio track.

As someone who has directed software teams for decades, much of it for multiplatform targets (Windows/Mac and iOS/Android), I can tell you there is a high likelihood the Netflix app glitch is due to 'common code' vs. platform code, especially since the ATV is unique in its local bed-channel decode approach.
 
I have the same audio issue watching Apple TV+ shows too. That REALLY surprised me, because you'd think they could get their own app right. The Apple TV+ app issue is much less frequent and a bit less severe. I had it the worst when watching the series See.

I think you're really onto something here, and correct about the cause. How does the whole totally-random thing work, though? You could watch three episodes of See and have zero problems, and then on the fourth episode it happens twice in the first 30 minutes. On the show For All Mankind it only happened once the entire series! I think it's three seasons; I forget. It's almost as if it somehow depends on the actual signal you're streaming? Some nights it's really bad.

I'm glad you came on the thread and offered your expertise. Have you contacted Apple? I can't remember. I wonder, if you communicate with them, whether you might be able to help.
 
So the decoding of the signal is needed so that the UI can mix in sound. I guess this is primarily for the Siri voice? The Amazon Firestick has its own voice, but does it handle it differently?

Too bad the Apple TV doesn't have a setting where we can turn off the decoding and just send the raw stream out to the AVR.
 
JonFo, why do you think the problem only happens on Atmos and not Dolby 5.1?

How does your Shield Pro compare to the Apple? Audio and video quality comparison and all other factors. I think I've been told that the UI of the Nvidia isn't as good.
 
@JonFo Formula1 finally released their AppleTV app, so I once again subscribed to check it out. Note that I have been watching on DirecTV - which I still have available - for prior seasons.

The video quality and sound quality were *vastly* better on the F1 app than on DTV. You can sign up for a 7 day trial of F1 to check it yourself on the next race. It was a simply wonderful experience.

Plus I don't have to try to figure out where the devil they're moving practice, and set up my DVR to grab that on DTV. And half the time when they move the race to a different channel, I've missed those too...



Edited to add: The "Pro" version - at least I think that's what they call it - is what I bought. It allows you to view your choice of telemetry during the live race (I assume, over an iPad or computer), plus I think it allows replaying coverage from the archives. It's not a ton of money, perhaps $79 for the season.

Did I mention how wonderful the sound is? And... no ads! This is the one thing that was a hard stop for canceling DTV for me, so now I'll probably revisit that as well.
 
JonFo, why do you think the problem only happens on Atmos and not Dolby 5.1?

How does your Shield Pro compare to the Apple? Audio and video quality comparison and all other factors. I think I've been told that the UI of the Nvidia isn't as good.

I am not speaking for @JonFo, but that was the clue that had me pointing my finger at the MAT 2.0 encoding/timing as the root cause, back when @JonFo was discussing EDID HDMI issues. If it is a MAT 2.0 encoding/timing issue, then it will only affect Atmos, as MAT 2.0 is only used for Atmos metadata.
 
@JonFo Formula1 finally released their AppleTV app, so I once again subscribed to check it out. Note that I have been watching on DirecTV - which I still have available - for prior seasons.

The video quality and sound quality were *vastly* better on the F1 app than on DTV. You can sign up for a 7 day trial of F1 to check it yourself on the next race. It was a simply wonderful experience.

Plus I don't have to try to figure out where the devil they're moving practice, and set up my DVR to grab that on DTV. And half the time when they move the race to a different channel, I've missed those too...
This is great news! Thanks for sharing.

I also hated that this weekend, the qualifying show was on one of the ESPN channels that is NOT included in my sub. Also, I agree about the challenges in scheduling recordings; it's a bear.
I believe my retention credits for DirecTV expire this month or next, and then I'll cancel for good (after a loooong run; I purchased the first unit ever sold in Atlanta in the early '90s).
 
JonFo, why do you think the problem only happens on Atmos and not Dolby 5.1?

Several factors mentioned in the thread, one being the higher bandwidth requirements (both network and HDMI) for the combination of 4K DV and high-bit-rate DD+ with Atmos. But the likely problem is in the decode/reassemble step required for Atmos as it leaves the ATV.
For DD+ 5.1, the ATV simply decodes to 5.1 LPCM and passes that along to the AVR via MAT; no additional re-assembly (and the time it takes) is required.
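
To make the contrast concrete, here is a tiny sketch of the two paths as just described; the labels are mine and purely illustrative.

```python
# Purely illustrative labels, not real API values.
def prepare_output(track: str) -> dict:
    """What the ATV hands to the MAT transport for each incoming track type."""
    if track == "DD+ 5.1":
        # One step: decode to 5.1 LPCM and pass it straight along via MAT.
        return {"beds": "5.1 LPCM", "extensions": None}
    if track == "DD+ Atmos":
        # Extra decode/reassemble work: LPCM beds plus re-attached DD+ object
        # extensions -- the step suspected of introducing the timing glitch.
        return {"beds": "5.1.x/7.1.x LPCM", "extensions": "DD+ Atmos objects"}
    raise ValueError(f"unhandled track type: {track}")

print(prepare_output("DD+ 5.1"))
print(prepare_output("DD+ Atmos"))
```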

How does your Shield Pro compare to the Apple? Audio and video quality comparison and all other factors. I think I've been told that the UI of the Nvidia isn't as good.
I'll use the Shield later today and report. I expect video to be the same, and audio to just work.
 
I am not speaking for @JonFo, but that was the clue that had me pointing my finger at the MAT 2.0 encoding/timing as the root cause, back when @JonFo was discussing EDID HDMI issues. If it is a MAT 2.0 encoding/timing issue, then it will only affect Atmos, as MAT 2.0 is only used for Atmos metadata.
Yes, you were pointing in the right direction.
Slight correction: MAT 2.0 is used for all formats other than native 2ch LPCM over HDMI. It is a flexible and extensible transport protocol (note that it is a transport, not an audio codec).
The formats and codecs used by the streams passed over it can be quite complex, and one of the newest, and obviously somewhat problematic, is the immersive Atmos variant with bed channels as LPCM and all object extensions in the DD+ codec.
This contrasts with the typical DD+ stream, where both the bed (still DD+ encoded) and the objects are in a single-codec stream.

There seems to be a timing-related component here regardless, as you have the possibility of incoming (to the ATV) streams shifting resolution (and therefore pacing, to a degree) due to networking variances (some of which are in the ISP or backbones).
Other timing issues do seem to be driven by AVR/TV behaviours, some of which could be EDID, others firmware related.
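
One compact way to summarize those payload variants (the labels are my own, based on the descriptions in this thread, not anything from a Dolby spec):

```python
from dataclasses import dataclass
from typing import Optional

# My own labels, summarizing the payload variants described in this thread.
# MAT 2.0 is only the transport; these describe what rides inside it.

@dataclass
class MatPayload:
    bed_coding: str                # how the bed channels are coded
    object_coding: Optional[str]   # how Atmos object extensions are coded, if present

PAYLOAD_VARIANTS = {
    # Decoded multichannel LPCM, e.g. a DD+ 5.1 track after the ATV decodes it
    "multichannel_lpcm": MatPayload(bed_coding="LPCM", object_coding=None),
    # The newer, apparently problematic immersive variant the ATV emits
    "immersive_atmos":   MatPayload(bed_coding="LPCM", object_coding="DD+"),
    # The conventional single-codec DD+ bitstream other streamers pass through
    "ddp_bitstream":     MatPayload(bed_coding="DD+",  object_coding="DD+"),
}
```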
 
...
I'm glad you came on the thread and offered your expertise. Have you contacted Apple? I can't remember. I wonder, if you communicate with them, you might be able to help.
I actually have contacts in Apple engineering (networking side); one is even an Apple Fellow. I'll reach out to him for a link-up with the tvOS team.
I also am in direct contact with a Director of Engineering at Netflix, as he is a customer of mine, and we chat about, well, networking ;-)
So if we need to ping them, I think I can get linked up.
 
Yes, you were pointing in the right direction.
Slight correction: MAT 2.0 is used for all formats other than native 2ch LPCM over HDMI. It is a flexible and extensible transport protocol (note that it is a transport, not an audio codec).
The formats and codecs used by the streams passed over it can be quite complex, and one of the newest, and obviously somewhat problematic, is the immersive Atmos variant with bed channels as LPCM and all object extensions in the DD+ codec.
This contrasts with the typical DD+ stream, where both the bed (still DD+ encoded) and the objects are in a single-codec stream.

There seems to be a timing-related component here regardless, as you have the possibility of incoming (to the ATV) streams shifting resolution (and therefore pacing, to a degree) due to networking variances (some of which are in the ISP or backbones).
Other timing issues do seem to be driven by AVR/TV behaviours, some of which could be EDID, others firmware related.

Thanks for the additional details. I assumed MAT 2.0 was only used to carry the Atmos metadata in the ATV4K. I am doubtful that the issue precedes the creation of the MAT 2.0 stream, as decoding issues might tend to corrupt other audio formats. I have also never noticed any video artifacts associated with the Atmos issues.

Do you have any better ideas why the issue only seems to affect the Gen 2? I have been speculating that there was a subtle difference in the hardware that the software did not take into account, affecting the MAT 2.0 creation/timing. I assumed the software was essentially identical for the Gen 1 and Gen 2, thinking Apple wanted to take the easy/cheaper route of simply spinning a new board to support the A12 Bionic and more current specs, requiring no/minimal changes to the software/tvOS. I am curious if Apple engineering is even looking into the Atmos issue.

Your expertise in the subject is greatly appreciated. I have none and can only make assumptions and conclusions at a high or layman's level.
 
So, re: video artifacts... over the weekend, while watching Apple TV+ (which does have the popping issue for me, albeit rarely, and without muting), I was getting this "skipping", or very, very brief pauses, in the video. Pretty minor, but certainly noticeable. I tried closing all apps, exiting and opening the TV app, etc., but to no avail. It went away after I did a full restart. Could this all be a thread or scheduler issue, maybe handled differently on the new h/w? Grasping at straws here. Eager to hear of any feedback @JonFo gets!
 
I am not speaking for @JonFo, but that was the clue that had me pointing my finger at the MAT 2.0 encoding/timing as the root cause, back when @JonFo was discussing EDID HDMI issues. If it is a MAT 2.0 encoding/timing issue, then it will only affect Atmos, as MAT 2.0 is only used for Atmos metadata.
Oh, OK! Makes perfect sense.
 
Several factors mentioned in the thread, one being the higher bandwidth requirements (both network and HDMI) for the combination of 4K DV and high-bit-rate DD+ with Atmos. But the likely problem is in the decode/reassemble step required for Atmos as it leaves the ATV.
For DD+ 5.1, the ATV simply decodes to 5.1 LPCM and passes that along to the AVR via MAT; no additional re-assembly (and the time it takes) is required.


I'll use the Shield later today and report. I expect video to be the same, and audio to just work.
Do you know what bit rate is needed for a combo of Dolby Vision and Atmos? I'm hitting a consistent 550 to 600 Mbps with low bufferbloat. So to me it must not be that. I'm on Ethernet too, hardwired.
 
I actually have contacts in Apple engineering (networking side); one is even an Apple Fellow. I'll reach out to him for a link-up with the tvOS team.
I also am in direct contact with a Director of Engineering at Netflix, as he is a customer of mine, and we chat about, well, networking ;-)
So if we need to ping them, I think I can get linked up.
WHOA! 🙂
 
It might help to re-post something I wrote for another Audio forum where we discussed this:

It seems to me that part of the challenge in this discussion is the conflation of various elements whose definitions likely need to be clarified.

Having followed the evolution of various containerized formats over the past 20 or more years, I find it common in threads like this to skip being really clear about a few key concepts, which to me are:

Transport
These are the standards used to relay content that is in a certain format and with a given coding between two or more devices in a playback chain.
Examples of these are: SPDIF, AES-EBU, HDMI, and Dolby MAT 2.0 over HDMI.

Dolby MAT is an interesting one, as at its core it is about transporting audio content across the high-capacity lanes provided by the 2 × 16-bit × 192 kHz 'audio' carrier lanes in the HDMI 1.3 and higher standards.
It aggregates the bandwidth of multiple lanes to provide an agnostic data-transport layer. Yet it also has elements of a 'format', as it negotiates the type of data to be relayed between sender and receiver based on mutual capabilities.

Streaming services use proprietary or open standards to deliver audio streams to the streaming device; those streams are then unpacked into the indicated format, typically DD+, and delivered to the playback device using MAT.
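
A quick back-of-envelope on those carrier figures; the four-lane aggregate is my own assumption for illustration.

```python
# Back-of-envelope on the HDMI audio carrier figures quoted above.
per_lane_bps = 2 * 16 * 192_000        # 2 ch x 16 bits x 192 kHz = 6.144 Mbps per lane

lanes = 4                              # assumed: four lanes aggregated for 8 channels
aggregate_bps = lanes * per_lane_bps   # ~24.6 Mbps of raw transport capacity

lpcm_71_bps = 8 * 24 * 48_000          # a 7.1 LPCM bed at 24-bit / 48 kHz ~ 9.2 Mbps

print(f"per lane:  {per_lane_bps / 1e6:.3f} Mbps")
print(f"aggregate: {aggregate_bps / 1e6:.3f} Mbps")
print(f"7.1 LPCM:  {lpcm_71_bps / 1e6:.3f} Mbps -> plenty left for a DD+ object substream")
```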

Formats
This is the structure used to convey the audio information, as defined by the standard it applies to. It organizes the elements that will be used to unpack and then render the audio.
These can be quite complex and varied, with multiple layers to handle optional format features (such as pre-rendered 2-ch downmixes).
Examples of these are: 2ch LPCM, AC3, DTS, Dolby Digital+, DTS-HD MA, Dolby TrueHD, and Dolby Atmos.

Part of the confusion stems from the fact that formats are often highly intertwined with their respective codecs. Modern formats can carry a variety of codecs (lossy vs. lossless) within the same format, which adds further confusion.
Formats specify what metadata is required or optional to describe the associated audio streams.

I have evidence from my Smyth A16 that an ATV4K sends audio over to the processor (using a MAT transport) in a TrueHD format (but with LPCM beds + lossy object streams).

Coding
These are the actual codec (coder/decoder) types used to compress (lossily or losslessly), encode, and decode the audio streams.
Examples of these are: FLAC, Meridian Lossless Packing (MLP, a.k.a. TrueHD lossless), and the DTS lossy and lossless codecs.

Rendering
This is the step a processor takes once it has unpacked the payloads in the format and decoded them per spec: actually rendering to the selected output channels.
For 2ch, it's easy: decode to two LPCM streams of the appropriate bit depth, apply volume trim and any DSP (like room correction), and pass them to the DACs.

As has surfaced in this discussion, rendering an immersive audio stream can be complex, with several variations in how to deal with bed channels and positional streams.

My point here is that it would be helpful if we used terms such as the above to clarify whether we are talking about elements of a format, the coding of streams within the format, or the process of rendering the supplied streams and then outputting them to a given speaker configuration.
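
If it helps, the same vocabulary can be jotted down as a tiny data model. This is just my shorthand, nothing normative:

```python
from dataclasses import dataclass
from typing import List

# Shorthand data model for the four concepts above; nothing normative here.

@dataclass
class AudioPath:
    transport: str       # e.g. "HDMI + Dolby MAT 2.0", "SPDIF", "AES-EBU"
    format: str          # e.g. "Dolby TrueHD", "Dolby Digital+", "2ch LPCM"
    codings: List[str]   # one entry per substream, e.g. ["LPCM", "DD+"]
    rendering: str       # what the processor ultimately does with it

# The easy 2ch case from above:
stereo = AudioPath(
    transport="HDMI (or SPDIF)",
    format="2ch LPCM",
    codings=["LPCM"],
    rendering="volume trim + DSP (room correction) -> DACs",
)
```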
 
And another little tidbit:

... the ATV normalizes incoming bed channels to LPCM so it can add UI feedback, then re-packs additional audio in some format.
It even uses MAT to pass 2ch LPCM when playing Apple Music stereo tracks, so as to minimize transport type changes (better stability).

So, to my points above, MAT is used as the transport, and, if we believe the Realiser, the format containing the audio data is .thd (the TrueHD format, not the compression), containing LPCM streams for the bed channels (perfectly legit in the TrueHD format to use no compression) and the Atmos extension streams pulled from the DD+ unpacking, now placed into a TrueHD format (IIRC, DD+ is a subset of the TrueHD format, so elements can just be copied over).

Again, normalizing to my terminology above, what the Realiser calls 'Image' I call the transport type. It displays 'Atmos' when MAT 2.0 is in use, even when it is conveying 8 channels of LPCM (only 2 of which carry the L/R LPCM when playing 2ch Apple Music).
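
Putting the Realiser's readout into that same vocabulary (the mapping and labels are mine, not Smyth's):

```python
# Labels are mine, not Smyth's; this just restates the observation above.
realiser_readout = {
    "Image (what the A16 displays)": "Atmos",   # really the MAT 2.0 transport type
    "Transport": "Dolby MAT 2.0 over HDMI",
    "Format": ".thd (TrueHD-style container)",
    "Coding": "LPCM bed channels + DD+-coded Atmos object extensions",
}
for layer, value in realiser_readout.items():
    print(f"{layer:30s} -> {value}")
```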
 
So, re: video artifacts... over the weekend, while watching Apple TV+ (which does have the popping issue for me, albeit rarely, and without muting), I was getting this "skipping", or very, very brief pauses, in the video. Pretty minor, but certainly noticeable. I tried closing all apps, exiting and opening the TV app, etc., but to no avail. It went away after I did a full restart. Could this all be a thread or scheduler issue, maybe handled differently on the new h/w? Grasping at straws here. Eager to hear of any feedback @JonFo gets!
Was it Dolby Vision?
 
Do you know what bit rate is needed for a combo of Dolby Vision and Atmos? I'm hitting a consistent 550 to 600 Mbps with low bufferbloat. So to me it must not be that. I'm on Ethernet too, hardwired.
For the network side, one can easily pull 4K DV + Atmos at 50 Mbps, as long as bufferbloat is contained. The actual stream maxes out at something like 30 Mbps.

To get a sense of just how much capacity 600 Mbps is relative to the actual load, here is the capacity chart for Sunday (yesterday) from my IQrouter Pro on my gigabit line. The traffic manager was manually lowered to 700, as my ISP can't really deliver more than that reliably; plus, as you can see, we don't ever even use that at peak.
I watched Drive to Survive at 8am while my wife was also banging away at the Internet. The average (green) for the 8am hour barely registers, less than 50 Mbps.
From 4pm to 8pm, we binged on Boba Fett (4K DV Atmos); again the average is barely around 25 Mbps, and the peaks are higher, but all below 350 Mbps.
[Attached image: IQrouterPro_GigLine_Streaming.png — IQrouter Pro capacity chart for the day]


So even with that kind of headroom, I still experienced audio issues, so it's definitely not the network here either.
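
For the record, the headroom math using the figures above:

```python
# Headroom check using the figures above.
stream_peak_mbps  = 30    # rough max for a 4K DV + Atmos stream
shaped_line_mbps  = 700   # traffic-manager ceiling on the gigabit line
measured_avg_mbps = 25    # typical hourly average seen while streaming

print(f"stream peak:  {stream_peak_mbps / shaped_line_mbps:.1%} of line capacity")
print(f"measured avg: {measured_avg_mbps / shaped_line_mbps:.1%} of line capacity")
# Roughly 4% and 3.6% -- which is why the dropouts point at the device/app, not the network.
```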
 
For the network side, one can easily pull 4K DV + Atmos at 50 Mbps, as long as bufferbloat is contained. The actual stream maxes out at something like 30 Mbps.

So even with that kind of headroom, I still experienced audio issues, so it's definitely not the network here either.

That doesn't surprise me. I never believed a network issue could cause the Atmos issues without also affecting the video. I assumed that all a network issue could do was cause the streamer to eat into its buffer a bit, potentially resulting in a lowering of the video bandwidth and image quality if the network issues persisted, which I believe you have previously discussed as a possibility.

If changes to the video stream bandwidth/quality were related to the Atmos issue, I guess you might also expect Atmos problems at the start of any Atmos stream, except that in that case the video stream starts out at a lower data rate, even though the network bandwidth should initially be much higher as the streamer fills its buffer and ramps up the video quality.
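
Just to picture that buffering behaviour, here's a toy model (all numbers invented); a network dip eats into the buffer and steps video quality down long before anything should drop out:

```python
# Toy model of adaptive-streaming buffer behaviour; all numbers are invented.
def simulate(buffer_s: float = 12.0, throughput_ratio: float = 0.5, seconds: int = 10):
    """throughput_ratio = network rate relative to the current stream bitrate."""
    quality = "high"
    for t in range(1, seconds + 1):
        buffer_s += throughput_ratio - 1.0         # refill minus playback drain
        if buffer_s < 10.0 and quality == "high":
            quality = "stepped-down video bitrate"  # ABR reacts before the buffer empties
        print(f"t={t:2d}s  buffer={buffer_s:5.1f}s  quality={quality}")

simulate()   # a sustained 50% throughput dip: the buffer shrinks, video quality steps down
```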
 
