1. Summary
When a single MediaSource has two active SourceBuffers, both receiving Shaka’s mux.js-produced fragmented MP4 (one audio/mp4; codecs="ac3" and one video/mp4; codecs="avc1.4D401F"), Vega’s mediasourcebin cannot construct a working pipeline for the second registered buffer. After both buffers successfully accept their first init+segment append (no error event on either), playback transitions to playing, and then the gstreamer pipeline aborts with:
```
GstAppSrc:appsrc_1: streaming stopped, reason not-linked (-1)
```
The same fMP4 H.264 bytes from mux.js play correctly when they are the only SourceBuffer attached to the MediaSource. The bug is therefore in how mediasourcebin handles two simultaneously-active fMP4 SourceBuffers, not in the H.264 or AC‑3 paths individually.
- App Name: Tablo
- Bug Severity: Blocks current development (this is the bug that prevents any client-only fix for AC‑3 audio in muxed-TS HLS - without it, we could ship a transmuxed two-buffer path right now)
2. Steps to Reproduce
- Master playlist with combined codec list:

  ```
  #EXTM3U
  #EXT-X-VERSION:4
  #EXT-X-STREAM-INF:BANDWIDTH=10000000,CODECS="avc1.4D401F,ac3"
  pls.m3u8
  ```

  The media playlist serves muxed MPEG-TS segments (H.264 video + AC‑3 audio).
- Load with Shaka Player 4.x configured with:

  ```javascript
  shakaPlayer.configure({ mediaSource: { forceTransmux: true } });
  ```

- Have the MediaCapabilities.decodingInfo polyfill from Bug 1 installed (otherwise the variant is rejected before this code path is reached).
- Start playback.
3. Observed Behavior
Two SourceBuffers are created, both backed by mux.js transmux output:
```
MediaSource[1]::addSourceBuffer: audio/mp4; codecs="ac3"         → SourceBuffer[0] (id 0)
MediaSource[1]::addSourceBuffer: video/mp4; codecs="avc1.4D401F" → SourceBuffer[1] (id 1)
```
Both buffers accept their first init+segment append cleanly:
```
SourceBuffer[0]:appendBuffer(86139)   // audio init+seg
SourceBuffer[0]:handleEvent: 2,4,2,2  // updateend, no error
SourceBuffer[1]:appendBuffer(434399)  // video init+seg
SourceBuffer[1]:handleEvent: 2,5
MediaSource[1]:rebuildActiveSourceBuffersList
```
Playback transitions to playing. Then the <video> element fires an error event with:
```
{
  code: 3016, // MEDIA_ELEMENT VIDEO_ERROR
  category: 3,
  data: [3, undefined,
    "error msg[ Internal data stream error. ]
     debug info[ ../gstreamer-1.20.1/libs/gst/base/gstbasesrc.c(3127):
     gst_base_src_loop ():
     /GstPipeline:av-player/GstBin:mediasourcebin/GstAppSrc:appsrc_1:
     streaming stopped, reason not-linked (-1) ]"
  ]
}
```
appsrc_1 is the second registered SourceBuffer (= the video one in our case). not-linked indicates a pad in that branch’s pipeline (downstream of qtdemux) had no consumer linked.
Critical control: the same exact 434,399-byte mux.js fMP4 video buffer plays cleanly when it is the only SourceBuffer on the MediaSource (i.e. master playlist with CODECS="avc1.4D401F" only, same forceTransmux: true):
```
addSourceBuffer: video/mp4; codecs="avc1.4D401F"
SourceBuffer[0]:appendBuffer(434399)  // identical bytes
SourceBuffer[0]:handleEvent: 2,5,2,2
… multiple appends, all updateend, no errors …
[handleStateChangedEvent] newstate: playing
```
So the H.264 fMP4 from mux.js is fine; the AC‑3 fMP4 from mux.js is fine (its SourceBuffer ACKs init without error in the failing run); only the combination fails.
4. Expected Behavior
mediasourcebin should be able to construct an independent gstreamer pipeline branch (appsrc > qtdemux > decoder > sink) for each SourceBuffer on the same MediaSource, regardless of whether other branches’ init segments use overlapping internal identifiers. Two MSE SourceBuffers attached to the same MediaSource is a routine MSE configuration and the pipeline construction must not depend on cross-buffer uniqueness of MP4 track_ids, tfdt decode times, or moof sequence numbers.
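The expectation can be sketched as a per-buffer namespace. The toy registry below (all names are hypothetical, not actual mediasourcebin internals) keys demuxer pads by the pair (SourceBuffer id, track_id) rather than bare track_id, which makes cross-buffer collisions impossible by construction:

```javascript
// Toy model of the pad-linking bookkeeping we would expect: pads are
// registered under a compound (sourceBufferId, trackId) key, so two init
// segments that both declare track_id = 1 remain distinct entries.
class BranchRegistry {
  constructor() {
    this.pads = new Map();
  }
  link(sourceBufferId, trackId, sinkPad) {
    this.pads.set(`${sourceBufferId}:${trackId}`, sinkPad);
  }
  lookup(sourceBufferId, trackId) {
    return this.pads.get(`${sourceBufferId}:${trackId}`);
  }
}
```

If the key were the bare trackId instead, the second link() would silently overwrite the first and one branch would be left without a consumer, which would match the not-linked symptom observed here.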
4.a Possible Root Cause & Temporary Workaround
Suspected root cause: mux.js produces each MP4 init segment independently and emits its single track as track_id = 1. When two such SourceBuffers are attached, both init segments declare track_id = 1. The most likely failure modes are:
- mediasourcebin (or the shared qtdemux/decodebin it uses internally) requires globally unique track_ids across SourceBuffers on the same MediaSource, hits a collision, and the second branch’s pad never gets linked downstream → not-linked.
- Independent tfdt/moof sequence numbers across the two branches confuse a synchronization step in mediasourcebin.
A useful diagnostic on your side would be to log gstreamer pad-added / pad-linked events for the two pipeline branches when this fails, and confirm whether the second branch’s qtdemux ever emits a pad-added for its video track or whether it stalls earlier.
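To test the track_id hypothesis from our side, a client-side probe could rewrite one buffer’s track_id before appending, so that the two init segments no longer both declare track_id = 1. The sketch below is our hypothetical helper, not shipped code: it walks the ISO-BMFF box tree just deep enough for mux.js’s small, predictable output and is not a general MP4 parser (64-bit and to-end box sizes are not handled).

```javascript
// Hypothetical probe for the track_id-collision theory: rewrite track_ID in
// tkhd/trex (init segment) and tfhd (media segments) to a new value.
function patchTrackId(bytes, newTrackId) {
  const out = new Uint8Array(bytes); // work on a copy, leave input intact
  const view = new DataView(out.buffer, out.byteOffset, out.byteLength);
  const type4 = (p) => String.fromCharCode(out[p], out[p + 1], out[p + 2], out[p + 3]);
  const containers = ['moov', 'trak', 'mvex', 'moof', 'traf'];
  let pos = 0;
  while (pos + 8 <= out.length) {
    const size = view.getUint32(pos);
    const type = type4(pos + 4);
    if (type === 'tkhd') {
      // FullBox: version/flags (4 bytes), then creation/modification times
      // (4+4 for version 0, 8+8 for version 1), then track_ID.
      const version = out[pos + 8];
      view.setUint32(pos + 8 + (version === 1 ? 20 : 12), newTrackId);
    } else if (type === 'trex' || type === 'tfhd') {
      view.setUint32(pos + 12, newTrackId); // track_ID follows version/flags
    }
    pos += containers.includes(type) ? 8 : Math.max(size, 8); // descend or skip
  }
  return out;
}
```

tfhd is included because every media segment’s moof also carries the track_ID, so the rewrite would have to be applied to each appended segment, not just the init segment. If appending a patched second buffer makes the not-linked abort disappear, that would confirm the collision hypothesis.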
Temporary workaround: none on the client. We have evaluated:
| Configuration | Result |
|---|---|
| Native TS, codecs "avc1+ac3" | Single muxed video/mp2t buffer. Video plays. Audio silently dropped (Bug 2). |
| forceTransmux: true, codecs "avc1+ac3" | Two fMP4 buffers. This bug — pipeline aborts with code 3016. |
| forceTransmux: true, codecs "avc1" only | Single video/mp4 buffer. Video plays cleanly. No audio. |
There is no fourth option that yields synchronized H.264 video + AC‑3 audio from a muxed MPEG‑TS source on Vega today.
The only end-to-end workarounds available to us are server-side: serve the audio in a separate EXT-X-MEDIA:TYPE=AUDIO group (so only one SourceBuffer is ever transmuxed), or transcode AC‑3 → AAC and stay on the native MP2T path. Both require backend changes by our origin, which is not always under our control.
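For concreteness, the EXT-X-MEDIA variant of the server-side workaround would reshape the master playlist along these lines (URIs, group ID, and rendition name are illustrative, not our origin’s actual layout):

```
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="English",DEFAULT=YES,AUTOSELECT=YES,URI="audio/ac3.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=10000000,CODECS="avc1.4D401F,ac3",AUDIO="audio"
video/avc1.m3u8
```

With the audio carried in its own rendition group, only one of the resulting SourceBuffers ever needs transmuxing, so the two-buffer fMP4 configuration in this report is never constructed.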
6. Environment
- SDK Version: 0.22.5600
- App State: Foreground
- OS Information:

  ```
  NAME="OS" OE_VERSION="4.0.0" OS_MAJOR_VERSION="1" OS_MINOR_VERSION="1" RELEASE_ID="14" OS_VERSION="1.1" BRANCH_CODE="TV Ship day60" BUILD_DESC="OS 1.1 (TV Ship day60/99)" BUILD_FINGERPRINT="4.0.227617.0(3072cab629675a74)/99N:user-external/release-keys" BUILD_VARIANT="user-external" BUILD_TAGS="release-keys" BUILD_DATE="Tue Feb 17 22:57:51 UTC 2026" BUILD_TIMESTAMP="1771369071" VERSION_NUMBER="1401010009950"
  ```

- Device: Fire TV Stick 4K Select
- React Native: system-bundled (Vega 0.22 / RN 0.72)
- Player SDK: Shaka Player 4.8.5 (from the docs)
- gstreamer (per error message): 1.20.1
7. Example Code Snippet
```javascript
import { ShakaPlayer } from './vegaShakaPlayer';

const shakaPlayer = new ShakaPlayer(videoPlayerRef.current);
await shakaPlayer.load(
  {
    uri: 'http://<host>/stream/pl.m3u8',
    startTime: 0,
    vcodec: '',
    acodec: '',
    drm_scheme: '',
    drm_license_uri: '',
    secure: 'false',
  },
  /* loadAtBeginning */ false,
  {
    mediaSource: { forceTransmux: true },
    streaming: { lowLatencyMode: false, bufferingGoal: 15 },
    manifest: { hls: { sequenceMode: true } },
  },
);
```
The master playlist returned for that URL declares CODECS="avc1.4D401F,ac3" and the media playlist serves muxed MPEG-TS segments with one H.264 PID and one AC‑3 PID.
Playback Issues
- Player SDK: Shaka Player
- Player SDK Version: 4.8.5
- Audio Codecs: AC‑3 (transmuxed by mux.js into audio/mp4; codecs="ac3")
- Video Codecs: H.264 / avc1.4D401F (transmuxed by mux.js into video/mp4; codecs="avc1.4D401F")
- Manifest Types: HLS (m3u8), source segments are muxed MPEG-TS
Additional Context
This is the highest-impact of three related W3C-Media issues we’ve filed against Vega 0.22:
- Bug 1: MediaCapabilities.decodingInfo throws on non-AV1 codec strings
- Bug 2: MediaSource.isTypeSupported overstates support for video/mp2t; codecs="…,ac3"
- Bug 3 (this report): mediasourcebin fails on two simultaneously-active mux.js fMP4 SourceBuffers
Together, they make muxed-TS HLS with AC‑3 audio (a very common configuration — it’s the default for ffmpeg AC‑3 passthrough, and the on-the-wire format for several major OTA DVR products) effectively unplayable on Vega from the client side.