Broadcasting: Mux or Demux? What The Heck Is That About?

In broadcasting, muxing and demuxing are essential processes that allow for the transmission and distribution of audio and video streams.

Muxing, or multiplexing, is the process of combining multiple audio and video streams into a single stream. This combined stream can be transmitted over a network or broadcast through traditional media channels like television or radio. Muxing is commonly used in live streaming, video editing, video conferencing, and IPTV.

A mux works by taking multiple input streams and interleaving them into a single output stream, which can then be encoded and transmitted over a network using a specific protocol. The output stream is typically optimized for transmission efficiency, so it can be delivered with minimal delay and bandwidth overhead.
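To make the interleaving idea concrete, here is a minimal Python sketch, not any real container format like MPEG-TS or MP4: toy packets from hypothetical video and audio streams, each carrying a timestamp and a stream ID, are merged into one timestamp-ordered output sequence.

```python
# A minimal sketch of the interleaving step at the heart of a mux.
# Real container formats (MPEG-TS, MP4, MKV) add headers, timing models,
# and error handling on top of this basic idea.

import heapq

# Toy packets: (timestamp_seconds, stream_id, payload)
video = [(t / 30, "video", b"frame") for t in range(3)]   # ~30 fps video
audio = [(t / 50, "audio", b"sample") for t in range(5)]  # audio packets

def mux(*streams):
    """Interleave several timestamp-sorted packet lists into one output."""
    return list(heapq.merge(*streams))  # tuples sort by timestamp first

for ts, stream_id, _payload in mux(video, audio):
    print(f"{ts:.3f}s  {stream_id}")
```

A real multiplexer would also write container headers and carry clock references so the receiver can keep audio and video in sync, but the core job is exactly this kind of ordered interleaving.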

10 use cases for a mux:

1. Live streaming: A mux can be used to combine multiple live audio and video feeds into a single stream for real-time broadcast.

2. Video editing: A mux can be used to combine multiple video tracks into a single output file for editing or post-production.

3. Video surveillance: A mux can combine multiple video feeds from surveillance cameras into a single stream for monitoring and recording.

4. IPTV: A mux can be used by IPTV providers to combine multiple TV channels into a single stream for distribution over the internet.

5. VoIP: A mux can be used to combine multiple voice streams into a single output stream for voice over IP (VoIP) applications.

6. Music production: A mux can be used to combine multiple audio tracks into a single output file for music production or mixing.

7. Video conferencing: A mux can be used to combine multiple audio and video feeds from participants in a video conference into a single output stream.

8. Digital signage: A mux can be used to combine multiple video feeds for display on digital signage screens.

9. Sports broadcasting: A mux can be used to combine multiple audio and video feeds from different cameras and microphones at a sports event into a single broadcast stream.

10. Online gaming: A mux can be used to combine multiple audio and video streams from players in an online multiplayer game into a single stream for spectators to watch.

Conversely….

Demuxing, or demultiplexing, is the opposite process: separating the combined stream back into its individual audio and video streams so that each one can be decoded and processed separately. Demuxing is commonly used in media playback, video editing, audio processing, and network monitoring.

A demux works by analyzing the input stream and separating it into its constituent parts based on the underlying format and structure of the stream. The output streams can then be decoded or processed separately using appropriate software or hardware.
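Continuing the toy example, a minimal demux sketch simply routes each packet back to its own stream by its stream ID; a real demuxer gets the same information by parsing the container's structure (PIDs in MPEG-TS, track IDs in MP4, and so on).

```python
# A minimal sketch of the demuxing idea: walk an interleaved packet
# sequence and route each packet back to its own stream by stream ID.

from collections import defaultdict

# Toy interleaved stream: (stream_id, timestamp_seconds, payload) tuples.
muxed = [
    ("audio", 0.00, b"sample"),
    ("video", 0.00, b"frame"),
    ("audio", 0.02, b"sample"),
    ("video", 0.03, b"frame"),
    ("subs",  0.50, b"caption"),
]

def demux(packets):
    """Split an interleaved packet list back into per-stream lists."""
    streams = defaultdict(list)
    for stream_id, ts, payload in packets:
        streams[stream_id].append((ts, payload))
    return dict(streams)

for stream_id, packets in demux(muxed).items():
    print(f"{stream_id}: {len(packets)} packet(s)")
```

Each recovered stream can then be handed to the appropriate decoder or processing step independently.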

10 use cases for demuxing:

1. Media playback: A media player uses a demux to separate the audio and video tracks of a media file, so that they can be decoded and played back separately.

2. Video editing: A demux can be used to separate multiple video tracks from a single media file for editing or post-production.

3. Audio processing: A demux can be used to separate multiple audio tracks from a media file for processing or analysis.

4. Closed captioning: A demux can be used to separate the closed captioning data from a video file, so that it can be displayed separately.

5. Subtitles: A demux can be used to separate the subtitle data from a video file, so that it can be displayed separately.

6. Video transcoding: A demux can be used to separate the audio and video tracks of a media file for transcoding into a different format or resolution.

7. Network monitoring: A demux can be used to analyze network traffic and separate different types of data packets for monitoring or analysis.

8. Digital forensics: A demux can be used to extract individual files or data streams from a larger disk image or data file for forensic analysis.

9. Compression: A demux can be used to separate different data streams for compression or archiving purposes.

10. Streaming: A demux can be used to separate audio and video streams from a network broadcast for playback on different devices, or for further processing and analysis.

Both muxing and demuxing are critical processes in broadcasting that allow for efficient transmission and distribution of audio and video streams. These processes are used in a wide range of applications, from live sports broadcasting to online gaming, and are essential for ensuring high-quality audio and video transmission.

Comment, Like, and/or Subscribe- it’s free!

How much bandwidth do you need?

Sufficient bandwidth is essential for an optimal streaming experience, so let’s dig into the bandwidth requirements for different resolutions and streaming services.

Understanding Video Bitrate

Video bitrate is an important metric in its own right, separate from other factors like resolution, frame rate, and audio quality that also impact a viewer’s streaming experience. It represents the amount of data per second your video source supplies and is a critical factor in delivering an enjoyable experience.
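As a rough back-of-the-envelope aid, bitrate translates directly into data volume: divide megabits per second by 8 to get megabytes per second, then multiply by the duration. The sketch below uses illustrative bitrates, not figures from any particular service.

```python
# Convert a video bitrate (Mbps) into data consumed per hour.
# 1 byte = 8 bits; decimal units are used (1 GB = 1000^3 bytes), as ISPs do.

def data_per_hour_gb(bitrate_mbps: float) -> float:
    """Gigabytes downloaded in one hour at a constant bitrate."""
    bits_per_hour = bitrate_mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1_000_000_000

for mbps in (6, 10, 32):  # illustrative 1080p, high-quality 1080p, 4K figures
    print(f"{mbps:>3} Mbps  ->  {data_per_hour_gb(mbps):.2f} GB/hour")
# 6 Mbps -> 2.70 GB/hour, 10 Mbps -> 4.50 GB/hour, 32 Mbps -> 14.40 GB/hour
```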

Streaming Services and Bitrate

It’s interesting to note that, compared to a Blu-ray disc, streaming services like Netflix have to use compressed streams with considerably lower bitrates. Despite their best efforts to maintain quality through various compression techniques, a higher bitrate still means more data and, generally, a superior image.

Minimum Bandwidth Required

To sustain a smooth, buffer-free stream at varying resolutions, consider the average minimum bandwidth requirements below. Whether viewers are on older equipment or on new streaming devices paired with the latest TV models, broadcasters generally aim to provide the best possible streaming experience.

The following are the general video resolutions and the minimum download speeds they require:

480p (SD):  Needs about 3-4 Mbps

720p (HD):  Needs about 5-8 Mbps

1080p (HD):  Needs about 8-10 Mbps

2160p (4K):  Needs about 32 Mbps

4320p (8K): Needs about 120 Mbps

Required Bandwidth

Resolution            H.264      H.265

1280×720 (HD)         3 Mbps     1.5 Mbps

1920×1080 (FHD)       6 Mbps     3 Mbps

3840×2160 (UHD)       25 Mbps    12 Mbps

3840×2160 (4K)        32 Mbps    15 Mbps

7680×4320 (8K)        See notes below
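To sanity-check a connection against the table above, a small lookup helper might look like the sketch below. The numbers are simply the table values (using the 4K rows for 2160p), and the 20% headroom for protocol and player overhead is an assumption, not a standard.

```python
# Recommended bandwidth (Mbps) per resolution and codec, taken from the
# table above. Ballpark figures, not a formal standard.

RECOMMENDED_MBPS = {
    ("720p",  "h264"): 3,   ("720p",  "h265"): 1.5,
    ("1080p", "h264"): 6,   ("1080p", "h265"): 3,
    ("2160p", "h264"): 32,  ("2160p", "h265"): 15,
}

def can_stream(resolution: str, codec: str, download_mbps: float) -> bool:
    """True if the connection comfortably covers the recommended bitrate."""
    needed = RECOMMENDED_MBPS[(resolution, codec.lower())]
    return download_mbps >= needed * 1.2  # assumed ~20% headroom for overhead

print(can_stream("1080p", "H265", 5))   # True:  5 Mbps  > 3 Mbps * 1.2
print(can_stream("2160p", "H264", 25))  # False: 25 Mbps < 32 Mbps * 1.2
```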

1080p Streaming: Required Bandwidth & Internet Speed

1080p streaming video has a display resolution of 1920×1080 and delivers full HD content over the internet. It offers more clarity and detail than HD video at 720p, and it also consumes more data than SD and HD streaming. As shown in the table above, the recommended bandwidth with the H.264 codec is 6 Mbps, while the H.265 codec usually requires around 3 Mbps.

4K and 8K Streaming Bandwidth Requirements / Internet Speed

4K video, with a display resolution of 4096×2160 (or 3840×2160 for UHD), offers some of the most lifelike video content on the internet. These high-definition videos carry far more visual information about texture, color, and shape than HD video. Unfortunately, 4K also consumes enormous amounts of data compared to SD, HD, and FHD streaming. With the H.264 codec, the recommended bandwidth is 32 Mbps; with the H.265 codec, it is around 15 Mbps. To stream 4K HDR content, you need a 4K UHD TV with an HEVC decoder and HDR support.

Even if an 8K streaming service were available, most people wouldn’t have the bandwidth to use it. Platforms like Netflix specify a 25 Mbps connection for 4K content, and that requirement roughly quadruples for 8K if codec efficiency doesn’t improve to offset the larger file size. Netflix consumes about 3.1 GB/hour for 1080p 60 fps video and as much as 7 GB/hour at 4K. If we assume the jump from 4K to 8K scales bandwidth roughly like the jump from 1080p to 4K, streaming 8K content would require somewhere around 6.44 to 19.2 GB per hour for 23.976 fps content. That is still a lot of bandwidth to burn through.
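To put those per-hour figures in context, converting GB/hour back to an average bitrate is simple arithmetic (assuming decimal gigabytes and a constant stream). The sketch below runs the conversion on the numbers quoted above.

```python
# Rough conversion from GB consumed per hour to average bitrate in Mbps.

def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    """Average bitrate (Mbps) implied by a given hourly data consumption."""
    bits_per_hour = gb_per_hour * 1_000_000_000 * 8
    return bits_per_hour / 3600 / 1_000_000

for label, gb in [("1080p (3.1 GB/h)", 3.1), ("4K (7 GB/h)", 7.0),
                  ("8K low (6.44 GB/h)", 6.44), ("8K high (19.2 GB/h)", 19.2)]:
    print(f"{label:<20} ~{gb_per_hour_to_mbps(gb):.1f} Mbps average")
# ~6.9, ~15.6, ~14.3, and ~42.7 Mbps average, respectively
```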

For 8K video services that use HEVC, the bitrate is roughly 85 Mbps for satellite delivery and 65 Mbps for OTT.

When Content-Aware Encoding (CAE) is used in combination with HEVC, you can lower the bitrate for 8K distribution by another 50%, which would bring a 65 Mbps OTT stream down to roughly 32 Mbps. CAE leverages the mechanics of the human eye to assess video quality and optimize encoding parameters in real time.

If you have any questions please reach out. 👍 Follow, and Comment- it’s free!