Streaming Protocols for Live Broadcasting: Everything You Need to Know [2023 Update]

Live streaming is a phenomenon that continues to grow. In fact, a massive 80% of people in one study said they’d rather watch a live stream than read a blog post. Furthermore, live streams are watched for 27% more minutes than standard on-demand video.

As live video becomes the preferred type of media, businesses and broadcasters have gravitated toward delivering it through professional streaming platforms. But when it comes to live broadcasting, there’s a lot of technology at work behind the scenes. Consequently, setting up the proper live streaming protocols can be a daunting task.

Thankfully, we’re going to clarify the most important aspects of live streaming. We’ll explain what streaming protocols are, detail the six most prominent live streaming protocols used by professional broadcasters today, and compare related technologies, including codecs and video streaming formats. By the end, you’ll know the best streaming protocol for your specific application.

Table of Contents

  • The Basics of Streaming Protocols
  • Live Streaming Protocol vs. Codec
  • Streaming Protocols vs. Video Streaming Formats
  • 6 Common Video Streaming Protocols
  • Final Thoughts

The Basics of Streaming Protocols

Streaming protocols make up one of the building blocks of professional broadcasting.

A streaming protocol, also known as a broadcast protocol, is a standardized method of delivering different types of media (usually video or audio) over the internet.

Essentially, a video streaming protocol sends “chunks” of content from one device to another. It also defines the method for “reassembling” these chunks into playable content on the other end.
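
To make those “chunks” concrete, here is what a simplified media playlist from HLS (one of the protocols covered below) might look like; the segment names and durations are illustrative:

  #EXTM3U
  #EXT-X-VERSION:3
  #EXT-X-TARGETDURATION:6
  #EXT-X-MEDIA-SEQUENCE:100
  #EXTINF:6.0,
  segment100.ts
  #EXTINF:6.0,
  segment101.ts
  #EXTINF:6.0,
  segment102.ts

The player downloads this playlist over HTTP, fetches each listed segment, plays them back in order, and keeps refreshing the playlist to discover new segments as the live stream continues.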

This points to one important aspect of streaming protocols: both the broadcasting source and the viewer’s playback device have to support the protocol in order for it to work.

For example, if you’re sending a stream in MPEG-DASH, but the video player on the device to which you’re streaming doesn’t support MPEG-DASH, your stream won’t work.

For this reason, standardization is important. There are currently a few major media streaming protocols in widespread use, which we’ll look at in detail in a moment. Six common protocols include:

  1. HTTP Live Streaming (HLS)
  2. Real-Time Messaging Protocol (RTMP)
  3. Secure Reliable Transport (SRT)
  4. Microsoft Smooth Streaming (MSS)
  5. Dynamic Adaptive Streaming over HTTP (MPEG-DASH)
  6. Web Real-Time Communication (WebRTC)

Before we dive into the specific protocols, let’s clear up some potential confusion in the realm of live streaming protocols and codecs.

Live Streaming Protocol vs. Codec

“Codec” is a word that comes up often in the world of live streaming, and at first glance, its definition seems similar to that of a streaming protocol. However, a live streaming protocol is different from a video codec.

Codec stands for “coder/decoder.” It is a tool for making video files smaller. Raw video is made up of many still images played quickly in sequence (typically 30 frames per second). Now, imagine storing thirty two-megapixel photos for every single second of video. That’s a lot of storage space, and it is where a codec comes into play.

The solution for saving space is compression, which uses mathematical algorithms to discard data that isn’t very important. For example, if a corner of the video is black, and remains black for a few seconds, you can toss the individual pixel data and just include a reference instead.

Once the file has arrived at its destination, it is decompressed so that the video can play as normal. In live streaming, this compression and decompression happens in real time. That is the job of a video codec: it is essentially a compression tool, while the streaming protocol handles delivery.
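
To put rough numbers on the savings: one uncompressed 1080p frame is about 1920 × 1080 pixels × 3 bytes of color data, or roughly 6 MB, so 30 frames per second works out to around 180 MB, close to 1.5 gigabits, every single second. A typical H.264 live stream delivers a comparable 1080p picture at roughly 3-6 Mbps, a reduction of a few hundred times.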

Streaming Protocols vs. Video Streaming Formats

Another source of potential confusion is the video streaming format. This refers to the “container” or “package” that’s used for video transmission. A container format usually contains compressed video, compressed audio, and metadata such as subtitles, timing info, and so forth.

This data is transmitted via a streaming protocol. The transport format defines how the content is stored within the individual chunks of data as they are streamed. Common transport formats, or containers, for streaming video include fragmented MP4 (fMP4) and MPEG-TS.
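
To see how the layers fit together, here is a rough sketch (filenames and settings are illustrative) using the open-source ffmpeg tool: the video and audio are compressed with codecs, packaged into MPEG-TS segments, and described by an HLS playlist that is then delivered over HTTP.

  # Compress with the H.264 video codec and the AAC audio codec (the codec layer),
  # then package the output as 6-second MPEG-TS segments plus an HLS playlist
  # (the container and protocol layers).
  ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls -hls_time 6 stream.m3u8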

6 Common Streaming Protocols

Different video streaming protocols are used for different use cases. Certain streaming protocols are better suited to some streaming setups than others, so the best protocol for live streaming depends on the situation.

There are six common streaming protocols that professional broadcasters should be familiar with: HLS, RTMP, SRT, MSS, MPEG-DASH, and WebRTC. Let’s take some time to explore the background and technical requirements of each.

1. HTTP Live Streaming (HLS)

The popularization of Apple products demanded an iOS-compatible protocol.

HLS stands for HTTP Live Streaming. It is a protocol developed by Apple, and today it is the most widely used streaming protocol on the internet. That wasn’t always the case: when Flash was still around, the top streaming protocol was RTMP.

HLS is an adaptive bitrate protocol and also uses HTTP servers. This protocol is an evolving specification, as Apple continually adds features and regularly improves HLS.

Here are a couple of examples of how Apple has improved HLS in recent years:

  • Performance: When compared to a streaming protocol like DASH, HLS had a few shortcomings in the past. Notably, DASH was arguably able to deliver better quality streams than HLS, but that’s no longer the case.
  • Resolution: DASH was previously able to support videos with higher resolution than HLS. Now, HLS supports 4K video resolution, so HLS is no longer at a disadvantage compared to DASH on this front either.

Despite improving on their past shortcomings, Apple has not yet been able to fix the latency issue associated with HLS. The HLS protocol has a relatively high latency compared to RTMP, for example. As mentioned, though, Apple is constantly working on HLS and has even come out with Low-Latency HLS.

Low-Latency HLS

Low-Latency HLS is an extension of the HLS protocol that can bring latency down to 2 seconds or less. That is a big improvement over the 15-30 seconds of latency generally associated with standard HLS live streams.

Unfortunately for Apple, this extension hasn’t grown in popularity as quickly as Apple would like, and despite several attempts to speed up adoption, vendor support is still lacking across the video delivery ecosystem.

HLS is one of the protocols that Dacast uses. Dacast has also added support for HLS ingest, which is still relatively new. Keep in mind that very few streaming platforms at the moment support HLS ingest.

Video Codecs Supported:

  • H.264
  • H.265 / HEVC

Audio Codecs Supported:

  • AAC
  • MP3

Transport/Package Format:

  • MPEG-2 TS

Playback Support:

  • iOS and macOS devices
  • Safari, Chrome, Firefox, and Edge web browsers
  • Many set-top boxes, such as Roku
  • Many online video players, such as JW Player and the Dacast all-device video player

Segment Duration:

  • 10 seconds (can be manually reduced as part of reducing latency)

If you want to connect with viewers who use Apple devices, HLS streaming is one of the best protocols for live streaming.
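
On the playback side, Safari and iOS play HLS natively, while other browsers typically use a JavaScript player library built on Media Source Extensions. Here is a minimal sketch using the open-source hls.js library; the manifest URL is a placeholder for your own stream:

  import Hls from "hls.js";

  const video = document.querySelector("video") as HTMLVideoElement;
  const src = "https://cdn.example.com/live/stream.m3u8"; // placeholder manifest URL

  if (Hls.isSupported()) {
    // Most browsers: hls.js fetches the playlist and segments and feeds them
    // to the <video> element through Media Source Extensions.
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari and iOS support HLS natively, so the manifest can be set as the source.
    video.src = src;
  }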

2. Real-Time Messaging Protocol (RTMP)

The RTMP protocol sends video files from the encoder to the online video platform.

Real-Time Messaging Protocol (RTMP) was developed by Macromedia (later acquired by Adobe) primarily to deliver video to the Adobe Flash Player, which, as you already know, is now dead.

To understand the popularity of RTMP as a delivery protocol, consider that at one point, Adobe Flash Player was installed on about 99% of desktop computers in Western markets. RTMP was heavily used for many years.

Because RTMP and Flash worked so closely together, many people now treat the two terms as interchangeable, but they’re not. Flash is dead; RTMP isn’t. Instead, it lives on with a new use case now that HTML5 has replaced Flash.

RTMP has limited playback support nowadays. Instead, RTMP is now used for ingestion from the encoder to the online video platform.

RTMP ingest allows users to tap into the support of low-cost RTMP encoders. Much of the online video streaming industry, including leading streaming software and OVPs, is still compatible with RTMP ingest.

When paired with HLS delivery, RTMP ingest produces a relatively low-latency stream, and that low latency is a top reason RTMP ingest has remained popular. The other big reason is compatibility: HLS ingest, for example, is still not widely supported by streaming services.
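
As a concrete illustration of RTMP ingest, here is a rough sketch of pushing a file to an RTMP endpoint with the open-source ffmpeg tool; the ingest URL and stream key are placeholders that your streaming platform would provide:

  # -re reads the input at its native frame rate, simulating a live source.
  # The stream is encoded to H.264/AAC and wrapped in FLV, the format RTMP expects.
  ffmpeg -re -i input.mp4 -c:v libx264 -preset veryfast -c:a aac \
    -f flv rtmp://live.example.com/app/YOUR_STREAM_KEY

In practice, most broadcasters use software such as OBS Studio or a hardware encoder, which performs the same RTMP handshake under the hood.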

Video Codecs Supported:

  • H.264 (x264 is a widely used open-source H.264 encoder)

Audio Codecs Supported:

  • AAC (including the AAC-LC profile)

Transport/Package Format:

  • Not applicable: RTMP delivers audio and video as a continuous series of small messages (based on the FLV format) over a persistent TCP connection rather than in discrete segments.

Playback Support:

  • Flash Player
  • Adobe AIR
  • RTMP-compatible players

Segment Duration:

  • Not applicable (RTMP is not a segment-based protocol)

If you need a low-latency stream, with minimal delay between capture and playback, RTMP ingest is one of the best options to use.

3. Secure Reliable Transport (SRT)

SRT is a new, innovative streaming protocol.

Secure Reliable Transport (SRT) is a relatively new streaming protocol from Haivision, a leading player in the online streaming space. SRT is an open-source protocol that is likely the future of live streaming. This video streaming protocol is known for its security, reliability, and low latency streaming.

SRT is still somewhat ahead of its time because there are compatibility limitations. The protocol itself is open source, but much of the surrounding streaming hardware and software ecosystem has yet to add support for it.

Haivision has created the SRT Alliance, a group of technology and telecommunications companies dedicated to driving SRT adoption in the live streaming space. Currently, the best way to access SRT is to use technology built or backed by SRT Alliance members.
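
If you want to experiment with SRT today, one accessible route is ffmpeg, assuming your build includes SRT support (libsrt); the address below is a placeholder for an SRT-capable ingest endpoint:

  # Re-stream a local file as an MPEG-TS flow over SRT in caller mode.
  ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac \
    -f mpegts "srt://ingest.example.com:9998?mode=caller"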

Video Codecs Supported:

  • SRT is media and content agnostic, so it supports all video codecs.

Audio Codecs Supported:

  • SRT is media and content agnostic, so it supports all audio codecs.

Transport/Package Format:

  • SRT is media and content agnostic, so it supports all transport and package formats.

Playback Support:

  • Haivision does not specify playback support for SRT.

Segment Duration:

  • Haivision does not specify segment duration for SRT.

If you want to be on the cutting edge of video streaming protocols, you may want to consider adopting SRT.

4. Microsoft Smooth Streaming (MSS)

MSS or Microsoft Smooth Streaming is an older streaming protocol with broad playback support.

Before we dive deep into Microsoft Smooth Streaming (MSS), you should know that as of 2022 it is no longer in use. We still think it’s worth covering, though, because it shows that no protocol is bulletproof, even with a big name like Microsoft behind it.

MSS is a streaming protocol that Microsoft developed in 2008 to meet early needs for adaptive bitrate streaming. This video streaming protocol was known for being cost-effective, reducing buffering, and offering optimized performance.

Microsoft Smooth Streaming is what allowed you to stream content on the Xbox 360, in Silverlight-based web players, on Windows Phone 7, and on a few other connected TV platforms back in the day. It was also the streaming protocol behind NBC’s online coverage of the 2008 Summer Olympics.

Deploying Smooth Streaming used to require Silverlight, Microsoft’s proprietary developer plugin framework. However, Microsoft Silverlight was discontinued late in 2021. One strength of Smooth Streaming was support for PlayReady DRM to thwart piracy.

Despite the failure of MSS, Microsoft remains involved in other protocols, such as MPEG-DASH. Although MSS was promising in its early days, it depended entirely on Silverlight, and when Silverlight was retired, MSS came crashing down with it.

Video Codecs Supported:

  • H.264
  • VC-1

Audio Codecs Supported:

  • AAC
  • WMA

Transport/Package Format:

  • MP4 fragments

Playback Support:

  • Browsers with the Silverlight plugin
  • Xbox
  • Windows Phone
  • iOS devices
  • Windows computers
  • Many Smart TVs

Segment Duration:

  • 2-4 seconds

Since MSS has been retired along with Silverlight, treat it as a piece of streaming history rather than an option for new broadcasts.

5. Dynamic Adaptive Streaming over HTTP (MPEG-DASH)

MPEG-DASH is the live streaming protocol of the future.

The fifth protocol in our review is MPEG-DASH. It is one of the newest streaming protocols, and it is beginning to see broader adoption.

Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, uses standard HTTP web servers. This reduces the cost and technical difficulty of implementation compared to legacy methods of streaming like RTP.

MPEG-DASH is also an adaptive bitrate (ABR) protocol. This means it will automatically detect changes in the internet connection speed of the viewer and serve the best available quality video at any given time. ABR streaming reduces buffering and enhances the viewers’ experience.

It is also important to note that MPEG-DASH is an open standard that isn’t controlled by any one company. It was developed as a joint effort among more than 50 organizations, including big names such as Apple and Microsoft.

Although most web browsers support MPEG-DASH, a big downside to consider is that iOS and Safari don’t support it natively and might never do so. Considering the popularity of Apple devices, this has huge implications.

Video Codecs Supported:

  • H.264 (the most common codec)
  • H.265 / HEVC (the next-generation successor)
  • VP8 and VP9 (typically carried in the WebM container)
  • Any other codec (MPEG-DASH is codec agnostic)

Audio Codecs Supported:

  • AAC
  • MP3
  • Any other codec (MPEG-DASH is codec agnostic)

Transport/Package Format:

  • MP4 fragments
  • MPEG-2 TS

Playback Support:

  • Native support on Android devices
  • Plays back on most Samsung, LG, Philips, Panasonic, and Sony TVs made after 2012
  • Works on Chromecast
  • Supported on YouTube and Netflix
  • Not natively supported by HTML5 video, but players can be implemented in JavaScript via Media Source Extensions

Segment Duration:

  • Variable

Not all viewers have the same internet connection, so when you’re trying to reach a large audience, you need to deliver your video at multiple quality levels. That is exactly what adaptive bitrate streaming provides, and MPEG-DASH supports it, making it one of the best streaming protocols for serving every viewer a video that meets their needs.
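
As noted in the playback list above, browsers don’t play DASH natively in the video tag, but a JavaScript player built on Media Source Extensions fills the gap. Here is a minimal sketch using the open-source dash.js reference player; the manifest URL is a placeholder:

  import * as dashjs from "dashjs";

  // Attach the dash.js player to a <video> element and start playback of a
  // (placeholder) .mpd manifest; the third argument enables autoplay.
  const video = document.querySelector("video") as HTMLVideoElement;
  const player = dashjs.MediaPlayer().create();
  player.initialize(video, "https://cdn.example.com/live/stream.mpd", true);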

6. WebRTC

WebRTC is a free and open-source project providing web browsers and mobile applications with real-time communication (RTC) via application programming interfaces (APIs).

Web Real-Time Communication (WebRTC) is relatively new compared to the others on our list and technically not considered a streaming protocol, but often talked about as though it is. It is what’s largely responsible for your ability to participate in live video conferences directly in your browser.

WebRTC gained a lot of popularity during the pandemic because it was made with the intention of supporting web conferencing and VoIP. Microsoft Teams, which exploded in popularity during the pandemic, uses WebRTC for both audio and video communications.

WebRTC supports adaptive bitrate streaming much as HLS and MPEG-DASH do. Like HLS, WebRTC can rely on live transcoding to produce multiple bitrate variants so that viewers on both weak and strong connections can enjoy the stream. WebRTC has a bright future ahead.
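
To give a feel for how WebRTC differs from the HTTP-based protocols above, here is a minimal sketch of the browser side of a session. Signaling, i.e. exchanging the offer/answer and ICE candidates with the remote peer, is deliberately left to the application and is omitted here; the STUN server URL is a commonly used public example.

  // Capture the local camera and microphone and offer them over a peer connection.
  async function startCall(): Promise<void> {
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
    });

    const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));

    // Create an SDP offer describing the media; pc.localDescription would then be
    // sent to the remote peer over your own signaling channel to complete the handshake.
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
  }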

Video Codecs Supported:

  • H.264
  • VP8 + VP9

Audio Codecs Supported:

  • G.711 (PCMU and PCMA)
  • G.722
  • Opus

Playback Support:

  • Native support on Android devices
  • Safari 11 and newer versions, including iOS Safari, support WebRTC
  • Works on Google Chrome, Mozilla Firefox, and Microsoft Edge
  • Supported by YouTube and Google

Segment Duration:

  • Not applicable

Final Thoughts

By now, you should have a better understanding of live streaming. Whether you’re a veteran or a newcomer, a working knowledge of protocols, codecs, and container formats, as outlined in this article, will help you choose the best live streaming protocol for your needs.

Each streaming protocol comes with its own set of pros and cons. Which protocol you use will largely depend on who you’re trying to reach and the devices they use: in other words, the needs of your audience.

We believe HLS is currently the best protocol for most live video streaming use cases. That’s why it’s our default protocol here at Dacast: we chose it because we want the very best for our customers.

Not yet a Dacast user but interested in what we offer? Say no more. You can try our professional solutions for free with our 14-day risk-free trial. During the trial, you’ll have complete access to live streaming, secure video upload, on-demand content hosting, and more.

Try Dacast for free

Technology is always evolving, and we’ll surely be using different methods in the future. For exclusive offers and regular updates on video streaming, please join our LinkedIn group.

Emily Krings

Emily is a strategic content writer and storyteller. She specializes in helping businesses create blog content that connects with their audience.