Video Streaming Protocols: Which to Use for Professional Broadcasting

When you get started with live streaming, you’ll notice an abundance of acronyms that serve many different purposes. There’s RTMP, HLS, HDS, and more.

Many of these acronyms relate to different video streaming protocols. Basically, protocols are technical processes that facilitate the transfer of data from one program to another. In streaming, this means the transfer of your video files to and from your encoder, streaming host, and eventually, the video player where your audience views your stream.

Today, we’re going to identify some of the most common streaming protocols you’ll encounter, what they do, and when you should use them. To provide some helpful background, we’ll also explain the relationship between a codec and a container format.

Are you ready to dive into live streaming protocols?

Please note that this post has been updated to reflect the latest developments in video streaming protocols as of November 2020.

Table of Contents

  • What is a Video Streaming Protocol?
  • Streaming Protocol vs. Codec vs. Container Format
  • Common Video Streaming Protocols
  • Video Protocols for Professional Live Streaming
  • Conclusion

What is a Video Streaming Protocol?

A video streaming protocol is necessary for live broadcasting.

Before we go any further, let’s take a closer look at the definition of a video streaming protocol. Most digital video is designed for two things: storage and playback. This leads to two major considerations, namely small file size and universal playback.

Most video files aren’t designed for streaming, so streaming a video means first converting it into a streamable form by breaking it up into small chunks. These chunks then arrive sequentially and play back as they’re received. If you’re streaming live video, the source video comes straight from a camera. For VOD content, it comes from a file instead.

A video streaming protocol is a standardized delivery method for breaking up a video into chunks, sending it to the viewer, and reassembling it.
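
To make this concrete, here is a minimal, purely illustrative Python sketch of the chunking idea. The file name and chunk size are hypothetical, and real protocols segment by playback duration and add their own packaging, but the sequential-delivery principle is the same.

```python
# Purely illustrative: split a source video into small, ordered chunks,
# the way a streaming protocol does before delivery.
CHUNK_SIZE = 2 * 1024 * 1024  # ~2 MB per chunk (hypothetical size)

def read_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield the video file as an ordered sequence of small chunks."""
    with open(path, "rb") as source:
        while True:
            chunk = source.read(chunk_size)
            if not chunk:
                break
            yield chunk

# The player (or CDN) receives the chunks in order and plays them as they arrive.
for index, chunk in enumerate(read_chunks("recorded_talk.mp4")):  # hypothetical file
    print(f"delivering chunk {index}: {len(chunk)} bytes")
```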

Streaming protocols can get much more complex. Many are “adaptive bitrate” protocols, for example. This technology delivers the best quality that a viewer’s connection can support at any given time.

Some protocols focus on reducing latency, or the delay between an event happening in real life and the moment it plays on the viewer’s screen. Some protocols only work on certain systems, and others focus on digital rights management (DRM).

As we move through some specific protocols, we will put these and other characteristics into perspective. 

Streaming Protocol vs. Codec vs. Container Format

Protocols, codecs, and container formats are each separate facets of streaming.

One common source of confusion in the realm of streaming video is the difference between a protocol and a codec.

Simply put, the term “codec” refers to video compression technology. Logically, different codecs are used for different purposes. For example, Apple ProRes is often used for video editing and H.264, the most common video codec, is widely used for online video.

As with codec, the term “format” can also be confusing in the context of video streaming protocols. In many cases, format simply refers to the container format of a video file. Common container formats include .mp4, .m4v and .avi. 

In essence, a container format functions like a “box” that usually contains a video file, an audio file, and metadata. However, container format isn’t as central a concept for live streamers.

Let’s use an analogy to make it easier to understand the relationship between a codec, a container format, and a streaming protocol.

Imagine that you’re a merchant transporting clothing in bulk (the clothing represents the video content). The codec is the machine that compresses the clothing into bundles to save space. The container format is the boxcar those bundles are packed inside. The streaming protocol is analogous to the railroad tracks, signals, and crew that deliver the boxcars to their destination.

As a broadcaster, you want your live video content to function in concert with a codec, container format, and streaming protocol.

It’s also important to note that most streaming protocols only support certain codecs. We’ll get more into this later.

Common Video Streaming Protocols

Now that you have a better idea of the purpose of video streaming protocols, let’s start our comparison of the most common video streaming protocols today. 

In this comparison, we’ll also offer use cases for each protocol whenever possible.

1. Real-Time Messaging Protocol (RTMP)

The purpose of RTMP has evolved in recent years.

First up is the veteran protocol: RTMP. Originally developed by Macromedia in the early days of streaming, RTMP is still widely used. 

Today, RTMP is mostly used for ingesting live streams. In plain terms, when you set up your encoder to send your video feed to your video hosting platform, that video reaches the CDN via RTMP. The content then reaches the end viewer over another protocol, usually HLS.

RTMP is rarely used as a viewer-facing video streaming protocol like it once was. That’s because it’s dependent on the Flash plugin, which has been plagued with security problems for years and is rapidly becoming obsolete.

Who Should Use RTMP?

RTMP is a streaming protocol that provides very low-latency streams. However, because it requires the Flash plugin for playback, we do not recommend it for delivery. Again, the exception is stream ingestion: for this purpose, RTMP is still one of the best options. It’s robust and almost universally supported.
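
For illustration, here is how pushing a stream to an RTMP ingest endpoint might look using FFmpeg driven from Python. This is a hedged sketch: FFmpeg must be installed, and the ingest URL and stream key are hypothetical placeholders that your streaming platform would supply.

```python
# Illustrative only: push a local file to an RTMP ingest endpoint with FFmpeg.
import subprocess

INGEST_URL = "rtmp://live.example.com/live"   # hypothetical ingest server
STREAM_KEY = "abcd-1234-efgh-5678"            # hypothetical stream key

subprocess.run([
    "ffmpeg",
    "-re",                      # read input at its native frame rate (live-like pacing)
    "-i", "recorded_talk.mp4",  # source file (a camera or capture device also works)
    "-c:v", "libx264",          # H.264 video, widely accepted for RTMP ingest
    "-c:a", "aac",              # AAC audio
    "-f", "flv",                # RTMP expects an FLV container
    f"{INGEST_URL}/{STREAM_KEY}",
], check=True)
```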

2. Real-Time Streaming Protocol (RTSP)

Perhaps a lesser-known video streaming protocol, Real-Time Streaming Protocol (RTSP) was first published in 1998. RTSP was developed specifically to control streaming media servers in entertainment and communications systems.

In 2016, an updated RTSP 2.0 became available. Overall, it is known as a video streaming protocol for establishing and controlling media sessions between endpoints.

RTSP is similar in some ways to the HTTP Live Streaming (HLS) protocol, which we’ll cover below. However, RTSP does not transmit the streaming data on its own. Instead, RTSP servers typically work in conjunction with the Real-Time Transport Protocol (RTP) and the Real-Time Control Protocol (RTCP) to deliver media streams.

Who Should Use RTSP?

RTSP was designed to support low-latency streaming and is a good choice for use cases such as IP camera feeds (e.g., security cameras), IoT devices (e.g., a laptop-controlled drone), and mobile SDKs.

One significant drawback, however, is that there is limited native browser support for RTSP.
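
As a rough example of the IP-camera use case, here is a short Python sketch that pulls frames from an RTSP feed with OpenCV (opencv-python). The camera URL and credentials are hypothetical.

```python
# Illustrative only: read and display frames from an RTSP camera feed.
import cv2

RTSP_URL = "rtsp://user:password@192.168.1.20:554/stream1"  # hypothetical camera

capture = cv2.VideoCapture(RTSP_URL)
if not capture.isOpened():
    raise RuntimeError("Could not open the RTSP stream")

while True:
    ok, frame = capture.read()  # grab the next decoded frame
    if not ok:
        break                   # stream ended or the connection dropped
    cv2.imshow("IP camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop viewing
        break

capture.release()
cv2.destroyAllWindows()
```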

3. Dynamic Adaptive Streaming over HTTP (MPEG-DASH): An Up-and-Coming Protocol

Adaptive streaming capabilities are extremely valuable to professional broadcasters.

At the opposite end of the spectrum, we have MPEG-DASH, one of the newest protocols on the scene. While it’s not widely used yet, this protocol has some big advantages. 

First, it supports adaptive-bitrate streaming. That means viewers always receive the best video quality their current internet connection speed can support. Connection speeds tend to fluctuate from second to second, and DASH can keep up.
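
To illustrate the adaptive-bitrate idea, here is a simplified Python sketch of how a player might pick a rendition from a bitrate ladder. The ladder below is hypothetical, and real players also weigh buffer health, screen size, and codec support.

```python
# Purely illustrative: how an adaptive-bitrate player picks a rendition.
RENDITIONS = [            # (label, required bandwidth in kbps), best first
    ("1080p", 5000),
    ("720p", 2800),
    ("480p", 1400),
    ("360p", 800),
]

def pick_rendition(measured_kbps):
    """Return the highest-quality rendition the measured bandwidth can sustain."""
    for label, required_kbps in RENDITIONS:
        if measured_kbps >= required_kbps:
            return label
    return RENDITIONS[-1][0]  # fall back to the lowest rung

print(pick_rendition(3200))  # -> "720p"
print(pick_rendition(600))   # -> "360p"
```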

MPEG-DASH fixes some long-standing technical issues with delivery and compression. Another advantage is that MPEG-DASH is “codec agnostic,” meaning it can be used with almost any encoding format. It also supports Encrypted Media Extensions (EME) and Media Source Extensions (MSE), the standards-based browser APIs behind in-browser adaptive playback and digital rights management (DRM).

Who Should Use MPEG-DASH?

These days, MPEG-DASH is only being used by a fraction of professional broadcasters as compared to HLS. However, we believe that it will be the standard technology in the future. 

The reason this protocol is not yet widely popular comes down mostly to compatibility (e.g., Apple Safari and iOS devices do not support it natively) and other related issues.

4. Microsoft Smooth Streaming (MSS)

Next up is Microsoft’s Smooth Streaming (MSS) protocol. Originally introduced in 2008, MSS was integral to streaming that year’s Summer Olympics. However, its popularity has dropped off, except among Microsoft-focused developers and those working in the Xbox ecosystem.

Smooth Streaming supports adaptive-bitrate streaming and includes some robust tools for DRM. Overall, it’s a hybrid media delivery method that functions like streaming, yet is based on HTTP progressive download.

Who Should Use Smooth Streaming?

Unless your main target audience is Xbox users or you plan to build Windows-specific apps, we don’t recommend using MSS as a primary video streaming protocol.

5. HTTP Dynamic Streaming (HDS)

HDS is the least recommended streaming protocol.

Adobe’s entry into the streaming protocol world is HTTP Dynamic Streaming (HDS), the successor to RTMP. Like RTMP, HDS is a Flash-based streaming protocol. However, it adds support for adaptive streaming and has a reputation for high quality.

HDS also performs relatively well when it comes to latency, though not as well as RTMP: the fragmentation and encryption process adds delay, which makes HDS less popular for streaming sports and other events where seconds matter.

Who Should Use HDS?

Generally, we don’t recommend that you use HDS. In recent years, Flash support has become too weak for any broadcaster to rely on this technology to reach its audience. In short, building your web video around the Flash player is simply a poor choice today.

6. HTTP Live Streaming (HLS)

The HLS protocol, or HTTP Live Streaming, was developed by Apple, and has support for media players, web browsers, mobile devices, and media servers.

The final video streaming protocol we’ll discuss is HTTP Live Streaming, or HLS. Apple originally released this protocol in 2009 so that it could drop Flash from iPhones. Since then, HLS has become the most widely used streaming protocol.

There are several reasons for this. First, desktop browsers, smart TVs, and both Android and iOS mobile devices all support HLS. HTML5 video players also support HLS natively, unlike HDS and RTMP.

This allows a stream to reach as many viewers as possible, making HLS the safest protocol today for scaling a live stream to large audiences. For example, you can use this protocol to stream live video on your website with a simple embed code.

As far as features, the HLS standard also supports adaptive-bitrate streaming, dynamically delivering the best possible video quality at any moment. With recent updates, the standard now supports the newer H.265 codec, which can deliver video quality comparable to H.264 at roughly half the file size.
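
For a sense of what this looks like in practice, here is a minimal sketch of an HLS master playlist (the .m3u8 file a player downloads first), written out by a short Python script. The rendition paths and bandwidth figures are hypothetical; in practice, your encoder or streaming platform generates this file for you.

```python
# Illustrative only: write a minimal HLS master playlist listing three
# hypothetical renditions for adaptive-bitrate playback.
MASTER_PLAYLIST = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=854x480
480p/index.m3u8
"""

with open("master.m3u8", "w") as playlist:
    playlist.write(MASTER_PLAYLIST)

# The player downloads master.m3u8 first, then switches between the
# rendition playlists as the viewer's bandwidth changes.
```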

Currently, the only downside of HLS is that latency can be relatively high. However, there are methods for reducing HLS latency.

Who Should Use HLS?

HLS is the most widely-used protocol today because it’s robust and effective. For example, we know that few viewers will return to a site during a stream if they experience a video failure. Using a widely-compatible, adaptive protocol like HLS will deliver the best possible audience experience.

We’d also like to mention that HLS is now the default streaming protocol used at Dacast.

Video Protocols for Professional Live Streaming

Do you know which video protocols are best for professional live streaming?

To recap, there are many video streaming protocols in existence today, and many of these can be used for live video streaming. 

As we covered above, all of the protocols discussed here—RTMP, RTSP, MPEG-DASH, MSS, HDS, and HLS—have specific use cases for specific broadcasters. However, taking everything into account, HLS comes out on top, especially in terms of codec compatibility, all-device compatibility, native HTML5 video player support, and adaptive-bitrate streaming capabilities.

Our takeaway recommendation here is simple: for now, almost all broadcasters should stick to using the HLS video streaming protocol.

Of course, some users may find other protocols better suited to their needs. However, whether you want to stream live video on your website, stream sports events, or broadcast professional events and gatherings live, HLS is generally the best way to go.

Remember, MPEG-DASH is an up-and-coming option. Look for growing adoption of that streaming protocol in the near future.

Conclusion

Live streaming protocols make the transmission of video files possible and are an integral part of any online video platform.

Although streaming protocols and related technology are a bit complex, they are totally approachable when broken down into smaller, digestible ideas.

We hope this post has helped clarify the purpose of video streaming protocols and the relationship between streaming protocols, codecs, and container formats. We trust that you are now equipped to choose and use the right protocol for your needs.

To test HLS streaming on the Dacast platform, we invite you to sign up for our 30-day free trial. That way, you can get acquainted with the features before you commit.

GET STARTED FOR FREE

Any questions? Let us know by leaving a comment below! We have experience with most kinds of live video streaming protocols, so we can probably help no matter what issues you’re experiencing. For exclusive offers and regular live streaming tips, join our LinkedIn group.

Thanks for reading and happy streaming!
