like Android or PC web players will not be able
to play it. We recommend creating a separate
application for fMP4 delivery if you have different target platforms for your streams. This
may change in the future, but for now
you need to account for it.
Accordingly, in this tutorial we'll discuss how to create a fragmented MP4 stream for HEVC-capable Apple devices and a legacy stream for older Apple and
non-Apple endpoints.
When producing a live event, HEVC provides valuable
bandwidth savings in the first mile and the last mile,
so if you can get your hands on an encoder with HEVC
output, go for it. In our tests, we used Vitec’s MGW Ace
HEVC Encoder, which can also output H.264 if needed.
Interestingly, you can't push HEVC using the RTMP protocol, so you'll have to use UDP or some other protocol.
Although it's not rocket science, this requires a bit more
network knowledge than using the RTMP protocol.
As an example, we originally planned to use port
265 for the UDP stream, but the Vitec encoder didn’t
support ports below 1200 or so, so we changed to port
26500. Configuring this wasn’t particularly challenging, but I was glad I wasn’t under the gun when I discovered the problem. If you’ll
be using a new encoder for this
project, or a new protocol, budget plenty of time to get issues
like this resolved.
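If you want to sanity-check a candidate port before committing it in both the encoder and server configurations, a quick bind test on the receiving machine will flag ports that are already in use or restricted by local policy. This is an illustrative sketch of our own, not part of Nimble's or Vitec's tooling:

```python
import socket

def udp_port_is_bindable(port: int, host: str = "0.0.0.0") -> bool:
    """Try to bind a UDP socket on the given port; True if it succeeds.

    Note: on most systems, ports below 1024 require elevated privileges,
    so a low port like 265 would fail here for an unprivileged user.
    """
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.bind((host, port))
        return True
    except OSError:
        return False

# 26500 is the port we eventually settled on for the Vitec encoder's UDP output.
print(udp_port_is_bindable(26500))
```

Running a check like this before the event starts is exactly the kind of five-minute test that would have caught our port problem ahead of time.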
Step 1: Set Up Global Options
When Apple added HEVC to HLS, the company specified that you can only use the fragmented MP4 format for the streams, not MPEG-2 transport streams. To accomplish this in Nimble, you create what the service calls an application, and set "HLS(FMP4)" as a supported protocol. You reach the screen shown in Figure 1 by clicking Nimble Streamer in the top menu and choosing "Live streams settings," then choosing the Applications tab on the left.
In addition to the HLS(FMP4) protocol, you should also select the "Softvelum Low Delay protocol" (SLDP), since it's faster and easier to preview your streams via this protocol (you can read more about it at go2sm.com/sldp). You can see that we have both protocols selected on the bottom of Figure 1.
You can also see that we've set the chunk duration to 6 seconds and the chunk count to four, which means that the server won't publish the playlist until it has four chunks. These settings favor playback stability and encoding efficiency more than low latency, so if time-to-live latency is an issue for you, you may want to be more aggressive. For example, changing to a chunk duration of 2 seconds will reduce latency from 24 seconds (6 seconds x four chunks) to 8 seconds (2 seconds x four chunks).
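The arithmetic generalizes: with the playlist held back until the full chunk count is buffered, minimum startup latency is roughly chunk duration times chunk count. A quick sketch (the function name is ours, not part of Nimble):

```python
def hls_startup_latency(chunk_duration_s, chunk_count):
    """Rough minimum latency before playback can start: the server
    must buffer chunk_count chunks of chunk_duration_s seconds each."""
    return chunk_duration_s * chunk_count

# The settings from Figure 1: 6-second chunks, four-chunk playlist.
print(hls_startup_latency(6, 4))  # 24 seconds
# A more aggressive 2-second chunk duration with the same count:
print(hls_startup_latency(2, 4))  # 8 seconds
```

This is only the playlist-publishing component; real-world latency also includes encoding, network transit, and player buffering on top of it.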
On top of Figure 1, you see we have two applications running: the fMP4 application we're editing in the figure, and "leg" (which stands for legacy), the application we'll use to create HLS streams for legacy devices. In that application, we selected the HLS and SLDP protocols.
When working with Nimble, clicking the wrench always opens your configuration options, which is intuitive. Less intuitive is the question mark, which opens publishing URLs for each application in this window, and opens preview windows in most other screens.
Step 2: Set Up for Input and Output
In terms of high-level workflow within Nimble
Streamer, you must first input the stream, then route
Figure 1. Click the wrench to set application settings. Here we're choosing the fMP4 application for fragmented MP4 files.