
HTTP/2 in Nimble Streamer

The Internet as we know it was created on top of HTTP versions 1.0 and 1.1, with HTTP/1.1 being dominant for the last 20 years. The growing demand for new features and capabilities revealed several shortcomings in it, and they were addressed in HTTP/2, which has been developed and adopted as a successor to 1.1. You can read the Introduction to HTTP/2 by the Google team to see what exactly HTTP/2 has to offer.

At the moment HTTP/2 is supported in all modern browsers: every time a user tries to reach some resource on the Internet, the browser first tries to connect through this new protocol version. If it succeeds, all further communication is performed via this new channel. If HTTP/2 is unavailable, the browser falls back to HTTP/1.1.

The Softvelum team has implemented HTTP/2 for some of Nimble Streamer's HTTP-based output protocols to provide our customers with one more advantage for their end-users and to establish the ground for further development.

HTTP/2 in Nimble Streamer


Nimble Streamer now supports the HTTP/2 protocol in the most popular live streaming scenarios. No change to input streams is required; you only need to enable the feature as described in the "Enabling HTTP/2" section below.

HLS

HTTP Live Streaming (HLS) by Apple is a de-facto standard for end-user media consumption, so we implemented full live streaming support over HTTP/2:

  • Live HLS streams with fMP4/CMAF, MPEG-TS and audio-only containers are fully supported.
  • Ad insertion in live HLS via Nimble Advertizer works as usual.
  • HLS DVR output works fine with all of its features.

MPEG-DASH

Live MPEG-DASH streams can also be played via HTTP/2. Both HLS and DASH output can be generated from the same live input, so with HTTP/2 enabled you get both outputs through it.


Other protocols

At the moment, live streaming is supported via HTTP/2 only for the aforementioned protocols. HTTP re-streaming, VOD, Icecast and HTTP MPEG-TS are processed only via HTTP/1.1.

Enabling HTTP/2


HTTP/2 can be used only when Nimble Streamer streams over HTTPS, so in order to make it process HTTP/2 requests, you need to do the following.


After that you'll be able to use HTTP/2 to reach live streams with HLS and MPEG-DASH protocols enabled.
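To quickly check that HTTP/2 is actually negotiated, you can request a playlist with curl. This is a minimal sketch; the host and stream path are placeholders for your own setup:
curl -kv --http2 https://yourserver:443/live/stream/playlist.m3u8
The verbose output will show the negotiated protocol, and the response line will read "HTTP/2 200" instead of "HTTP/1.1 200".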

LiteSpeed HPACK library


Nimble Streamer uses the LS-HPACK library for encoding and decoding HTTP headers with the HPACK compression mechanism.

The Softvelum team would like to say thank you to the LiteSpeed team; we appreciate their effort and technical expertise. We've made several contributions to the LS-HPACK code base and we plan to continue that support as we move forward in our HTTP/2 development.



Let us know if you have any thoughts or feedback regarding HTTP/2-based streaming.

Related documentation


Nimble Streamer, Nimble Advertizer, HLS in Nimble Streamer, MPEG-DASH in Nimble Streamer


SVT-HEVC H.265 encoding setup in Nimble Streamer Transcoder

Nimble Streamer Live Transcoder supports various codecs using a number of encoding libraries. H.265 (HEVC) encoding has been supported only via NVENC and QuickSync hardware acceleration, so we were looking for the best way to provide a software alternative to that.

Now Live Transcoder can use SVT-HEVC for software encoding. The Scalable Video Technology for HEVC Encoder by Intel® is an HEVC-compliant encoder library core highly optimized for Intel Xeon™ Scalable and Xeon™ D processors. However, it can also be used on other hardware supported by Live Transcoder.


The output can be delivered by Nimble Streamer via any protocol which supports HEVC delivery.

The library is delivered with Nimble Live Transcoder and can be used like any other software encoder. The setup process is described below.

Install Live Transcoder


Live Transcoder is a premium add-on for Nimble Streamer freeware media server. You'll need to subscribe for its license in order to start using it.

You need to follow these installation instructions in order to set it up for further usage.

Create transcoding scenario


Live stream transcoding is set up using transcoding scenarios. Each scenario is a graphical representation of content transformation for video and audio ingredients. It has decoding elements to specify how the stream is decoded, filter elements to define the content transformation, and encoder elements to put the content into the right codec.

You can refer to Documentation reference for setup details, including video tutorials.

The next section explains how to use the encoder element to set up HEVC encoding with SVT-HEVC.

SVT-HEVC encoder settings


Once you've set up the designated transcoding scenario, add an encoder element for your output and choose libsvthevc from the list of Encoder field values.


You'll be able to specify Key frame alignment from the list of supported values, similar to those used in the libx264 setup. The profile value can vary between "main" and "main10". In addition to that, you can define other parameters specific to SVT-HEVC.
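For instance, additional settings are entered as key-value pairs in the encoder dialog. The names below are purely illustrative rather than actual option names; consult the SVT-HEVC documentation for the real parameters and their value ranges:
profile = main10
preset = 7
Here a preset-style parameter would trade encoding speed for quality, which is the typical tuning knob for SVT-HEVC.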

If you have any questions related to transcoding, feel free to contact us.

Related documentation

Live Transcoder, Transcoder documentation reference

Running encoders in out-of-process mode

Live Transcoder for Nimble Streamer allows using a number of encoder libraries for producing output stream content. All encoding activities are performed in the same process as the rest of the transcoding pipeline.

However, there are cases when a customer uses some experimental or legacy encoder in an unusual environment, which may be unstable under extensive usage. This can cause the main Nimble Streamer process to go down, which affects the overall robustness and stability of the streaming infrastructure.

To address that, we've made Live Transcoder support running encoders in out-of-process mode. It makes Nimble Streamer start a new separate process for each encoder in a transcoding scenario. If the encoder fails, its process is stopped and automatically re-started without affecting the overall transcoding. The output is not interrupted either, so your end-users will notice just a short image freeze.

By default, out-of-process encoding is enabled only if you choose the SVT-HEVC encoding library for a specific encoder. For all other encoders, this capability is disabled.

To enable this feature, open the encoder details dialog, enter the new nimble-execution-mode parameter on the left and its out-of-process value on the right, as shown on the image below.
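In other words, the encoder element gets this additional custom parameter, exactly as described above:
nimble-execution-mode = out-of-process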



Once you save the transcoding scenario and it's synced to the Nimble Streamer instance (which happens within 30 seconds), the change takes effect.


This improvement brings the ability to have more robust and reliable transcoding. Let us know if you have any questions regarding this feature.


Related documentation


Live Transcoder, Transcoder documentation reference

Introducing Apple Low Latency HLS in Softvelum products

HLS, as defined in RFC 8216, is currently a de-facto standard for live and VOD delivery. Playlists and chunks delivered over HTTP/1.1 provide simplicity of implementation, wide platform availability, adaptive bitrate support and scalability. These advantages allowed it to gain the biggest share of the customer base. However, it has some disadvantages for live streaming, primarily because chunk-based delivery means many seconds of delay at the start of a live stream. The industry has been making improvements on top of HLS, and then Apple released its spec update to address the existing issues.

Low Latency HLS


Low Latency HLS (LL-HLS) is the next generation of Apple's HTTP Live Streaming protocol introduced in early 2019.
Several key features improve it over regular HLS for live delivery:
  • Partial segments (parts) can be accessed before full chunks of content are available.
  • Server side can use HTTP/2 Push for sending parts.
  • The server can hold playlist requests so that the latest parts are obtained as soon as they appear.
The Softvelum team has added LL-HLS into its product bundle. We provide customers with playback capabilities using SLDP Player for iOS, and Nimble Streamer allows generating LL-HLS for all supported containers such as fMP4, audio-only and MPEGTS.

1. SLDP Player to play LL-HLS


Apple still has LL-HLS in beta stage as of iOS 13.3 at the end of December 2019, so there are a number of shortcomings in its implementation. The main concern is the fact that apps using the native iOS player's LL-HLS implementation cannot be published to the App Store yet. The lack of browser and other platform availability is also a big blocker so far. So the only way to try the playback for development purposes is to build your own app for that.

SLDP Player SDK for iOS allows having full-featured Low Latency HLS playback on iOS devices. It covers live streams from any source capable of LL-HLS like Wowza Streaming Engine and Nimble Streamer, and it also supports regular HLS from any available source.

If you'd like to build your own low latency playback app, you can get the player SDK from our team for further testing and integration. Once the LL-HLS technology goes from Apple beta to production (in early 2020 as per Apple), you'll be able to have a full-featured app and publish it under your brand.

2. Nimble Streamer for LL-HLS transmuxing


Nimble Streamer software media server allows taking any supported live input streams and re-packaging them into Low Latency HLS. Here are the steps you need to follow in order to make it work.

2.1 HTTP/2 and config setup


2.1.1. LL-HLS uses HTTP/2 over SSL as a transport protocol, so you need to enable it before performing any further setup.
Please follow this HTTP/2 setup article to get this protocol working for Nimble Streamer.

2.1.2. In addition to that, you need to add this parameter into nimble.conf and restart Nimble Streamer, read this page for config setup details:
hls_add_program_date_time = true
If a client tries to access an LL-HLS stream via HTTP/1.1, or if HTTP/2 is not properly set up, the player will fall back to regular (legacy) HLS and will not get any advantages of LL-HLS.

You can check whether HTTP/2 is working via the access log. To enable access logs, add this parameter into nimble.conf the same way you've done it for other parameters:
log_access = file
Once you re-start Nimble, you'll be able to view the log. On Ubuntu it's located in /var/log/nimble/access.log by default. Now when you request your regular HLS live stream via https:// using curl or an HTTP/2-capable player, you'll get this kind of record in the access log:
Dec 24 17:43:09 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/playlist.m3u8 HTTP/2" 200 84 1114 372 "-""AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
You can see HTTP/2 there, which means it's working. Otherwise the record will show HTTP/1.1, which means you need to check what's wrong. Contact us in case of issues.

2.2 Live streaming setup


Now you need to set up transmuxing settings via the WMSPanel web service. If you are not familiar with live streaming setup in Nimble Streamer, please refer to the live streaming digest page, or the respective input protocol pages, such as RTMP streaming. Please make sure you have a correctly set up live stream (like regular-latency HLS or SLDP) before trying to use LL-HLS.

Once you have a live stream set up in WMSPanel, go to the Nimble Streamer top menu and select Live streams settings. You will see the Global setting tab for the selected server (and you may create application-specific settings as well).


Currently, Nimble Streamer supports all 3 containers available for HLS; you can see their respective checkboxes on the screenshot above:
  • HLS - HLS with an audio-only container. Audio-only is optimized for audio delivery, having a reduced size. ID3 tags are also inserted in each audio part.
  • HLS (MPEGTS) - MPEG-TS, the only container with combined video and audio support for LL-HLS.
  • fMP4 - fragmented MP4. Notice that fMP4 container playback has a couple of issues related to the current Apple implementation of their player as of iOS 13.3; please refer to section "3. Known issues" below for more information.
Once you select any of those containers, WMSPanel will show the Enable Apple's Low Latency HLS checkbox, which you need to select. It will also show the HLS part duration edit box to define the parts' duration in milliseconds; we recommend using the default value of 1000 ms, see section "3. Known issues" for details.

Once LL-HLS is enabled, you need to re-start the input stream so Nimble Streamer can start producing the LL-HLS output stream.

2.3 Workflow and playlists


Now that the setup is complete, you can use the player to consume the stream using the usual playback URL. The main playlist will have proper chunklists whose content follows the LL-HLS spec, as shown in the example below.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:02.609Z
#EXTINF:5.995,
a_6_6016_1.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_12011_2_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_12011_2_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:08.604Z
#EXTINF:5.995,
a_6_12011_2.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_18006_3_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.362,URI="a_6_18006_3_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:14.599Z
#EXTINF:5.994,
a_6_18006_3.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.384,URI="a_6_24000_4_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:29:20.593Z
#EXTINF:6.016,
a_6_24000_4.fmp4?nimblesessionid=1


Parts in chunklist. Compared to regular HLS, you see a lot of lines representing parts, like this:
#EXT-X-PART:DURATION=0.512,URI="a_6_24000_4_0.fmp4?nimblesessionid=1"
The full chunk which contains these parts is described after all of its parts' lines:
a_6_24000_4.fmp4?nimblesessionid=1
All parts within chunks are numbered starting from zero. So "a_6_18006_3_0.fmp4" means it's the first part of chunk number 3.

Part length. This attribute declares the designated duration of upcoming parts:
#EXT-X-PART-INF:PART-TARGET=0.512
In this example it's 512 milliseconds.

Can block reload. Check this line:
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
The "CAN-BLOCK-RELOAD" declares that media server allows holding playlist request.

Hold playlist request. LL-HLS allows asking the server to hold a playlist request until a specific chunk and/or part of the stream is available.
So a player may request a part which is going to be available within a few seconds from now, and Nimble Streamer will check whether that part is available. Once the requested part is ready, Nimble returns the playlist.

Check this request example:
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=1&_HLS_msn=59&_HLS_part=5"
The _HLS_msn=59 and _HLS_part=5 parameters indicate that the server must hold the request until Nimble Streamer has part number 5 of chunk number 59 or later, and only then return a playlist. You can also use the _HLS_msn=59 parameter alone; in this case the playlist will be sent out only once the full chunk is available, as shown below.
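Such a chunk-level request, using the same stream as above but without the part parameter, could look like this:
curl -k "https://localhost:8443/livell/stream/chunks.m3u8?nimblesessionid=1&_HLS_msn=59"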

The chunklist returned for the part-level request shown earlier will look like this:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MAP:URI="audio.fmp4?nimblesessionid=1"
#EXT-X-TARGETDURATION:7
#EXT-X-MEDIA-SEQUENCE:55
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.536
#EXT-X-PART-INF:PART-TARGET=0.512
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:26.599Z
#EXTINF:5.994,
a_6_330006_55.fmp4?nimblesessionid=1
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:32.593Z
#EXTINF:6.016,
a_6_336000_56.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_342016_57_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_342016_57_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:38.609Z
#EXTINF:5.995,
a_6_342016_57.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_5.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_6.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_7.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_8.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_9.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_348011_58_10.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.363,URI="a_6_348011_58_11.fmp4?nimblesessionid=1"
#EXT-X-PROGRAM-DATE-TIME:2019-12-23T02:34:44.604Z
#EXTINF:5.995,
a_6_348011_58.fmp4?nimblesessionid=1
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_0.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_1.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_2.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_3.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_4.fmp4?nimblesessionid=1"
#EXT-X-PART:DURATION=0.512,URI="a_6_354006_59_5.fmp4?nimblesessionid=1"
You can see it ends with part a_6_354006_59_5.fmp4 - it's part number 5 of the upcoming chunk 59. That chunk will be available only a few seconds later, but the player can already perform the playback; this helps a lot with reducing the latency.

Push the requested part. In addition to requesting a specific part upon its arrival, a player may ask Nimble Streamer to make an HTTP/2 Push of that part to reduce the playback latency even further. This is done by adding the "_HLS_push=1" parameter to the URL. If we look at Nimble Streamer access logs, we'll see the following actions:

Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=0&_HLS_push=1 HTTP/2" 200 84 1114 372 "-""AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
Dec 24 18:43:04 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "PUSH /livell/stream/l_4_27008_9_0.aac?nimblesessionid=18 HTTP/2" 200 0 49896 662 "-""AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"

Dec 24 18:43:05 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "GET /livell/stream/chunks.m3u8?nimblesessionid=18&_HLS_msn=9&_HLS_part=1&_HLS_push=1 HTTP/2" 200 84 1180 341 "-""AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"
Dec 24 18:43:05 nimble[5500]: [nimble-access.log] 192.18.1.2 - - "PUSH /livell/stream/l_4_27008_9_1.aac?nimblesessionid=18 HTTP/2" 200 0 49828 568 "-""AppleCoreMedia/1.0.0.17C54 (iPhone; U; CPU OS 13_3 like Mac OS X; en_us)"

As you can see, the player sends the hold-playlist request (described earlier) with a specific part number and the _HLS_push=1 parameter. Nimble Streamer returns that playlist in response, and also makes an HTTP/2 Push of the requested part.

Performance. With all these specific actions, Nimble Streamer generates and serves parts within an HLS stream with high efficiency. From a resource consumption perspective, LL-HLS processing costs the same as handling regular-latency playlists and chunks.

3. Known issues and troubleshooting


At the moment, the Apple native player on iOS 13.3 has the following problems with its LL-HLS implementation which may affect the end-user experience.

1. fMP4 video+audio. If you use the fMP4 container, you will be able to get either the video or the audio component working. Video+audio fMP4 streaming is not working properly yet. You can try using the MPEGTS container for video+audio instead.

2. Part duration. If you set the part duration to less than 1000 ms, video will not work at all. So we recommend keeping the part duration at "1000".

We are sure those issues will be fixed in Apple's upcoming releases. Meanwhile, on iOS 13.3 you'll have to test with the aforementioned limitations.

3. Interleaving compensation. If you have a video+audio stream, you may have playback issues due to interleaving, as described in this article. This kind of issue becomes even more severe in case of low latency playback. In order to fix this, you can try enabling interleaving compensation with Min. delay set to zero, see the image below.





Feel free to try Nimble Streamer with Low Latency HLS and buy SLDP Player SDK to get your hands on the iOS playback.

Let us know if you have any questions.

The State of Streaming Protocols - 2019 Q4

The Softvelum team keeps tracking the state of streaming protocols. It's based on stats from the WMSPanel reporting service, which handles data from Wowza Streaming Engine and Nimble Streamer servers. This quarter WMSPanel collected data about more than 17.8 billion views. Total view time for our server products is 2.25 billion hours this quarter, or 24+ million view hours per day.

The State of Streaming Protocols - Q4 2019

You can compare these numbers with metrics from Q3 2019:

The State of Streaming Protocols - Q3 2019

Most protocols kept the same share, with HLS controlling most of the delivery landscape.


We'll keep analyzing protocols to see the dynamics. Check our updates at Facebook, Twitter and LinkedIn.

If you'd like to use these stats, please refer to this article by its original name and URL.

2019 summary

As the year 2019 is over, we want to recap the most significant products and features which we introduced during the past 12 months.

We'd like to remind you that you can track all the latest Softvelum changes via our Twitter, Facebook and LinkedIn posts and even our YouTube channel news podcasts.

Qosifire web service


This year we released a new product called Qosifire. It's a streaming quality monitoring web service which allows tracking live HLS, RTMP and Icecast streams. Qosifire checks stream correctness in terms of protocol integrity and general content consistency. Qosifire agent software checks streams 24/7 using your own server, then our web service console collects data for analysis and sends alerts via email and mobile push notifications.

Read more about why you may need Qosifire for your streaming infrastructure and how you can get started with Qosifire. In addition, read a Qosifire review article by Jan Ozer and find more information in Qosifire knowledge base.

You can also run a free 30-second checkup for your live stream without a sign-up.

Nimble Streamer bundle updates


As we explained in January, Flash has been progressively removed from all browsers, which caused the decline of RTMP playback. This primarily affects live low latency streaming, which is why we've been working on low latency improvements in our products.

SLDP. First of all, SLDP ABR capabilities ignited wide adoption among our customers. They use Nimble Streamer for their delivery edges and play their content via HTML5, Android and iOS players with a latency of just about a second.

Apple introduced Low Latency HLS earlier this year and released it to the developer community.
Now Apple Low Latency HLS is supported in Nimble Streamer with MPEGTS, audio-only and fMP4 containers. Read this introduction article which describes Nimble Streamer setup and LL-HLS usage. As of iOS 13.3, Apple hasn't moved LL-HLS out of beta yet, so we don't have a player app available. But our iOS SLDP Player SDK is able to provide this capability for our subscribers.
By the way, LL-HLS works on top of the HTTP/2 implementation available in Nimble Streamer. You can also use it for HLS and MPEG-DASH live streaming delivery.

SRT. Another outstanding technology which we improved over this year is SRT, reliable delivery over UDP. Being a member of the SRT Alliance, we contributed to the community and kept improving the user experience, allowing latency and maxbw tuning for better delivery reliability. Our products always carry the latest stable version of the SRT library to make sure they have all the latest improvements.

Icecast. Speaking of other improvements, we added more features related to Icecast metadata as described on our Icecast feature page.

SSL. For those of our customers who use Certbot with Let's Encrypt, we made a detailed description of using this duo with Nimble Streamer.


Live Transcoder has been improved in several ways as well. First, take a look at Transcoder overview screencast and Transcoder documentation reference page to see what we got.

We've added the SVT-HEVC software library for H.265/HEVC encoding in Live Transcoder for Nimble Streamer.
This feature utilizes the latest improvement, the ability to run encoders out-of-process, which protects the main Nimble Streamer process in case some encoder library crashes.

HEVC in general has been on the rise this year. To meet customers' demands, we've released experimental support of H.265/HEVC over RTMP in Nimble Streamer and the Larix Broadcaster apps.

As for encoder libraries, QuickSync hardware acceleration is now available on Ubuntu which makes it easier to install.

Nimble Advertizer was improved through this year to handle SCTE-35 markers.
Read the Advertizer spec for full details.

Reference pages. Last but not least, we added a couple of useful digest pages:


Mobile solutions


Our mobile solutions were improved over this year.

One of the most significant improvements is the addition of SRT playback to SLDP Player for Android and iOS. You can also take a look at our SRT digest page to find out more about product support for this technology.

As was mentioned earlier, our iOS SLDP Player SDK is able to provide Low Latency HLS playback capabilities for those who would like to try this new technology. Feel free to subscribe to iOS SLDP Player in order to obtain the latest version and build your own app with LL-HLS.

We also released Larix Screencaster for iOS - a highly anticipated addition to our mobile apps bundle.

Larix Broadcaster is now able to produce RTMPS and RTSPS, which means RTMP and RTSP can be delivered via SSL. It's a great advantage for those who would like to secure their live streams in insecure environments like mobile networks or public WiFi.
Larix also has ABR support for outgoing streams, which means it can lower bitrate and framerate according to network conditions.

Softvelum website now has Larix documentation reference which has links to all articles and pages related to mobile streaming with our products.

You can read SDKs release notes to find out more about our updates and releases.




Softvelum team wishes you a Happy New Year and looks forward to bringing you more features and improvements!



Follow us via Twitter, Facebook, LinkedIn and YouTube to get updates on time.

FFmpeg custom build support in Live Transcoder

Live Transcoder for Nimble Streamer supports a variety of decoding, filtering and encoding libraries. All the libraries which we ship were checked for reliability, performance and proper licensing before being added into the deployment package.

Our customers ask us to add new libraries into the Transcoder deployment package so they could be available by default in the UI. Mostly those are existing open-source encoders, commercial encoder libraries, or even custom encoders built by our customers themselves. However, we couldn't add all the libraries we were asked for, and this kept the doors closed for new functionality and gave a bad experience to our customers.

To solve this problem, it's now possible to use custom builds of FFmpeg libraries to utilize any video and audio encoders, as well as filters, which are not supported in the default Transcoder package. Live Transcoder uses FFmpeg and its libraries for certain tasks under the LGPL license, which allows re-building it as necessary. So now you can just add more libraries if you need them.

Linux packages of Live Transcoder can pick up custom libraries and use them for further encoding.
Re-building FFmpeg on Windows is also possible. If you are familiar with building FFmpeg for Windows you may try it; however, we do not provide support for this case.

Here's how you may re-build FFmpeg and use it further.

1. Legal disclaimer


This article describes the process of building custom third-party FFmpeg libraries and using them in Softvelum Live Transcoder in addition to the libraries which are deployed as part of Live Transcoder package.

Every custom library which is a result of building FFmpeg has its own licensing terms. So every library must be examined for its licensing terms prior to any usage or distribution, including but not limited to the patent licensing terms.

Softvelum, LLC is not responsible for any license or patent infringement which can occur as a result of any FFmpeg custom build usage by Live Transcoder users.

2. Building FFmpeg


This section describes how you can build FFmpeg for further usage in Transcoder.

We strongly recommend trying the custom build approach in a testing environment first. Once you get consistent results there, you may apply it to your production environment.

If something goes wrong after any of the steps and you'd like to revert it, just re-install Live Transcoder. This will rewrite all libraries with their default copies.

2.1 Making default FFmpeg build


To make sure your environment is ready for custom builds, let's start with building FFmpeg with the default libraries for Live Transcoder.

First, download the FFmpeg package. As required by the FFmpeg license, we've uploaded the FFmpeg package and its build script to our website.

Second, run the shell script in the same directory where you've just downloaded FFmpeg. It has all commands needed for getting a working copy of FFmpeg. Its compiled libraries can be used with Live Transcoder as is.

You may get errors related to missing packages, like the Freetype or Speex libraries. Just install the respective packages using this command:
sudo apt install libfreetype6-dev libspeex-dev
You'll be able to proceed with building after that.

2.2 Making and using custom FFmpeg build


Now that you have FFmpeg ready for builds, you may add a third-party encoder. The details depend on which encoder you'd like to add, so refer to your library's documentation for installation specifics.

Having the encoder installed, you need to modify your build script to include it. Find and modify the following line:
--enable-encoder=aac,png,mjpeg,customvideocodec,customaudiocodec \
Append your custom encoder name to that line. This is the name which is used within FFmpeg and which will later be used in Live Transcoder. In this case you can see "customvideocodec" and "customaudiocodec". You may also need to append additional lines for other parameters, so check the library documentation for more information.

You can find examples of other custom build scripts in our GitHub.

Once the build is over, you can use the new library.

2.3 Using libraries


You can ingest the libraries into Live Transcoder by copying them from the "build/lib/" subdirectory of the build directory into the proper location.

Run this command on Ubuntu to see where Transcoder libraries are located:
dpkg -L nimble-transcoder
Most probably your directory will be /usr/lib/x86_64-linux-gnu/.

On CentOS you can run this command to see where it is:
rpm -ql nimble-transcoder

Once you find the location, you can overwrite the libraries by copying them from your build directory to the Transcoder location.
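For example, on a typical Ubuntu setup the copy step might look like this; the exact set of .so files depends on your build, so adjust the names accordingly:
sudo cp build/lib/libavcodec.so* /usr/lib/x86_64-linux-gnu/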

2.4 Re-start Nimble Streamer


The last step to make those new libraries work is to re-start Nimble Streamer using the command required by your specific OS.

For Ubuntu it's this one:
sudo service nimble restart
You can find the commands for other OSes on the installation page.

3. Using custom libraries in Live Transcoder


Now that you have the custom library available, you can start using it from your Live Transcoder web UI.

Create a Transcoder scenario as usual and add a new encoder element. You can watch this tutorial video to see how transcoding scenarios are created.

For a custom video codec follow these steps:
  1. Drop a video encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" dropdown menu to the "FFmpeg" value.
  3. In the "Codec" field, specify the encoder name as defined in the video encoder library, e.g. "customvideocodec" in our example. See section 2.2 regarding the codec name in build parameters.


Custom audio codec is added the same way:

  1. Drop an audio encoder element.
  2. In the "Add output stream" dialog, set the "Encoder" field to "FFmpeg".
  3. In the "Codec" field, specify the encoder name as defined in the audio encoder library, e.g. "customaudiocodec" in our example. See section 2.2 regarding the codec name in build parameters.



Besides that, you can specify whatever parameters are supported by your custom encoder.

That's it. Using this technique you can use third-party libraries which are not yet available in Live Transcoder out-of-the-box.

If you have any questions regarding this feature set usage, don't hesitate to contact us and show us your use case.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder tutorial videos, Transcoder documentation reference

Mobile streaming to DaCast and Akamai

The Larix mobile SDK allows publishing live streams from mobile devices to a wide variety of destinations like media servers and streaming services. Some destinations require special handling due to authorization or other concerns.

The DaCast service provides a turn-key solution for live streaming. It uses the Akamai CDN for ingest and delivery, making it simple for an average user to get it working. However, Akamai has its own requirements for authorization and other stream parameters. Nimble Streamer already allows publishing to Akamai, so we've added the same support into Larix Broadcaster.

Here is how you can get it working.

Set up DaCast stream


We assume you already have a DaCast account, so just open the dashboard and add a new stream.



Click on the stream name to see its full setup details. Click on the Encoder tab on top to see the encoder setup details.

Click on Other RTMP encoder to see the full parameters of the RTMP connection.



Here you see Login and Password values which you will use later in Larix.

Now click on "Click if your encoder has one field" link to see a field with full URL for publishing.


Copy this full URL for later use, it should look like this:
rtmp://p.ep123456.i.akamaientrypoint.net/EntryPoint/dclive_1_150@123456

While you're in the DaCast dashboard, check the Publish settings tab to get a player page in order to check the future result of your setup.

Now let's get to Larix setup.

Set up Larix Broadcaster


Larix Broadcaster is available for both Android and iOS platforms so just install it as usual.

Open the app and enter settings by tapping on the gear icon.

Tap on Connections -> New connection to enter a form below.



  • Name field can contain any alias you want for your connection. Larix Broadcaster allows streaming to multiple destinations simultaneously, so this is how you will distinguish them from one another.
  • URL field defines the target destination. Insert the URL which you copied in the previous section.
  • Target type must be set to Akamai/DaCast.
  • Login and Password need to be exactly as you've seen in the DaCast connection settings.

Save the connection, go back to the connections list and make sure you select the new connection.
Now return to the image preview screen and just hit the red button to start streaming.

Now check the DaCast player page from the previous section to watch the results.

Akamai


This setup procedure applies the same way for publishing to the Akamai CDN via RTMP. The publishing URL will have the same structure with the same type of credentials. The Larix Broadcaster target type is also "Akamai/DaCast". Please refer to the Akamai website to learn more about its setup.



If you have any issues with this feature set, just let us know.

Related documentation


Larix mobile apps and SDK, Nimble Streamer RTMP feature set, Publishing to Akamai from Nimble Streamer

HbbTV MPEG-DASH support in Nimble Streamer

Hybrid Broadcast Broadband TV (HbbTV) has been working with MPEG-DASH for some time now, and the Nimble Streamer MPEG-DASH implementation supports it as well.

To enable this support, a specific profile needs to be added to outgoing DASH streams. This can be done by adding the following parameter into the nimble.conf file:

dash_live_profiles = urn:hbbtv:dash:profile:isoff-live:2012,urn:mpeg:dash:profile:isoff-live:2011

You need to re-start Nimble Streamer after changing the config. Read this page to learn more about operating the config file.
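For example, on Ubuntu the re-start is done with the following command; see the page above for other OSes:
sudo service nimble restart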

Related documents


MPEG-DASH support in Nimble Streamer

Live Transcoder control API

The Nimble Streamer Live Transcoder is well known for its drag-and-drop web UI which allows setting up live stream transformations of any complexity using any browser.

However, we have a number of users who need some automation of Transcoder operations.

Our team has introduced its first approach to a Transcoder API.

Visit this WMSPanel API page to see all details of API setup and usage.

The operations which you can perform on a Transcoder instance are as follows:

  • Retrieve the list of transcoder scenarios
  • Get details of particular scenario
  • Pause and resume particular scenario
  • Delete an existing transcoder scenario

So, having a set of scenarios for your servers, you can operate them just like you would from the scenarios list in the UI, as sketched below.
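As an illustration, a call might look like the following. The endpoint path here is hypothetical, so check the WMSPanel API page above for the actual routes, authentication parameters and response format:
curl "https://api.wmspanel.com/v1/server/YOUR_SERVER_ID/transcoders?client_id=YOUR_CLIENT_ID&api_key=YOUR_API_KEY"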

If you need more API calls, please feel free to share your requests via our helpdesk so we can prioritize features in our wishlist.

Related documentation


Nimble Streamer Live Transcoder, Transcoder documentation reference

Fallback of published RTMP, RTSP and Icecast streams

RTMP, RTSP and Icecast live streams can be pulled by Nimble Streamer for further processing, and in order to improve robustness, each pulled stream can have fallback streams. So if a primary stream cannot be pulled from the origin for some reason, an alternative stream is pulled as a failover. The playback is not stopped, so the user experience is not harmed much.

The aforementioned protocols are often used in publishing mode, when the stream is pushed into Nimble Streamer for processing. In this case there is no built-in way to cover failover.

Nimble Streamer provides another reliable mechanism for covering fallback of published RTMP, RTSP and Icecast streams: the Live Transcoder hot swap feature set. It allows shifting to a secondary stream if the primary one goes down for some reason, while maintaining the playback output for video and audio.

The following steps allow setting this up.

1. Install Live Transcoder


Hot swap feature set requires Live Transcoder premium add-on for Nimble Streamer.

There are two main reasons for Live Transcoder usage:

  • The secondary (substitution) stream needs to match the primary (original) stream in video resolution and audio sample rate.
  • The primary stream needs to be decoded in order to perform the substitution smoothly.

You need to obtain a license for Transcoder, then install the add-on and register a license for it.

2. Set up published inputs


You need to have both the primary (original) and secondary (substitution) streams set up and published into Nimble Streamer. In case you haven't done that yet, check the articles on RTMP, RTSP and Icecast publication setup.

3. Set up hot swap failover


Having both streams ready and the Transcoder installed, you can set up hot swap failover for them. Follow the instructions and make sure you complete all steps.

4. Test the setup


As always, you need to test the setup before using it in production. If you have any questions or issues, please contact our team so we can help.

Related documentation


Live streaming via Nimble Streamer, Failover hot swap, Emergency stream hot swap

Synchronized playback on multiple devices with SLDP

Playing a live stream simultaneously on multiple devices often requires synchronized playback.

The cases look simple:
  • One big screen shows something and viewers need to have the same audio on their individual devices.
  • A second-screen application needs to be in sync with an ongoing live stream on TV.
  • Multiple screens in the same room show the same content with a single source of sound.
  • A number of surveillance cameras need to be shown on the same page.

You probably know other cases where you might have the same requirement.

With traditional delivery protocols like HLS and MPEG-DASH, this is very hard to achieve without dramatically increasing the latency.

The SLDP live streaming protocol allows delivering streams in real time with low latency, adaptive bitrate and small zapping time. Now it also allows synchronizing the playback among devices and browsers for all the cases listed above. It's supported on both the server side and the client side.

Take a look at sneak previews below.

Here are two browsers running the same stream, with one of them catching up with the playback.



Here are an iPhone and an iPad running the same stream. In the first scene, the video is catching up with its counterpart; in the second scene, the audio is catching up with the video.



Let's see how you can use this feature with SLDP.

Notice that all implementations use an additional buffer to make proper synchronization possible, which will increase latency. This buffer must be the same across all platforms. Check each player platform for the parameter setup.

Enable feature in Nimble Streamer


We assume you are already familiar with Nimble Streamer setup and you have a working SLDP live stream. If not, please read SLDP setup and usage article to make it work.

On your server, edit nimble.conf to add this parameter and re-start Nimble Streamer:
sldp_add_steady_timestamps = true

You can visit Nimble Streamer parameters reference to learn more about operating that config file.

Once you re-start the server, every SLDP live stream will have the steady clock timestamps needed for playback adjustments. If a connected player doesn't request the steady clock time, Nimble Streamer will not include it in the output stream to avoid any overhead.

Playback in HTML5 SLDP player


If you want to have a synchronized playback in web browsers, use our freeware HTML5 SLDP player.

By default, the feature is turned off. To enable it, add the sync_buffer parameter which specifies the buffer size in milliseconds. Recommended values are from 1000 to 5000, and the value needs to be the same in all players.

Playback on iOS


SLDP Player for iOS allows enabling and using synchronized playback.
  1. Install SLDP Player via the AppStore.
  2. In the connection settings, enable the Steady clock flag as shown on the screenshot below.
  3. Use the Buffering field to define the buffer for this feature. As mentioned above, it needs to be the same in all players.



Playback on Android


Android SLDP Player will have that feature soon.



Once you start playback on multiple devices and browsers with this feature enabled, the playback on all devices will catch up.

Let us know how it works for you and what improvements you'd like to have for it.

Related documentation


SLDP technology overview, SLDP support in Nimble Streamer, Softvelum playback solutions

Using Certbot with Nimble Streamer working on port 80

When you start using Certbot with Nimble Streamer, you may face the case when Nimble Streamer is running on port 80. We'll show how this can be handled.

If you follow this instruction for Certbot, on step "4. Choose how you'd like to run Certbot" you need to choose the "No, I need to keep my web server running." option.

This setup assumes serving Certbot's '.well-known' folder via Nimble Streamer.

To make it available do the following.

1. Create a folder /pub/.well-known and assign nimble as its owner using the following commands:
sudo mkdir /pub/.well-known
sudo chown nimble:nimble /pub/.well-known
sudo chmod 775 /pub/.well-known
You can use any folder location, but please change it accordingly in other steps.

2. Go to the Nimble Streamer -> HTTP origin applications top menu. Click on Add origin application and set .well-known as shown on the screenshot:



3. Choose the Nimble Streamer -> Edit Nimble Routes menu, click on Add VOD streaming route and set it as shown on the screenshot.


4. Execute the following command to get certificates, with your_domain_name replaced with your domain name:
certbot certonly --webroot -w /pub -d your_domain_name
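Certbot saves the webroot settings from this run, so later renewals can reuse them; a renewal is then as simple as:
sudo certbot renew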

That's it. Now you may proceed with Certbot setup instructions from our original article.

Related documentation


Using Certbot with Nimble Streamer, SSL support for HLS, MPEG-DASH, Icecast, MPEG-TS and SLDP, Paywall feature set

Glass-to-glass SRT delivery setup

SRT delivery of live streams is gaining momentum as more companies add support for this protocol into their products. Being an SRT Alliance member, Softvelum provides extensive SRT support in various products.

Currently it's possible to create a glass-to-glass delivery with SRT using Softvelum products.


This article describes detailed setup of the following streaming scenario:
  • Content creator is streaming from Larix Broadcaster to Nimble Streamer server instance.
  • Nimble Streamer takes incoming SRT and provides output for playback.
  • Viewer uses SLDP Player for pulling live stream from Nimble Streamer.
To make this work, we'll set up each element of this chain. We'll show the setup within a local network and you can make it work across any network.

1. Set up Nimble Streamer


Before moving forward you need to complete the following steps:


We'll use an instance available via the 192.168.0.104 IP address in the local network; all mobile devices will be connected to the same local network.

Here's what we'll set up on Nimble Streamer side:

  • Larix Broadcaster app will use SRT in Push mode to deliver the content, so we'll set up Nimble in "Listen" mode to receive it.
  • SLDP Player will work in "Pull" mode to retrieve live stream for playback, so we'll set up Nimble output in "Listen" mode to take those requests and respond with content.

So let's set up both these elements.

1.1 Receiving input SRT via Listen


In WMSPanel, go to Nimble Streamer -> Live streams settings top menu, then choose "MPEGTS In" tab.


Now click on Add SRT stream to see a new dialog.


Here you need to choose Listen from the Receive mode drop-down box, enter 0.0.0.0 in Local IP and use some port that is available on your server, like 2020 in this case. The Alias is used for further reference in the UI.

Check the Add outgoing stream checkbox and define Application name and Stream name; this will create the proper output which we'll use on the next step.

Once you click Save, you'll see this setting being synced to the Nimble Streamer instance.


Now if you click on MPEGTS Out tab, you'll see that proper output has also been described for further use.


This is required because SRT uses MPEG-TS as its media transport, which requires this distinction due to the nature of that protocol. You can read more about MPEGTS setup in this article.

From this moment you'll be able to publish a stream into Nimble Streamer, so we have one more setup step left.

1.2 Providing the SRT output via Listen


Go to UDP streaming tab.


Click on Add SRT setting button to see the following dialog.


Set the Mode field to Listen. Local port is selected from available ports. Local IP is set to "0.0.0.0", which allows accepting requests on all interfaces.

If you want to use Nimble Streamer with connections from outside of your network, you need to make sure that your firewall and network in general are set up to allow those connections.

Source application name and Source stream name are defined as the app and stream name from the MPEGTS Out section above; those are "srt" and "output". This will direct the source content into the SRT output.

In addition, you may define maxbw and latency parameters in case you use some uncontrolled network for delivery. Read this article for more details.

Now that we have an instance of Nimble Streamer ready to work, we can set up a streaming app.
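Before moving to the mobile apps, you can optionally verify the listener with any SRT-capable tool. Here's a minimal sketch using ffmpeg, assuming it's built with libsrt and that the source file (a placeholder name here) contains H.264/AAC content:
ffmpeg -re -i test.mp4 -c copy -f mpegts "srt://192.168.0.104:2020?mode=caller"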

2. Set up Larix Broadcaster SRT streaming


Larix Broadcaster is a free app; you can install it from Google Play and from the AppStore. You can find out about all the features of Larix on the Android page and the iOS page, which have full lists of capabilities.

Let's use Larix Broadcaster for Android to set up live streaming to Nimble Streamer instance. Once you install and launch it, you'll see preview screen.


Click on gear icon to enter settings dialog.


You may keep default settings, or change some parameters like the resolution from Video menu. You can discover them by browsing the menus.

Now let's set up SRT output stream. Go to Connections menu.


We've previously set up some connections for testing purposes - you can see RTMP and RTSP both checked. This allows streaming simultaneously into two destinations. We need to add a new one, so tap on the New connection menu.


Here you need to add a name for your connection and then enter the publishing URL. The URL consists of the srt:// prefix, the address of the server and the port number. In our case it will be srt://192.168.0.104:2020/ - the IP of the server and the port which we used during the SRT setup on step 1.1. You may leave other options as they are.

After saving a setting you will see it in the list. You can un-check other connections if you want to stream only via the new one.


Now return to preview screen. You can push the big red circle button to start streaming.


The button will change its shape and you will see FPS and streaming duration on top and per-stream stats at the bottom.

Now let's watch this stream on another device.

3. Set up SLDP Player SRT playback


SLDP Player is a solution which allows playing SLDP low latency protocol on HTML5 pages and provides wide playback capabilities via Android and iOS apps. You can install it from Google Play and AppStore.

We'll use the Android app to demonstrate SRT playback. Once you install it, you'll see the connections menu, which will be empty.


Click on plus button to enter a dialog to create a new connection.


Here you will enter a connection Name and a URL. The URL in our case will be srt://192.168.0.104:2021/ where the IP and port are taken from step 1.2 above.

Once you tap Save, you'll see a new entry in streams list.


Now you can just tap on the name and start watching the stream.




That's it. You can change any of the described components for your streaming scenario as well as combine them with other products and features of our company.

Let us know if you have any questions about the described products and setup.

Related documentation


SRT support overview for Nimble Streamer, SRT setup in Nimble Streamer, Larix Broadcaster, Larix Broadcaster docs reference, SLDP Player

SLDP Player setup for Android

SLDP Player is an application which allows playing SLDP, SRT, Icecast, RTMP, HLS and MPEG-DASH streams on an Android device.

In this article we'll overview all settings of SLDP Player.

You can install it from Google Play and once it's installed, you'll see connections menu which will be empty.


Click on plus button to enter a dialog to create a new connection.


Name field sets the name of the current connection in the connections list.

URL is the field where you define your connection. Each connection has a common URI structure like protocol://server:port/app/stream/some-suffix, where the port can be skipped for some default value and the suffix may also not be needed:

  • HLS, MPEG-DASH and Icecast streams will have familiar URLs like http://servername:8081/app/stream/playlist.m3u8, https://servername/app/stream/manifest.mpd or https://servername/app/stream/icecast.stream
  • SLDP will have a URL like sldp://servername:8081/app/stream
  • RTMP will have URL like rtmp://servername:1935/app/stream - notice that stream name is appended to the end of the URL.
  • SRT address will look like srt://servername:1234/ . If you use streamid - see its description below.

Source type for HTTP is used when you play an HTTP-based protocol but the protocol cannot be determined from its URL parts like the playlist or manifest name. E.g. a URL like https://servername/live/stream.index can mean both HLS and MPEG-DASH, so in that case you should explicitly specify it. In other cases just leave it as Auto.

SLDP offset parameter allows decreasing start time for streams, read this article for more details.

Buffering defines the size of buffer used before playback starts. It's used to avoid re-buffering during connection issues.

Synchronized playback is described in this article.

Bitrate for Internet radio is for cases when you use SLDP for transmitting online radio with adaptive bitrate (ABR). This parameter defines the default bitrate which is used for starting the playback.

SRT passphrase and SRT pbkeylen are specific to your use case security settings so refer to your server admin for more details.

SRT latency and SRT maxbw are related to data re-transmission in SRT connection. Read this article to understand that better.

SRT streamid field is used only if your data source uses that field for identifying streams.


Once you tap Save, you'll see a new entry in streams list. In the example below we've saved SRT playback example from glass-to-glass SRT delivery article.


Now you can just tap on the name and start watching the stream.

You can also see that in action in this video.



Take a look at Softvelum Playback solutions and let us know if you have any questions.


Building NVENC-only pipeline with Nimble Transcoder

Live Transcoder for Nimble Streamer provides a wide feature set for transforming live content using both software libraries and hardware acceleration.

NVidia NVENC has always been fully supported in Live Transcoder for decoding and encoding, but all filtering operations were performed on the CPU. That caused extra resource usage for transferring processed frames among CPU, GPU and RAM.

Nimble Live Transcoder now allows building transcoding pipelines which run entirely on NVidia GPU hardware acceleration. This is done using specific FFmpeg libraries which we use in addition to our own code.

We'll show you how to set up this NVENC-powered processing chain.

1. Installation and initial setup


We assume you've already installed Nimble Streamer, set it up to receive an incoming live stream and tested basic streaming. In our example we'll use a stream whose application name is "input" and stream name is "source".

If you're not familiar with Live Transcoder, take a look at Transcoder documentation reference.

Notice that the described functionality is currently available on Ubuntu 18.04 only. We'll support upcoming LTS Ubuntu releases as well.

The basic steps to make NVENC working are as follows:

  1. Install the latest NVidia drivers on your server.
  2. Create a transcoder license and subscribe for it.
  3. Install Live Transcoder add-on.
  4. Create some simple scenario with CPU transcoding (e.g. downscale your stream to 240p). This way you'll make sure the transcoder was set up properly.
If you already have Transcoder installed, please run these commands to upgrade the package:
sudo apt-get update
sudo apt-get install nimble-transcoder
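
Before creating a scenario, it may be worth confirming that the system actually sees the GPU; the nvidia-smi utility ships with the NVidia driver and lists the detected devices along with the driver version:

# Verify the NVidia driver installation and list available GPUs
nvidia-smi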

Now create a new scenario to start a new pipeline setup.

2. Decoder setup


Once you create a new scenario, drag and drop a blue decoder element onto the dashboard. There you need to specify "NVENC-ffmpeg" in the Decoder field.


Once the incoming stream is received, Nimble Transcoder will use the proper NVDEC/CUVID FFmpeg decoder: h264_cuvid, hevc_cuvid or mpeg2_cuvid. Each decoder has its own set of options in case you'd like to fine-tune it or use its extended feature set.

One of those features, available for all decoders, is the ability to resize the frame during decoding. This operation is highly optimized and you can use it to reduce further resource usage. It's available via the "resize" parameter as shown in the picture below.


This feature is especially helpful when you have FullHD input and need to downscale it. That resolution requires a lot of resources to handle, so if you make an initial downscale to HD or even lower resolution during decoding, all further operations will consume less RAM and GPU processing power.
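
For illustration only, here is roughly what the same decode-time downscale looks like expressed as a plain FFmpeg command line; Nimble configures the decoder for you from the UI, and the file names here are hypothetical:

# Sketch: GPU decode with a decoder-side resize to 720p, then GPU encode
# (hypothetical input/output names; Nimble drives this from the UI)
ffmpeg -c:v h264_cuvid -resize 1280x720 -i input.ts -c:v h264_nvenc out.ts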

Notice that all forwarding features (subtitles and SCTE-35 markers forwarding) mentioned at the bottom of the dialog will work regardless of the decoding option you choose.

Now let's set up filtering.

3. Filtering


Once the frame is decoded, you can process it with a set of FFmpeg filters that work with NVENC hardware frames. Nimble Transcoder supports a number of them; here are the most frequently used:

  • "split" - allows creating several identical outputs from input video. It's available as a filter element in a tool box of Transcoder UI.
  • "scale_npp" performs frame scaling. You add a custom filter to your scenario, set its name to "scale_npp" and its value to resolution, e.g. "854:480" or "640:360".
  • "fps" is a filter which sets the frames per second value. It's also defined via custom filter.


Let us know if you need information about other filters.

In addition to the available NVENC-related filters, you can take the frame out of the GPU and process it separately using the "hwdownload" and "hwupload_cuda" filters. Add a custom filter, set its name as mentioned and leave the value field empty. "hwdownload" will get the frame from the GPU and "hwupload_cuda" will put it back after processing. Notice that this increases RAM/CPU usage, so use it only if you need to do something you cannot do on the GPU.
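
To make the filter naming more concrete, here is a hedged FFmpeg command-line analogy of a GPU-only chain using the filters above; Nimble builds the equivalent graph from the scenario UI, so this is only a sketch with made-up file names:

# Sketch: decode on GPU, scale to 480p with scale_npp, force 25 fps, encode on GPU
# (-hwaccel cuvid keeps decoded frames in GPU memory for scale_npp)
ffmpeg -hwaccel cuvid -c:v h264_cuvid -i input.ts \
       -vf "scale_npp=854:480,fps=25" -c:v h264_nvenc output.ts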

4. Encoder setup


Once the content is transformed via filters, you can encode it. Add an encoder element to your scenario and select "FFmpeg" in the "Encoder" field.

Then define "Codec" field as either h264_nvenc or hevc_nvenc - for H.264/AVC or H.265/HEVC codecs respectively.


For more encoder settings, refer to FFmpeg documentation.


Just like in the decoder element, all forwarding features listed under Expert setup at the bottom of the dialog will work properly.

5. Further setup


When you have the video pipeline set up, you need to define the audio part. If you don't need any sound transformation, you can add a passthrough just like it's described in other setup examples.

Click on the picture to see an example of such setup.


Here's what we have:

  • A decoder has a downscale to 720p as described in section 2 above.
  • A split filter produces 3 identical outputs.
  • One output goes directly to the encoder. It takes the downscaled frame and simply encodes it into the live/stream_720 output. The encoding parameters are similar to what you see in section 4.
  • Another output is processed via the scale_npp filter which scales it to 480p. That filter is described in section 3. Its output is encoded into the live/stream_480 output stream.
  • One more output of the split filter goes through "scale_npp" (to scale to 360p) and then through the "fps" filter which sets the fps value to "25". Then it's encoded into the live/stream_360 output.
  • Audio input is passed through to all 3 output renditions.

This scenario uses only NVENC capabilities for video processing. The output streams are then transmuxed into the output streaming protocols which you select in global server settings or in the specific settings for the "live" application.
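
As a rough command-line analogy of this whole scenario (hypothetical names, no encoder tuning), a single FFmpeg invocation could express the same split-based pipeline:

# Sketch: one GPU decode with 720p resize, split into 3 renditions, GPU encode each
# (stream/file names are hypothetical; Nimble builds this graph from the UI)
ffmpeg -hwaccel cuvid -c:v h264_cuvid -resize 1280x720 -i input.ts \
  -filter_complex "[0:v]split=3[v720][vb][vc];[vb]scale_npp=854:480[v480];[vc]scale_npp=640:360,fps=25[v360]" \
  -map "[v720]" -map 0:a -c:v h264_nvenc -c:a copy out_720.ts \
  -map "[v480]" -map 0:a -c:v h264_nvenc -c:a copy out_480.ts \
  -map "[v360]" -map 0:a -c:v h264_nvenc -c:a copy out_360.ts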

Later on we'll introduce a video tutorial showing this scenario creation step by step.

If you have any questions or issues, please feel free to contact us.

Related documentation


Live Transcoder for Nimble Streamer, Transcoder documentation reference

The State of Streaming Protocols - 2020 Q1

The Softvelum team keeps tracking the state of streaming protocols. It's based on stats from the WMSPanel reporting service, which handles data from Wowza Streaming Engine and Nimble Streamer servers. This quarter WMSPanel collected data on more than 17.8 billion views. Total view time for our server products is 3.04 billion hours this quarter, or 33+ million view hours per day.

The State of Streaming Protocols - Q1 2020

You can compare these numbers with metrics from Q4 2019:

The State of Streaming Protocols - Q4 2019

You can see a slight decrease of HLS share, with views shifting to progressive download and MPEG-DASH.

We'll keep tracking protocols to see the dynamics. Check our updates on Facebook, Twitter, Telegram and LinkedIn.

If you'd like to use these stats, please refer to this article by its original name and URL.

2020 Q1 summary

The first quarter of 2020 brought a lot of disruption into the lives of billions of people. The unprecedented global measures to reduce the harm from the pandemic require a lot of businesses to move online, work remotely and use live streaming more intensively.

Softvelum is fully committed to providing the best support to our customers, as always. Since the early days of our inception, we have been working remotely full-time, building and adjusting our business processes to keep efficiency while expanding our team. Now that we have to self-isolate in order to stay healthy, we keep doing what we've done through all these years, maintaining support at the same high level. Feel free to contact our helpdesk and take a look at the list of our social networks at the bottom of this message to stay in touch with us.

With the extreme rise of interest for live streaming, we keep working on new features, here are updates from this quarter which we'd like to share with you.



Mobile products

Mobile streaming is on the rise now, so we keep improving it.




SRT

SRT protocol is being deployed into more products across the industry. Our company was among the first to implement it in our products, and now we see more people building their delivery networks based on this technology. So we've documented this approach:
  • Glass-to-Glass Delivery with SRT: The Softvelum Way - a post for the SRT Alliance blog about building delivery from a mobile device through Nimble media server into a mobile player.
  • Glass-to-glass SRT delivery setup - a post in our blog describing the setup in full detail.
  • All of our products - Nimble Streamer, Larix Broadcaster and SLDP Player - now use the latest SRT library version 1.4.1.
  • Just in case you missed it, watch the vMix video tutorial for streaming from Larix Broadcaster to vMix via SRT, which can also be used as the source for such a delivery chain.


Live Transcoder

We are continuously improving Live Transcoder, so this quarter we made a number of updates to make it more robust and efficient. Here are the latest features:

  • You can now create transcoding pipelines based only on NVENC hardware acceleration which works for Ubuntu 18.04+. Read this setup article for more details.
  • FFmpeg custom builds are now supported. This allows using additional libraries that are not supported by Transcoder at the moment. Read this article for setup details.
  • Transcoder control API is now available as part of WMSPanel API. It's a good way to automate some basic control operations.


Nimble Streamer

Read the SVG News article about how Riot Games built their streaming infrastructure with various products, including Nimble Streamer.

A number of updates are available for Nimble Streamer this quarter.

Also, take a look at the State of Streaming Protocols for 2020Q1.


If you'd like to get our future news and updates, please consider following our social networks. We've launched a Telegram channel recently and we now make more videos for our YouTube channel. As always, our Twitter, Facebook and LinkedIn feeds keep showing our news.



Stay healthy and safe, our team will help you carry on!

SRT FEC (forward error correction) support in Nimble Streamer

Softvelum is an active adopter of SRT technology and Nimble Streamer has extended support for it.

One of the features introduced in the latest SRT library versions is the ability to set custom packet filters for SRT transmission. The first built-in filter is Forward Error Correction (FEC).

Before using this feature, please carefully read the SRT Packet Filtering & FEC documentation in the SRT library GitHub repo.

Disclaimer


We assume you are already familiar with SRT setup and usage, and you've successfully used SRT in other scenarios and use cases.

Before proceeding further, set up a test streaming scenario and make sure it works without additional filter.

The FEC filter is still under development, so here is what you need to consider first:
  • Use the FEC filter feature at your own risk.
  • It may crash the server, so if you face any issues, check Nimble Streamer logs to analyse the problem.
  • Try it with test streams first, and move to production only once you've made sure it works as expected.

Upgrade


In order to use this filter, you must upgrade Nimble Streamer and make sure you have the latest SRT library package.

  1. Nimble Streamer version must be at least 3.6.5-6, use this procedure to upgrade.
  2. SRT library package must be at least 1.4.1, use this page to get upgrade instructions.

Once you upgrade and re-start the Nimble Streamer instance, you may proceed with further setup.

Setup details


According to the SRT developers team, the FEC filter must be set on both the sender and receiver sides, and at least one side should define a configuration for it. In our example we'll define the configuration parameter on the sender side.

As was mentioned, we assume you've set up your SRT streaming scenario. Let's modify it to set up the sender part.

Sender


Go to "UDP streaming" tab on "Live streams setting" page and open your SRT setting. Scroll down to parameters list and add new parameter with "filter" name with a value which you fin appropriate, as shown on a screenshot below.



We use "fec,cols:10,rows:5" there but you can use any other value which you find appropriate for your case, please refer to FEC documentation to learn more.

Receiver


Now, on the receiver side, you also need to define the "filter" parameter with the "fec" value.

In the case of a Nimble Streamer setup, go to the "Live streams settings" page, "MPEG_TS In" tab, and open the existing incoming stream dialog. Then enter the "filter" parameter as shown below. Notice that you don't need to set more details because you've defined them earlier on the sender side.



This was an example of FEC usage in Nimble Streamer.

As we've mentioned earlier, in case of any issues please analyse Nimble Streamer logs to get more details. Look for existing issues and solutions among SRT issues on GitHub, and post your questions there if you have concerns with the FEC filter.

Related documentation


SRT support in Softvelum products, SRT in Nimble Streamer

iCloud support and file operations in Larix Broadcaster for iOS

Larix Broadcaster for iOS now has improved capabilities for recording live streams into local files.

Here are the options you can use for file storage:

  • Local storage available via macOS Finder as described in this article.
  • iCloud Drive
  • Photo Library

You can also split video into sections, like it's usually done in dash cams. The recording is divided into multiple files by length.

Here's how you can set this up.

First, install Larix Broadcaster from the App Store.

Go to the app's Settings / Capture and recording / Record menu.


Turn on Record stream to automatically record any live stream.

Tap on Storage to select the default storage for recorded videos and screenshots. This will be one of the 3 options mentioned above.

Split video into sections allows defining the length of the segments the recording will be split into. It's Off by default.

Once you make recordings, go to the Settings / Manage saved files menu.


Here you can long-tap a file name to move it to the proper destination. You can also tap Edit to perform operations on multiple files. The iCloud tab will show the content of the respective iCloud Drive folder. The recorded or copied files can be found in the respective folders.

Take a look at a brief video overview of this feature.





Let us know if you'd like any improvements to this feature set.

Related documentation


Larix Broadcaster, Larix documentation reference, Softvelum YouTube channel
