Channel: Softvelum news: Nimble Streamer, Larix Broadcaster and more

Compensating the interleaving in Nimble Streamer

Each live stream includes two parts: video and audio. They usually go hand in hand, but sometimes they are out of sync, meaning the video or audio stream is delayed relative to its counterpart. This behavior has been seen in some MPEG-TS hardware encoders, such as Elemental and Digital Rapids.

Small delays are usually compensated for by player software, but once the delay becomes significant - up to a few seconds - it leads to playback failure in some players, such as VLC.

This issue is overcome by adding interleaving compensation. It creates a buffer of incoming frames and sorts them before further output. This solves the described problem but introduces some delivery delay and resource overhead, because buffering and sorting are performed for the entire stream all the time. The operation is resource-consuming and adds latency to the streaming process, as each frame is placed into the buffer and must be held at least until the next frame arrives.
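The buffering-and-sorting approach can be sketched as follows (an illustrative Python sketch, not Nimble's actual implementation; the class name and limits are hypothetical):

```python
import heapq

class InterleaveBuffer:
    """Illustrative sketch: hold incoming frames in a buffer ordered
    by timestamp and release the oldest one once the buffer reaches
    its limit, producing a monotonic output sequence."""
    def __init__(self, max_items=250):
        self.max_items = max_items
        self.heap = []   # (timestamp_ms, frame) pairs

    def push(self, ts_ms, frame):
        heapq.heappush(self.heap, (ts_ms, frame))
        out = []
        # If the buffer is full, the oldest elements are sent out.
        while len(self.heap) > self.max_items:
            out.append(heapq.heappop(self.heap))
        return out

    def flush(self):
        return [heapq.heappop(self.heap) for _ in range(len(self.heap))]

# Frames arriving out of order come out sorted by timestamp:
buf = InterleaveBuffer(max_items=2)
released = []
for ts in [0, 120, 40, 80, 160]:
    released += buf.push(ts, "frame")
released += buf.flush()
assert [ts for ts, _ in released] == [0, 40, 80, 120, 160]
```

Note the trade-off visible in the sketch: every frame sits in the buffer until enough later frames have arrived to push it out, which is exactly the latency and memory cost described above.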

This is why Nimble Streamer has this compensation disabled by default: you need to specifically enable it. As always, this is done via the WMSPanel web control UI. You can enable it both for the entire server and for any specific application.



To make the compensation work, select the "Enable interleaving compensation" checkbox. This will show new fields for controlling the parameters.

Set interleaving compensation server-wide.
Set interleaving compensation for specific application.

Min. delay specifies when Nimble needs to start buffering. The default value is 1000 ms, which means that if the delay between video and audio is less than 1 second, no compensation is performed. As soon as the delay reaches this value, the data is buffered and sorted. If the delay changes over time - e.g. from 0.4 to 1.5 seconds - then compensation is applied only once the delay reaches the "Min. delay" value.
If you set this value to 0, your streams will be sorted all the time, producing monotonic sequences.
The default value of 1 second is enough to skip the processing of small deviations that play fine in existing players such as VLC.

Max. delay specifies the maximum delay that needs to be handled; its default value is 3000 ms. This parameter is needed in case the source encoder malfunctions, e.g. produces invalid component timestamps.

Max. queue items defines how many elements - e.g. frames - can be stored in the buffer. The default value is 250, which corresponds to 5 seconds of a 50 fps stream. If the buffer is full, the oldest elements are sent out.
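To pick a queue size for your own frame rate, the same arithmetic applies (a small sketch; the 250-item default and 50 fps figure come from the text above, the helper name is hypothetical):

```python
def max_queue_items(fps, seconds_to_cover):
    """How many buffered elements are needed to cover a given
    time span at a given frame rate."""
    return int(fps * seconds_to_cover)

# The default of 250 items covers 5 seconds of a 50 fps stream:
assert max_queue_items(50, 5) == 250
# A 30 fps stream needs fewer items for the same 5-second window:
assert max_queue_items(30, 5) == 150
```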

The described approach can be used in a number of corner cases of live stream processing, while maintaining high performance and efficient resource usage.

If you need any help on this feature set, feel free to contact us.


Intel Quick Sync video encoder parameters in Nimble Streamer Transcoder

Intel® Quick Sync technology provides efficient encoding capabilities. It uses hardware acceleration for video encoding via the Intel® processors feature set, and software encoding in all other cases.

Nimble Streamer Transcoder now allows using Intel® Quick Sync as an H.264 video encoder in transcoding scenarios.

Let's take a look at encoder settings available at the moment.

First of all, take a look at Quick Sync encoder usage in our web UI.


As you see, it takes just a few clicks to use Quick Sync as encoder.

Now let's see which parameters you can use to control the encoding process. The set is similar to the previously described libx264 encoder settings, but has its specifics. In the encoder settings dialog box you can add any of the parameters described below.



preset

It's mapped to target usage and indicates the trade-off between quality and speed. The actual number of supported target usages depends on the implementation.

  • "ultrafast" - best speed
  • "veryfast"
  • "faster"
  • "fast"
  • "medium" - balanced
  • "slow"
  • "slower"
  • "veryslow" - best quality


profile

Specifies the codec profile. Set it explicitly, or the Quick Sync functions will determine the correct profile from other sources, such as resolution and bitrate.

  • "baseline"
  • "main"
  • "high"
  • "extended"
  • "high422"

async_depth

Specifies how many asynchronous operations an application performs before it explicitly synchronizes the result. If zero, the value is not specified. Set async_depth=1 for low-latency encoding. Read this article for more details.

level

Specifies the codec level. Two-digit values map to dotted levels, e.g. 41 stands for level 4.1.

  • 0 - unspecified codec level; the correct level is determined from other sources, such as resolution and bitrate
  • 1
  • 1b
  • 11
  • 12
  • 13
  • 2
  • 21
  • 22
  • 3
  • 31
  • 32
  • 4
  • 41
  • 42
  • 5
  • 51
  • 52


keyint

Number of pictures within the current GOP (Group of Pictures).

  • 0 - the GOP size is unspecified
  • 1 - only I-frames are used.


bf

Maximum number of B-frames between non-B-frames.

  • 0 - unspecified
  • 1 - no B-frames

cgop

When this flag is set, frames in a GOP do not use frames from the previous GOP as reference. If the flag is not set, the encoder generates an open GOP, where frames prior to the first frame of the GOP in display order may use frames from the previous GOP as reference; frames subsequent to the first frame of the GOP in display order do not.
The AVC encoder ignores this flag if idr_interval is set to 0, i.e. if every GOP starts with an IDR frame; in this case the GOP is encoded as closed. This flag does not affect long-term reference frames.

scenecut

When set, the encoder strictly follows the GOP structure defined by the keyint, bf and other parameters. Otherwise, the encoder may adapt the GOP structure for better efficiency, within the limits defined by keyint, bf, etc. See also the descriptions of i_adapt and b_adapt.

idr_interval

Specifies the IDR-frame interval in terms of I-frames:

  • 0 - I-frame is an IDR-frame.
  • 1 - every other I-frame is an IDR-frame, etc.

If keyint or bf is zero, idr_interval is undefined.
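The interplay of keyint and idr_interval can be illustrated with a small sketch (hypothetical helper name; the relationships follow the parameter descriptions above):

```python
def gop_structure(fps, keyframe_seconds, idr_interval):
    """Derive a GOP size from a desired keyframe period, and show
    how idr_interval spaces IDR frames in terms of I-frames."""
    keyint = int(fps * keyframe_seconds)   # pictures per GOP
    # idr_interval = 0: every I-frame is an IDR frame.
    # idr_interval = 1: every other I-frame is an IDR frame, etc.
    i_frames_per_idr = idr_interval + 1
    return keyint, i_frames_per_idr

# A 2-second GOP at 30 fps means keyint=60:
assert gop_structure(30, 2, 0) == (60, 1)
# With idr_interval=1, only every other I-frame is an IDR frame:
assert gop_structure(30, 2, 1) == (60, 2)
```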

rate_control

Sets bitrate control methods.

  • cbr - use the constant bitrate control algorithm; "bitrate", "init_bufsize", "bufsize" and "max_bitrate" may be specified.
  • vbr - use the variable bitrate control algorithm; "bitrate", "init_bufsize", "bufsize" and "max_bitrate" may be specified.
  • cqp - use the constant quantization parameter algorithm; "qpi", "qpp" and "qpb" must be specified.
  • avbr - use the average variable bitrate control algorithm. The algorithm focuses on overall encoding quality while meeting the bitrate specified by "bitrate", within the "accuracy" range, after a "convergence" period. This method does not follow HRD, and the instant bitrate is not capped or padded. "bitrate", "accuracy" and "convergence" must be set.
  • la - use the VBR algorithm with look-ahead. It is a special bitrate control mode designed to improve encoding quality. It works by performing extensive analysis of several dozen frames before the actual encoding and, as a side effect, significantly increases encoding delay and memory consumption. The only available rate control parameter in this mode is "bitrate"; "max_bitrate" and "init_bufsize" are ignored. To control the look-ahead depth, use the "la_depth" parameter. This method is not HRD compliant.
  • icq - use the intelligent constant quality algorithm. This algorithm improves the subjective video quality of the encoded stream; depending on content, it may or may not decrease objective video quality. Only one control parameter is used: the quality factor, specified by "icq_quality".
  • vcm - use the video conferencing mode algorithm. This algorithm is similar to VBR and uses the same set of parameters: "init_bufsize", "bitrate" and "max_bitrate". It is tuned for the IPPP GOP pattern and streams with strong temporal correlation between frames, where it produces better objective and subjective video quality than the other bitrate control algorithms. It does not support interlaced content or B-frames, and the produced stream is not HRD compliant.
  • la_icq - use the intelligent constant quality algorithm with look-ahead. The quality factor is specified by "icq_quality"; to control the look-ahead depth, use the "la_depth" parameter. This method is not HRD compliant.
  • la_hrd - use the HRD-compliant look-ahead rate control algorithm.
  • qvbr - use the variable bitrate control algorithm with constant quality. This algorithm tries to achieve the target subjective quality with the minimum number of bits while the bitrate constraint and HRD compliance are satisfied. It uses the same set of parameters as VBR, plus a quality factor specified by "qvbr_quality".
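A quick way to keep these requirements straight is a lookup of which parameters each mode strictly needs (an illustrative sketch derived from the list above; the table and function names are hypothetical):

```python
# Parameters each rate_control mode requires, per the list above
# (only modes with hard "must be specified" requirements shown).
REQUIRED = {
    "cqp":  {"qpi", "qpp", "qpb"},
    "avbr": {"bitrate", "accuracy", "convergence"},
    "la":   {"bitrate"},
}

def missing_params(mode, params):
    """Return the required parameters absent from a settings dict."""
    return REQUIRED.get(mode, set()) - set(params)

# cqp without qpb is incomplete:
assert missing_params("cqp", {"qpi": 24, "qpp": 26}) == {"qpb"}
# avbr with all three required parameters is fine:
assert missing_params("avbr", {"bitrate": 2000, "accuracy": 50,
                               "convergence": 10}) == set()
```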


bitrate

Sets the bitrate in Kbps; must be specified for cbr and vbr.

init_bufsize

Sets how full the rate buffer must be before playback starts. If it is equal to zero, the value is calculated from the bitrate, frame rate, profile, level, etc.

bufsize

Sets the size of the rate buffer in kilobits (Kb). If it is equal to zero, the value is calculated from the bitrate, frame rate, profile, level, etc.

max_bitrate

Sets the maximum bitrate in Kbps.

qpi, qpp, qpb

Quantization parameters for I, P and B frames, respectively, for the CQP mode.
It's a value in the 1…51 range, where 1 corresponds to the best quality; 0 uses the Quick Sync default.

accuracy

AVBR accuracy, in tenths of a percent.

convergence

AVBR convergence period, 1..100 frames.

icq_quality

This parameter sets the quality factor for the ICQ bitrate control algorithm.
It's a value in the 1…51 range, where 1 corresponds to the best quality.

qvbr_quality

QVBR quality factor.
It's a value in the 1…51 range, where 1 corresponds to the best quality.

slices

Number of slices in each video frame; each slice contains one or more macroblock rows.
If "slices" equals zero, the encoder may choose any slice partitioning allowed by the codec standard. Slices contain a fixed number of macroblock rows; the number of rows per slice is calculated automatically. When the number of rows is not evenly divisible, slices may differ in size by one row.
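The macroblock-row arithmetic can be sketched like this (an illustration only; 16x16 macroblocks are assumed, as in H.264, and the helper name is hypothetical):

```python
def rows_per_slice(height, slices, mb_size=16):
    """Split a frame's macroblock rows across slices; when the row
    count is not evenly divisible, slice sizes differ by one row."""
    mb_rows = -(-height // mb_size)          # ceil(height / 16)
    base, extra = divmod(mb_rows, slices)
    return [base + 1] * extra + [base] * (slices - extra)

# 1080p has 68 macroblock rows (1088 / 16); 4 slices get 17 rows each:
assert rows_per_slice(1080, 4) == [17, 17, 17, 17]
# 720p has 45 rows; 4 slices differ by at most one row:
assert rows_per_slice(720, 4) == [12, 11, 11, 11]
```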

ref

Number of reference frames;
if "ref" = 0, this parameter is not specified.

pict_struct

Sets the picture structure.

  • default - progressive
  • tff - interlaced, top field first
  • bff - interlaced, bottom field first


cavlc

CAVLC value

  • 0 - CABAC is used;
  • 1 - CAVLC is used;

mbbrc

Setting this flag enables macroblock-level bitrate control, which generally improves subjective visual quality. Enabling it may have a negative impact on performance and objective visual quality metrics. The values are 0 and 1 for "off" and "on".

la_depth

Specifies the depth of the look-ahead rate control algorithm: the number of frames the encoder analyzes before encoding. The valid range is 10 to 100 inclusive. Set it to zero to use the default value.

trellis

This option is used to control trellis quantization in AVC encoder.

  • 0 - Turn trellis quantization off for all frame types.
  • 1 - Turn on for I frames.
  • 2 - Turn on for I and P frames.
  • 3 - Turn on for I,P,B frames.

b_pyramid

This option controls usage of B frames as reference. This parameter is valid only during initialization.

  • 0 - Do not use B frames as reference.
  • 1 - Arrange B frames in so-called “B pyramid” reference structure.

i_adapt

This flag controls insertion of I-frames by the encoder. Turn it on to allow changing the frame type from P or B to I. This option is ignored if scenecut=1.

  • 0 - don't allow changing the frame type from P or B to I
  • 1 - allow changing the frame type from P or B to I

b_adapt

This flag controls changing the frame type from B to P. Turn it on to allow such changes. This option is ignored if scenecut=1.


fps_n, fps_d
Set the output FPS numerator and denominator. This only affects the num_units_in_tick and time_scale fields in the SPS.
If fps_n=30 and fps_d=1, the result is 30 FPS.
If fps_n=60000 and fps_d=2002, the result is 29.97 FPS.
The source stream FPS or filter FPS is used if fps_n and fps_d are not set.
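The numerator/denominator pairs above reduce to the familiar rates (a quick check using Python's fraction arithmetic):

```python
from fractions import Fraction

def fps_value(fps_n, fps_d):
    """Frame rate expressed by a numerator/denominator pair."""
    return Fraction(fps_n, fps_d)

assert fps_value(30, 1) == 30
# 60000/2002 reduces to 30000/1001, the NTSC 29.97 rate:
assert fps_value(60000, 2002) == Fraction(30000, 1001)
assert abs(float(fps_value(60000, 2002)) - 29.97) < 0.001
```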




These are the parameters currently available for controlling the Intel® Quick Sync video encoder.
We keep improving our transcoder feature set, contact us for any questions.

Related documentation


Enable hardware acceleration for Intel Quick Sync in Windows

Nimble Streamer Transcoder supports Intel® Quick Sync technology for both software video encoding and hardware-accelerated encoding using the Intel® processors feature set. As we've described earlier, it now allows using Quick Sync as an H.264 video encoder in transcoding scenarios.

Software encoding is available in our Transcoder by default while hardware acceleration needs to be enabled separately.

Let's see how hardware acceleration is enabled on Windows platform for Nimble Streamer.

Install Nimble


We assume you've already installed Nimble Streamer and the Transcoder on top.

Install Intel® Media SDK


To make Quick Sync work, you need to install Intel® Media SDK first. Follow this link to fill in the form and get the SDK installation package.

Application isolation workaround


In Windows, starting from Windows Vista, the implementation of hardware acceleration relies on interaction with the hardware graphics drivers. The system isolates services in a non-interactive environment called "Session 0 Isolation". Applications running in this isolation don't have access to hardware drivers, which means that when Nimble Streamer is launched as a system service, Quick Sync hardware acceleration isn't available.



This is why you need a workaround: run Nimble Streamer as an application in an environment with full access to the graphics hardware.

You need to create a setup where Nimble Streamer is started when the server is rebooted and runs in an environment with full access to the hardware.

You'll need to complete the following steps:
  1. Create a Windows Scheduled Task that runs Nimble Streamer when a particular user logs on to the server.
  2. Configure the server to automatically log on after reboot.

Create Windows Scheduled Task


To create a Windows Scheduled Task that runs Nimble Streamer, follow these steps.

Open Task Scheduler (Start > All Programs > Administrative Tools > Task Scheduler).

On the Action menu, click Create Task and set the following values.

General tab
  • In Name, enter Nimble Streamer.
  • Click Change User or Group and select the user that will be used to automatically log on to the server and run Nimble.
  • Select Run only when user is logged on.
  • Uncheck Run with highest privileges.
  • Uncheck Hidden.
Triggers tab
  • Click the New button.
  • In the New Trigger dialog box, in Begin the task, select At log on.
  • In the Settings area, select Specific user, click the Change User button, and then select the same user as in the previous step (if not already selected).
  • In Advanced settings, select the Enabled option and clear all other options.
Actions tab
  • Click the New button.
  • In the New Action dialog box, in Action, select Start a program.
  • In the Program/script field, enter <<Nimble Streamer full exe path>>.
  • Leave the Add arguments (optional) field blank.
  • In the Start in (optional) field, enter <<Nimble Streamer full path>>.
Conditions tab
Clear these options:
  • Start the task only if the computer is idle for
  • Start the task only if computer is on AC power
  • Wake the computer to run this task
  • Start only if the following network connection is available
Settings tab
  • Select Allow task to be run on demand option.
  • Clear Run task as soon as possible after a scheduled start is missed option.
  • Clear If the task fails, restart every option.
  • Clear Stop the task if it runs longer than option.
  • Select If the running task does not end when requested, force it to stop option.
  • Clear If the task is not scheduled to run again, delete it after option.

Now click OK.

Now when you click the Task Scheduler Library folder icon in the Task Scheduler contents panel, you will see a list of active scheduled tasks, and the new Nimble Streamer task should be displayed in it. If you right-click the Nimble Streamer task and select Run, a command prompt will open and a Nimble Streamer instance will start. If a command prompt doesn't open, double-click the Nimble Streamer task and make sure that it's configured according to the instructions above.

After you've set up the task properly, log off the computer and log back on as the user for which you configured the task to run. You will see a command prompt open and Nimble Streamer start. Proceed to the next section once you have this working.

Set server to auto-logon on reboot


Now you need to set Windows to automatically log on as the user for which you configured the scheduled task to run. There are several methods to configure auto-logon.

There are also third-party tools that help to make auto-logon more secure by better protecting the user name and password of the target user and locking the server after logon. You may consider LogonExpert as such tool. When using LogonExpert and the Lock computer after logon security setting, you may need to set the Delay computer lock for [x] seconds value to 20 seconds to enable Intel Quick Sync acceleration to work properly.

Make sure that you configure power settings on the server such that the current user isn't automatically logged off and the server doesn't go into Sleep or Hibernation mode. Also, make sure to adjust power settings so that the server always runs at full performance.


You're all set now and you can use hardware acceleration on Windows.

We keep improving the Transcoder feature set, contact us for any questions.

Related documentation


Live Transcoder for Nimble Streamer, Live Streaming features, Build streaming infrastructure with Nimble Streamer

Intel is a trademark of Intel Corporation in the U.S. and/or other countries.

AAC LATM header support in Nimble Streamer

Nimble Streamer has a wide variety of audio-related features and advantages. This includes processing of AAC in any incoming stream regardless of its protocol.

Speaking of AAC, most audio encoders produce AAC with ADTS headers which is commonly used over the Internet.

Now Nimble Streamer supports the LATM header in addition to ADTS. LATM is widely used in satellite streaming over DVB, DVB-T and DVB-T2.

This header can be processed in incoming RTSP and MPEG-TS streams for further transmuxing into other protocols for live streaming. The stream can also be handled in our Live Transcoder for further transformation.

The State of Streaming Protocols - June 2016

WMSPanel team continues analyzing the state of streaming protocols.

The metrics calculations are based on nearly 3 billion views. The stats are collected from 2800+ media servers (Nimble Streamer and Wowza).

Protocol shares remain stable: HLS holds about 70%, with progressive download at 9%.

The State of Streaming Protocols, June 2016


You can compare that to May stats below.

The State of Streaming Protocols, May 2016



This report is brought to you by the WMSPanel team. We'll keep analyzing protocols to see the dynamics. Check our updates on Facebook, Twitter, Google+ or LinkedIn.

June updates

This month we've released a new mobile screencasting app and introduced Quick Sync Video in our Transcoder, as well as some other features.

P2P


But first, check out a new article from our P2P partner Peer5 - HLS with Nimble - which describes Nimble Streamer setup for both VOD streaming and RTMP transmuxing to HLS. The resulting stream can be used as a source for the Peer5 serverless CDN, which greatly reduces traffic coming from the origin.
Check our P2P streaming overview page showing different P2P capabilities and use cases.

Transcoder


Our latest product Nimble Streamer Live Transcoder now supports Intel® Quick Sync Video technology for hardware acceleration.
You can check our Quick Sync featured page for all details including Quick Sync encoder setup parameters and the process of enabling Quick Sync HW acceleration on Windows and on Linux.
This new integration allows using full power of Intel processors for your streaming scenarios.

We're often asked how our transcoder differs from the well-known FFmpeg library. We've released an article describing the benefits of our product over FFmpeg.

Mobile SDK and Screencaster


Our mobile SDK can be used for a number of live streaming cases. One of them is screencasting when you transmit the content of your screen to any destination.
We introduce Larix Screencaster, which is based on the mobile SDK and is capable of streaming via RTMP or RTSP.
You can get it on Google Play and set it up as described in this article.

Get it on Google Play

You can get our mobile SDK here.
In case you used it before, notice that it has a number of new features and improvements including the aforementioned screencasting.

Nimble Streamer


Nimble Streamer now has enhanced interleaving compensation. It helps a lot in cases when the video or audio stream has some delay.

Another enhancement is LATM header support for AAC while streaming via RTSP and MPEG-TS.

One more interesting feature is live HEVC transmuxing into HLS. This is useful when your end users' devices are capable of H.265 playback.



Last but not least: check the State of Streaming Protocols for June 2016.


Follow us on Facebook, Twitter, Google+ or LinkedIn to get the latest news and updates on our products and services.

Enable hardware acceleration for Intel Quick Sync in Linux

Nimble Streamer Transcoder supports Intel® Quick Sync technology for both software video encoding and hardware-accelerated encoding using the Intel® processors feature set. Nimble Streamer allows using Quick Sync as an H.264 video encoder in transcoding scenarios.

Once you have Quick Sync installed, the software encoding is available in our Transcoder by default while hardware acceleration needs to be enabled separately.

The instructions below describe how to enable hardware acceleration on Linux. CentOS 7.1 is required for Quick Sync acceleration to work properly; other distributions and versions may not work.


1. Install CentOS


Install 64-bit CentOS 7.1-1503 from http://vault.centos.org/7.1.1503/isos/x86_64/

When installing, use the "Development and Creative Workstation" base environment.
Do not update the system via yum update - the installed default components are required.

2. Install Intel® Media Server Studio

Install Intel® Media Server Studio Free Community Edition for Linux from https://software.intel.com/en-us/intel-media-server-studio according to the Intel® Media Server Studio Getting Started Guide:

As root:

# usermod -a -G video [LOGIN]

As regular user:

$ tar -xvzf MediaServerStudio*.tar.gz
$ cd MediaServerStudio*
$ tar -xvzf SDK*.tar.gz
$ cd SDK*
$ cd CentOS
$ tar -xvzf install_scripts*.tar.gz

As root:

# ./install_sdk_UMD_CentOS.sh
# mkdir /MSS
# chown {regular user}:{regular group} /MSS

As regular user:

$ cp build_kernel_rpm_CentOS.sh /MSS
$ cd /MSS
$ ./build_kernel_rpm*.sh

As root:

# cd /MSS/rpmbuild/RPMS/x86_64
# rpm -Uvh kernel-3.10.*.rpm
# reboot


3. Install Nimble Streamer Transcoder

Follow the Transcoder CentOS 7 installation procedure.

You're all set now and you can use hardware acceleration on Linux with our Live Transcoder.

We keep improving the Transcoder feature set, contact us for any questions.

Related documentation

Live Transcoder for Nimble Streamer, Live Streaming features, Build streaming infrastructure with Nimble Streamer, Transcoder support for Intel® Quick Sync, Enabling hardware acceleration on Windows

Intel is a trademark of Intel Corporation in the U.S. and/or other countries.

EditListBox MP4 primitive support in Nimble Streamer

Nimble Streamer media server has a wide VOD feature set, which includes transmuxing MP4 files into HLS and MPEG-DASH. MP4 is supported as video+audio, video-only, audio-only, original MP4 and the Apple QuickTime extension.

Some MP4 files created by the FFmpeg tool use specific edts->elst MP4 atoms (known as EditListBox). They are used when audio and video are written unsynchronized, to tell the player or other client software how to perform proper playback.

Nimble Streamer allows using MP4 files with the EditListBox atom for further transmuxing into HLS and MPEG-DASH. The resulting media will be fully synchronized and playable in any player.


Contact us if you face any other MP4-specific issues.

Related documentation

Nimble Streamer, VOD streaming, Live streaming capabilities, MPEG-DASH feature set, HLS feature set


Manipulating audio channels in Nimble Streamer Live Transcoder

Live Transcoder for Nimble Streamer has a rich audio transformation feature set based on audio filters and AAC decoding/encoding. Besides common operations like volume control or changing the sample rate or bitrate, you can set up any FFmpeg filters.

Let's look at an interesting channel manipulation task that can be accomplished with Nimble Streamer Live Transcoder.

There's a stream with video and stereo audio where each audio channel carries its own language. The goal is to create 2 streams with identical video and different audio, each stream having its own language.

We assume you've already installed the Transcoder. If not, use this procedure. You can check our videos describing the general setup process to get familiar with the UI. As a result you need to set up a transcoding scenario as shown below.


As you can see, first we've made 2 passthrough videos.

The next step is to split the audio after it's decoded, to avoid any resource usage overhead.
Then comes the key element - the "Pan" filter, which does all the magic. It's applied to each of the split audio streams. You can read the FFmpeg audio manipulation page for more details about pan filter usage.

Check the two pictures below to see each filter's settings. Those are:

  • stereo|c0=c0|c1=c0
  • stereo|c0=c1|c1=c1
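The effect of these two pan mappings can be sketched in Python (an illustration of the channel routing only, not FFmpeg's implementation; the helper name is hypothetical):

```python
def pan_stereo(samples, left_src, right_src):
    """Route source channels into a stereo output, mimicking
    mappings like stereo|c0=c0|c1=c0 (duplicate the left channel)."""
    return [(frame[left_src], frame[right_src]) for frame in samples]

# Interleaved stereo samples: (left = language A, right = language B)
frames = [(0.1, 0.9), (0.2, 0.8)]

# stereo|c0=c0|c1=c0 - both output channels carry language A:
assert pan_stereo(frames, 0, 0) == [(0.1, 0.1), (0.2, 0.2)]
# stereo|c0=c1|c1=c1 - both output channels carry language B:
assert pan_stereo(frames, 1, 1) == [(0.9, 0.9), (0.8, 0.8)]
```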




All you need after that is to add 2 standard AAC audio encoders.

That's it. When the scenario is saved, it's applied to the Transcoder instance and you get 2 streams with the same video and different audio languages. Those streams will be delivered via any protocol you defined for the corresponding applications, e.g. RTMP, HLS etc.

To set up more audio encoding parameters, please check the Audio encoder parameters video on our YouTube channel.

Feel free to visit the Live Transcoder webpage for other details and contact us if you have any questions.

    Related documentation


VP6, VP8 and VP9 transmuxing in Nimble Streamer

The live transmuxing feature set of Nimble Streamer now covers the VP6, VP8 and VP9 codecs. Let's see how you can use them.

VP6 via RTMP

The VP6 codec is supported only in the RTMP protocol. Nimble Streamer has a wide RTMP feature set, so you can create full-scale VP6 transmuxing scenarios, which include the following:

  • receiving VP6 content as a published stream;
  • getting RTMP pulled streams, including the pull-by-request scenario;
  • transmuxing VP6 for RTMP playback;
  • performing RTMP re-publishing to allow delivery to other destinations like your edge servers or third-party CDNs.


VP8 and VP9 via RTSP

The VP8 and VP9 codecs are supported only in the RTSP protocol. Nimble Streamer also has an RTSP feature set, and VP8/VP9 transmuxing scenarios cover the same set of features.

VP6, VP8 and VP9 playback capabilities all have support for our paywall features like hotlink protection, geo-blocking and the pay-per-view framework.

Another great security option is publish control for VP6/VP8/VP9 streaming, which allows applying your business logic to streams published by third parties.



As you see, Nimble Streamer allows full-scale streaming with all the mentioned codecs, from input to multiple-destination output. This is done with high performance and low resource usage, which allows building delivery networks with a low cost of ownership.

If you have any feature requests for the VP codec family, feel free to contact us.

Publishing RTMP to Akamai

Even though Nimble Streamer allows building live streaming networks, there are cases when an external CDN needs to be used to provide additional geo coverage and off-load your network during peaks.

This is why we added Akamai CDN RTMP publishing support to Nimble Streamer. It's based on the RTMP re-publishing scenario with a few additional steps.

Here is the setup instruction.



Enable interleaving compensation

Akamai requires synchronized video and audio in your live stream, so you need to enable interleaving compensation for the source application used for further publishing. This application needs to be used as a target for the encoder or transcoder which you use as an origin for your stream.

Another requirement for the source stream is that its timestamps must be sequential.

Now, being logged into WMSPanel, go to the Nimble Streamer -> Live streams settings top menu and open the application settings.


Now click on Add application settings to open the dialog below. Here you enter the name of the application that will publish the RTMP stream to Akamai, so your encoder needs to have that stream as output.

    Adding application for further Akamai publishing.

You can define the application name as well as its publishing login and password if required. Selecting the RTMP protocol from the list of check-boxes should be enough.

Click on Enable interleaving compensation and fill in the fields as follows:

  • Min. delay: 0
  • Max. delay: 10000
  • Max queue items: 500

Then save this setting to the designated servers. Your apps list will be appended with the new item.



Notice that you also need to define the interface and port for RTMP publishing in the Interfaces tab.


    Now your stream is ready for re-publishing to Akamai.

    Set up re-publishing with Akamai authentication


Go to the Republishing tab.


    Now click on Add Akamai RTMP button to see the dialog below.



Here you need to fill in the fields:

  • Source application - the app which you previously created.
  • Source Stream - the stream created by your source encoder.
  • Akamai Stream ID - provided by Akamai.
  • Destination address and Port - also defined by Akamai.
  • Destination application and Stream - defined by you.
  • Authorization schema - set to "Akamai".
  • Login and Password - must be known to you.

After saving these changes you'll see a new entry in the list.



That's it. Once you start streaming from your origin encoder, you'll get the stream in the CDN, available via the playback URL provided by Akamai.


Other useful features

Besides Akamai, see some other examples of republishing.

For other live streaming scenarios, check our live streaming use cases.

We also recommend using our Live Transcoder for content transformation activities. It outperforms FFmpeg while providing a flexible web UI.

With all that, you can create flexible delivery chains using Nimble Streamer for media hubs and WMSPanel as an easy-to-use control panel. Install Nimble Streamer if you haven't done that yet, and contact us if your streaming scenarios need any updates to the described functionality.

    Related documentation


    Live Streaming features, Live Transcoder for Nimble Streamer, RTMP feature set, Build streaming infrastructure with Nimble Streamer

    AC3 support in Nimble Streamer transmuxing

    Nimble Streamer has an extended audio feature set. This includes processing AAC as a primary codec. However, many people use other codecs.

    Nimble Streamer supports AC3 audio codec for transmuxing.

    • As input, AC3 is supported in RTSP and MPEG-TS incoming streams.
    • As output, AC3 is supported in RTSP, MPEG-TS and HLS outgoing streams.

    This could easily be supported in MPEG-DASH as well if modern players could handle it in live streaming scenarios.

    Also check the full list of codecs supported by Nimble Streamer.

    Publishing RTMP to Limelight CDN

    Our customers use Nimble Streamer for building live streaming networks with their own infrastructure. However, there are cases when an external CDN is needed to provide additional geo coverage and off-load your network during peak hours or extraordinary events.

    Thus we added Limelight CDN RTMP publishing support to Nimble Streamer. It's based on the RTMP re-publishing scenario with a few additional steps. With Limelight RTMP republishing you can reach a broader audience within just a few clicks.

    Here is the setup instruction.


    Set up incoming stream


    We assume you've already installed Nimble Streamer and you're familiar with live streaming setup process.

    If you're not, please check the installation instructions for your platform and also read our how-tos on transmuxing incoming streams using various protocols as input.


    The same re-packaging engine is used in all scenarios, so whatever protocol you use as input, you'll get all supported streams as output. This includes RTMP, which can be played via a public URL as well as re-published to any destination.

    Once you follow one of the mentioned scenarios, you'll get your stream ready for re-publishing to Limelight.

    Set up re-publishing to Limelight


    Go to the Nimble Streamer -> Live streams settings menu to see the live streaming setup UI, then choose the Republishing tab.


    Now click on Add Limelight RTMP button to see the dialog below.



    Here you need to fill the fields:

    • Source application - the source app
    • Source Stream - the source stream Nimble already has
    • Destination address and Port, then Destination application and Destination Stream - as defined by Limelight
    • Authorization schema - set to "Limelight"
    • Login and Password - your Limelight login/password

    After saving these changes you'll see a new entry in the list.




    That's it. Once you start restreaming, you'll get the stream in the CDN. Check your output stream URLs provided by Limelight.

    Other useful features


    For other live streaming scenarios, check our live streaming use cases.

    We also recommend using our Live Transcoder for content transformation activities. It outperforms FFmpeg while providing a flexible web UI.

    Having that, you can create flexible delivery chains using Nimble Streamer for media hubs and WMSPanel for easy-to-use control panel. Install Nimble Streamer if you still haven't done that and contact us if your streaming scenarios need any updates of described functionality.

    Related documentation


    Live Streaming features, Live Transcoder for Nimble Streamer, RTMP feature set, Build streaming infrastructure with Nimble Streamer

    Block users by HTTP Referer header via WMSAuth

    Nimble Streamer paywall capabilities cover several aspects of content protection. Today we add another enhancement - access control based on HTTP "Referer" header.

    You may add unwanted Referers as regular expressions to a deny list to prevent them from accessing your streams.

    Let's see how this is set up.

    Create Referers group


    First, let's go to "Control" -> "WMSAuth paywall setup" top menu to see the page with the list of current WMSAuth groups. Click on Referer groups to see the page below.


    Click on Add Referer group to create a new tab for defining the group.



    Enter a regular expression for the Referer which you don't want your viewers to use and click Add rule.


    You may test it against any string by clicking on Test Referer string.


    Here you can enter a referer value and see if it matches your regular expression.
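    As a sketch of how such a deny-list pattern behaves, here is a small Python example; the domain and the regular expression are made up for illustration and are not part of the product:

    ```python
    import re

    # Illustrative deny-list pattern: block requests whose Referer points at
    # "badsite.example" (placeholder domain, with or without "www.").
    deny_referer = re.compile(r"^https?://(www\.)?badsite\.example(/|$)")

    def is_blocked(referer: str) -> bool:
        """Mimic the Referer-group check: deny when the header matches the pattern."""
        return bool(deny_referer.search(referer))

    print(is_blocked("http://badsite.example/player.html"))  # True
    print(is_blocked("https://goodsite.example/embed"))      # False
    ```

    Anchoring the pattern with `^` and terminating the host with `(/|$)` avoids accidentally matching domains that merely contain the blocked name, such as `notbadsite.example`.
    
    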

    Now that your Referer group is defined, you can create a blocking WMSAuth rule.

    Create WMSAuth group and rule


    Go back to WMSAuth groups main page and click on Add WMSAuth group. After entering a name you'll see new group tab.

    Here you can assign the servers which will block your viewers by Referer. Simply choose one from the list and click on Assign server.


    Now click on Add rule to create a rule for applying referer block.


    Enter any name and then specify the application name to be affected. You may also specify a stream name if needed, so the rule applies to that application + stream pair.

    Then scroll down to the restrictions section where you can see Allow and Deny lists.


    There you can also see drop-down lists with countries, IP ranges, User Agent groups and, finally, Referer groups.

    Click the ">>" button to add the designated Referer group to the Deny list.

    After clicking Update WMSAuth rule, your restriction will be applied within a few seconds.


    That's it - if a player sends an HTTP Referer header in any request and its content matches your regular expression, the request will be blocked.

    The State of Streaming Protocols - July 2016

    WMSPanel team continues analyzing the state of streaming protocols.

    The metrics calculations are based on 3.16 billion views. The stats are collected from 2800+ media servers (Nimble Streamer and Wowza).

    Protocol shares remain stable: HLS holds about 71% while progressive download still has 9%.

    The State of Streaming Protocols, July 2016

    You can compare that to June stats below.




    The State of Streaming Protocols, June 2016

    This report is brought to you by WMSPanel team. We'll keep analyzing protocols to see the dynamics. Check our updates at Facebook, Twitter, Google+ or LinkedIn.

    July news

    July was very intense in terms of new features.

    Audio streaming


    Icecast/SHOUTcast transmuxing was entirely re-worked and now processes audio streams via the same engine as used for RTMP, RTSP and MPEG-TS. The output protocols are the same, plus HLS and MPEG-DASH.
    It also allows using our Live Transcoder for audio transformation.
    Read this article to see how you can now set up Icecast transmuxing.
    Please upgrade to or install the latest Nimble Streamer and use the new Icecast management interface in live pull settings to get all the new benefits.

    Live Transcoder 


    Live Transcoder also allows manipulating audio channels, e.g. splitting a stereo signal into 2 separate streams. Read this article for details.

    Speaking of Live Transcoder, we've added Linux support for Intel QuickSync. So if you have Intel-powered hardware, you should check this capability.

    CDNs support


    We've added support for RTMP publishing into the most popular live CDNs - Akamai and Limelight.
    They have special requirements for RTMP authentication and we have them covered now.
    Read more about Limelight RTMP setup and Akamai RTMP setup.

    Codecs support in transmuxing


    We've added a few more codecs into our transmuxing engine.


    All new codecs are used in pass-through mode, with no changes to the content.

    You can see the full list of supported codecs here. There are plenty of them.

    Also, we've updated MP4 support to handle the EditListBox (elst) primitive for VOD transmuxing.

    Paywall update


    Our customers were asking for HTTP Referer header support in our paywall feature set, so we've added Referer groups into the WMSAuth engine. They can be used in Deny lists.

    Mobile broadcasting SDK


    Our SDK for iOS was updated with live rotation capabilities that work the same way as on Android.
    You can install Larix Broadcaster from AppStore to try it out and get mobile SDK on our website.


    WMSPanel API


    WMSPanel control API was updated with HTTP origin alias methods.
    We've also updated the API reference page with new UI for your convenience. It's now easier to find any functionality or method.



    Last but not least: check the State of Streaming Protocols for July 2016.


    Follow us at Facebook, Twitter, Google+ or LinkedIn to get the latest news and updates of our products and services.

    PCM G.711 audio support in Nimble Streamer

    Server logging in Nimble Streamer

    Like any server software, Nimble Streamer allows tracking its behavior and performance via logging.

    By default, Nimble Streamer server logs are written to /var/log/nimble.log on Linux and to the application log folder on Windows. However, you can control that in a config file. Read this article to see how you can define the destination for your logs.

    You may also choose the logging level via web UI. The more detailed the logging level, the more information you'll get in the output.

    All settings are applied without server restart, which allows you to easily manage logging behavior depending on your needs.



    Go to the Servers menu and click the name of the server you need to change.



    On the server details page, click Edit to see the following page.


    Here you can see Log mode drop-down selector. It has the following values:

    • error - the default mode, it writes only errors
    • info - everything from "error" mode plus general information about server operation
    • verbose - adds more details on top of "info" mode output
    • debug - the most detailed output.
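    The modes are cumulative: each one includes everything written by the modes above it in the list. A tiny Python sketch of that nesting (the ordering is from the article, the helper itself is illustrative):

    ```python
    # Log modes ordered from least to most verbose, as listed above.
    LOG_MODES = ["error", "info", "verbose", "debug"]

    def included_levels(mode: str) -> list:
        """Return which message levels a given log mode writes (cumulative)."""
        return LOG_MODES[: LOG_MODES.index(mode) + 1]

    print(included_levels("error"))    # ['error']
    print(included_levels("verbose"))  # ['error', 'info', 'verbose']
    ```
    
    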

    We recommend enabling debug log mode in case you want to contact our support team.

    As mentioned earlier, all settings are applied dynamically without server restart.

    If you need more info about access logs, check this article.

    Related documentation


    Nimble Streamer, Nimble Streamer config file description

    Subtitles support for VOD

    Nimble Streamer has a wide VOD streaming feature set which covers many HLS-related capabilities like ABR streaming and multiple tracks support.

    Now Nimble Streamer is capable of adding subtitles to VOD streams.
    Supported formats include WebVTT, SRT and TTML.

    Let's see how they can be used. Major scenarios for VOD subtitles usage can be split into two categories:

    • Simple scenario - a VOD file with one rendition has one corresponding subtitles file in one language. This is the most common use case and it's handled in a simple way.
    • Complex scenarios - multiple languages for multiple renditions or other versions of the content. These are based on SMIL files and can cover various use cases.

    Let's take a look at both of them.

    One subtitles file and one rendition


    If you have just one subtitles file, name it the same way as the original MP4 file, like this:
    /home/user/content/mp4/sample_with_subtitles.srt
    /home/user/content/mp4/sample_with_subtitles.mp4
    So when you have a VOD route set, your playlist URL will look like:
    http://your.domain/mp4/sample_with_subtitles.mp4/playlist.m3u8
    And the playlist itself will look like this:
    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="sub",NAME="English",URI="subtitle.m3u8?nimblesessionid=91",LANGUAGE="eng"
    #EXT-X-STREAM-INF:BANDWIDTH=1049607,CODECS="avc1.66.30,mp4a.40.2",RESOLUTION=424x240,SUBTITLES="sub"
    chunk.m3u8?nimblesessionid=91
    Now the player will be able to get subtitles using the URI mentioned there.
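    The naming convention can be sketched in a few lines of Python; `companion_subtitle` is a hypothetical helper for illustration, not part of Nimble Streamer:

    ```python
    from pathlib import Path

    def companion_subtitle(mp4_path: str, ext: str = ".srt") -> str:
        """Subtitle file Nimble looks for next to an MP4: same base name, subtitle extension."""
        return str(Path(mp4_path).with_suffix(ext))

    print(companion_subtitle("/home/user/content/mp4/sample_with_subtitles.mp4"))
    # /home/user/content/mp4/sample_with_subtitles.srt
    ```
    
    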

    Default language

    Unlike TTML, the WebVTT and SRT formats carry no language code in their structure, so there's no way to determine it. This is why we added the vod_subtitle_default_language_id parameter to the Nimble configuration file. It's an alpha-3 language code and defaults to "eng" if nothing else is specified.
    If you need to combine different languages on the same server, see the next section.
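    For example, to default to Spanish, the parameter could be set in the Nimble configuration file like this (a sketch; only the parameter name comes from this article, the value is an example alpha-3 code):

    ```
    vod_subtitle_default_language_id = spa
    ```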

    Multiple subtitles and/or renditions


    The SMIL format is a way to describe VOD content which combines video and audio, like the aforementioned ABR streaming.

    This format can also be used for subtitles: you can specify which subtitle files are used for which video files. With a SMIL file, the resulting URL will look like this:
    http://your.domain/path/smil:bunny.smil/playlist.m3u8
    Let's take a look at some examples. All sample SMIL files and playlists mentioned below can be found on WMSPanel github page.

    One WebVTT subtitles file for 4 renditions of same video


    A simple use case: you have just one subtitles file and need to apply it to multiple renditions.
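    As a rough illustration only - the actual sample files are on the WMSPanel github page - such a SMIL file might look like the sketch below, where file names, bitrates and attribute values are placeholders:

    ```
    <smil>
      <body>
        <switch>
          <!-- four renditions of the same video (two shown here) -->
          <video src="bunny_450.mp4" system-bitrate="450000"/>
          <video src="bunny_750.mp4" system-bitrate="750000"/>
          <!-- one WebVTT subtitles file applied to all of them -->
          <textstream src="bunny.vtt" system-language="eng"/>
        </switch>
      </body>
    </smil>
    ```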


    Two SRT subtitle files for 4 different renditions


    Check 2 "textstream" elements specifying applied English and Spanish SRT subtitles.


    One TTML subtitles file with 5 languages for 4 renditions


    Here one textstream element references a TTML file containing 5 languages.


    Mix of languages and renditions


    You can see "first_group" of subtitles (see subgroup="first_group") containing one set of languages (English, German and Norwegian) and it's applied to bigbuckbunny_450_first.mp4 and bigbuckbunny_750_first.mp4 files.
    The default group of subtitles contains English, French and Spanish and it's applied to bigbuckbunny_450.mp4 and bigbuckbunny_750.mp4 files that a basically same movies but have probably other audio tracks or logos etc.
    So defining groups of subtitles and assigning them to each "video" tag will make Nimble Streamer produce corresponding playlist entries.
    If you don't specify the groups then all languages will be used by the player to display to the user.


    As you can see, simple SMIL syntax allows mixing various combinations of content and subtitles, so it's up to you to define what's best for your case.

    Contact us if you have any questions or suggestions regarding this feature set.

    Related documentation


    Nimble Streamer, HLS feature set, VOD streaming in Nimble, TTML format

    The State of Streaming Protocols - August 2016

    WMSPanel team continues analyzing the state of streaming protocols.

    The metrics calculations are based on 3.17 billion views. The stats are collected from 2800+ media servers (Nimble Streamer and Wowza).

    Protocol shares remain stable: HLS holds about 73%, with progressive download near 11% and RTMP around 10%.

    The State of Streaming Protocols, August 2016

    You can compare that to July stats below.





    The State of Streaming Protocols, July 2016

    This report is brought to you by WMSPanel team. We'll keep analyzing protocols to see the dynamics. Check our updates at Facebook, Twitter, Google+ or LinkedIn.