Storyboard Suite includes a media plugin for Storyboard Engine that supports the playback of several different types of audio and video formats. The media plugin provides a common action interface for creating and controlling media playback using a variety of different backend media player services.
It is the backend services that perform the decoding of content. The decoded content may be displayed through a Storyboard External Buffer render extension, which is composited into the Storyboard application, or the service may render to discrete hardware layers available on the platform, which are composited with the Storyboard application directly in hardware.
External media backend services are provided for particular operating systems and hardware based on their underlying media support. Currently Storyboard provides two external media backends, which are described in the section called “Media Backend Services”.
Audio and video support is not universally available on all embedded target platforms. It is, however, available on the macOS, Linux and Windows desktop simulation platforms.
Video and audio playback and control are accomplished by using a set of defined actions. When creating a media application you must include these media actions in your Storyboard project. The media actions must be installed on a project-by-project basis and are defined as Action Templates. The media action template is located in your Storyboard Suite installation root at Samples/ActionDefinitions/media.sbat. Copy this file to the templates directory of your project in order to be able to use media actions in your project. Once you have copied the file over, you will have to close and re-open your application to see the media actions.
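For example, on a desktop development host the copy might be done from a shell as follows; the installation and project paths are placeholders that depend on where Storyboard Suite and your project are located:

cp <Storyboard Suite install root>/Samples/ActionDefinitions/media.sbat <your project directory>/templates/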
In order to create a Storyboard application that will play video, you must first add a control to your project containing an External Render Extension. This render extension will be the display target where the video content will be rendered, so size and position it appropriately in your application. In the properties of the external render extension you must set the Buffer Name and Object Path values. These values will be required to play the video properly and will be used to point to where the video content should be displayed.
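As an illustrative example, you might set the Buffer Name to video_buffer and the Object Path to video_layer.video_control (both values are placeholders chosen for this example); the same two values are reused later when configuring the media action parameters.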
To initiate video playback, you will use one of the media actions. You will need to add the predefined media action gra.media.new.video to your application, which will tell the media plugin to play your video. As discussed in Chapter 8, Connecting Events to Actions, this action can be configured as a response to any event.
In the action parameters, you will specify the Channel_name parameter, which will be used as the name of the Storyboard IO channel that will be created for communication with the media plugin. The channel name should not conflict with the application name and should be relevant to its use; a good suggestion might be media. Specify the project-relative name of the video file to play in the Media_name parameter, for example video/myvideo.webm.
Additionally, fill in the parameters that link the media command to the external buffer where the playback is to occur. For this you configure External_buffer_name with the same name as the Buffer Name used in the external render extension, and Object_name with the same name as the Object Path. The Output_width and Output_height should also match the dimensions of the external buffer. Finally, the Output_depth parameter should be set to 4.
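A filled-in gra.media.new.video action might therefore look like the following sketch, where the buffer and object names are placeholders that must match your external render extension and the dimensions must match its size:

Channel_name: media
Media_name: video/myvideo.webm
External_buffer_name: video_buffer
Object_name: video_layer.video_control
Output_width: 640
Output_height: 480
Output_depth: 4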
Other media actions allow you to control the video playback operation once it has been started, providing pause, resume, stop and seek commands. Storyboard Designer includes a media application among its samples. For more information on loading samples, refer to the section called “New Project from a Storyboard Sample”.
The media backend service does the work of decoding, playing and controlling the media based on requests from the Storyboard application over a Storyboard IO channel. The default Storyboard IO channel name is com.crank.media_backend. This value can be overridden by setting the SBMEDIA_CHANNEL_NAME environment variable to a new value. The FFmpeg plugin is loaded automatically when it is present in the Storyboard plugins directory. If the gstreamer-backend is going to be used to play media, the FFmpeg plugin must be removed from the Storyboard plugins directory; otherwise both backends will compete to service media requests.
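For example, on a Linux target you might override the channel name before starting the backend and the Storyboard Engine; the channel name shown is only an example:

export SBMEDIA_CHANNEL_NAME=my_media_channel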
This media backend uses the gstreamer framework to play and control audio and video files. In order to use this backend the platform must have gstreamer and the required plugins installed. It is a good idea to try to play content with the “gst-launch” application to ensure a proper installation before running gstreamer-backend. This backend also uses Storyboard IO for communication with the Storyboard application, so please ensure Storyboard IO is functional and that the application has the “greio” plugin loaded.
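For example, a quick installation check might look like the following; the exact launcher name (gst-launch or gst-launch-1.0) depends on the gstreamer version installed on your platform, and the media path is a placeholder:

gst-launch-1.0 playbin uri=file:///path/to/video.webm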
Options:
-e : Render the video content with an external buffer
-p pipeline : Use the defined gstreamer pipeline to play the media
-v : Increase verbosity, debug output
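For example, the backend might be started with external buffer rendering and debug output enabled; how the backend is launched at startup varies by platform:

gstreamer-backend -e -v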
The “new.audio” and “new.video” actions take an extra_data argument. This argument is a string that can contain the following options, which must be separated by a “;”.
You can specify the gstreamer pipeline used to play the particular media by either passing it on the command line to gstreamer-backend with the -p option or by passing it to the actions. The pipeline can be passed in as:
“pipeline:[your pipeline]”
This pipeline can be similar to the one used with the gst-launch application, with a few minor modifications. In order to allow the changing of the media file, the first part of the pipeline must contain a named filesrc element as follows:
“pipeline:filesrc location=video.mov name=media-src”
Doing this will allow the code to find the named element and replace the location with a new video file.
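As a sketch of a fuller pipeline string, the elements after the named filesrc below are purely illustrative gst-launch style elements; the appropriate decoder and sink elements depend on the media format, the gstreamer plugins installed and how the video is being composited on your platform:

“pipeline:filesrc location=video.mov name=media-src ! decodebin ! videoconvert ! autovideosink”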
This is a plugin to Storyboard which uses the FFmpeg libraries to play and control audio and video files. In order to use this backend, the plugin must be included with the runtime engine. You can play a video from the Storyboard Designer Simulator, as well as with the Storyboard Engine on supported platforms. Note that at this time, FFmpeg ships with only WebM video format support and Ogg audio format support.
The “new.audio” and “new.video” actions take an extra_data argument. When using FFmpeg, no extra data is required and this argument can be empty.