<chapter id="chapter-playback-components">
<title>Playback Components</title>
<para>
&GStreamer; includes several higher-level components to simplify an
application developer's life. All of the components discussed here (for now) are
targeted at media playback. The idea of each of these components is
to integrate as closely as possible with a &GStreamer; pipeline, but
to hide the complexity of media type detection and several other
rather complex topics that have been discussed in <xref
linkend="part-advanced"/>.
</para>
<para>
We currently recommend that people use either playbin (see <xref
linkend="section-components-playbin"/>) or decodebin (see <xref
linkend="section-components-decodebin"/>), depending on their needs.
Playbin is the recommended solution for everything related to simple
playback of media that should just work. Decodebin is a more flexible
autoplugger that could be used to add more advanced features, such
as playlist support, crossfading of audio tracks, and so on. Its
programming interface is lower-level than that of playbin, though.
</para>
<sect1 id="section-components-playbin">
<title>Playbin</title>
<para>
Playbin is an element that can be created using the standard &GStreamer;
API (e.g. <function>gst_element_factory_make ()</function>). The factory
is conveniently called <quote>playbin</quote>. Because playbin is a
<classname>GstPipeline</classname> (and thus a
<classname>GstElement</classname>), it automatically supports all
of the features of this class, including error handling, tag support,
state handling, getting stream positions, seeking, and so on.
</para>
<para>
Setting up a playbin pipeline is as simple as creating an instance of
the playbin element, setting a file location using the
<quote>uri</quote> property on playbin, and then setting the element
to the <classname>GST_STATE_PLAYING</classname> state (the location has to be a valid
URI, so <quote>&lt;protocol&gt;://&lt;location&gt;</quote>, e.g.
file:///tmp/my.ogg or http://www.example.org/stream.ogg). Internally,
playbin will set up a pipeline to play back the media from the given location.
</para>
<programlisting><!-- example-begin playbin.c a -->
#include &lt;gst/gst.h&gt;
<!-- example-end playbin.c a -->
[.. my_bus_callback goes here ..]<!-- example-begin playbin.c b --><!--
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* remove message from the queue */
return TRUE;
}
--><!-- example-end playbin.c b -->
<!-- example-begin playbin.c c -->
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *play;
GstBus *bus;
/* init GStreamer */
gst_init (&amp;argc, &amp;argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have a URI */
if (argc != 2) {
g_print ("Usage: %s &lt;URI&gt;\n", argv[0]);
return -1;
}
/* set up */
play = gst_element_factory_make ("playbin", "play");
g_object_set (G_OBJECT (play), "uri", argv[1], NULL);
bus = gst_pipeline_get_bus (GST_PIPELINE (play));
gst_bus_add_watch (bus, my_bus_callback, loop);
gst_object_unref (bus);
gst_element_set_state (play, GST_STATE_PLAYING);
/* now run */
g_main_loop_run (loop);
/* also clean up */
gst_element_set_state (play, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (play));
return 0;
}
<!-- example-end playbin.c c --></programlisting>
<para>
Playbin has several features that have been discussed previously
(a short configuration sketch follows the list below):
</para>
<itemizedlist>
<listitem>
<para>
Settable video and audio output (using the <quote>video-sink</quote>
and <quote>audio-sink</quote> properties).
</para>
</listitem>
<listitem>
<para>
Mostly controllable and trackable as a
<classname>GstElement</classname>, including error handling, EOS
handling, tag handling, state handling (through the
<classname>GstBus</classname>), media position handling and
seeking.
</para>
</listitem>
<listitem>
<para>
Buffers network sources, with buffer fullness notifications being
passed through the <classname>GstBus</classname>.
</para>
</listitem>
<listitem>
<para>
Supports visualizations for audio-only media.
</para>
</listitem>
<listitem>
<para>
Supports subtitles, both in the media as well as from separate
files. For separate subtitle files, use the <quote>suburi</quote>
property.
</para>
</listitem>
<listitem>
<para>
Supports stream selection and disabling. If your media has
multiple audio or subtitle tracks, you can dynamically choose
which one to play back, or disable a stream altogether
(which is especially useful for turning off subtitles). For this,
use the <quote>current-audio</quote>, <quote>current-text</quote>
and other related properties.
</para>
</listitem>
</itemizedlist>
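<para>
As an illustration, the following sketch (not a complete program) shows
how some of these properties could be configured on a playbin instance.
The <quote>autovideosink</quote> and <quote>pulsesink</quote> elements and
the file locations are only examples; any compatible sinks and URIs can
be used instead.
</para>
<programlisting>
/* a sketch, not a complete program: configure a few playbin properties */
GstElement *play, *video_sink, *audio_sink;

play = gst_element_factory_make ("playbin", "play");
g_object_set (G_OBJECT (play), "uri", "file:///tmp/my.ogg", NULL);

/* explicitly select the video and audio outputs */
video_sink = gst_element_factory_make ("autovideosink", "vsink");
audio_sink = gst_element_factory_make ("pulsesink", "asink");
g_object_set (G_OBJECT (play),
    "video-sink", video_sink,
    "audio-sink", audio_sink,
    NULL);

/* load subtitles from a separate file */
g_object_set (G_OBJECT (play), "suburi", "file:///tmp/my.srt", NULL);

/* pick the second audio track and deselect the subtitle stream */
g_object_set (G_OBJECT (play), "current-audio", 1, NULL);
g_object_set (G_OBJECT (play), "current-text", -1, NULL);
</programlisting>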
<para>
For convenience, it is possible to test <quote>playbin</quote> on
the commandline, using the command <command>gst-launch-1.0 playbin
uri=file:///path/to/file</command>.
</para>
</sect1>
<sect1 id="section-components-decodebin">
<title>Decodebin</title>
<para>
Decodebin is the actual autoplugger backend of playbin, which was
discussed in the previous section. Decodebin will, in short, accept
input from a source that is linked to its sink pad, try to
detect the media type contained in the stream, and automatically
select and set up decoders for each elementary stream it finds.
For each decoded stream, it will emit the <quote>pad-added</quote>
signal, to let the client know about the newly found decoded stream.
For unknown streams (which might be the whole stream), it will emit
the <quote>unknown-type</quote> signal. The application is then
responsible for reporting the error to the user.
</para>
<programlisting><!-- example-begin decodebin.c a -->
<![CDATA[
#include <gst/gst.h>
]]>
<!-- example-end decodebin.c a -->
[.. my_bus_callback goes here ..]<!-- example-begin decodebin.c b -->
<!--
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* remove message from the queue */
return TRUE;
}
--><!-- example-end decodebin.c b -->
<!-- example-begin decodebin.c c -->
<![CDATA[
GstElement *pipeline, *audio;
static void
cb_newpad (GstElement *decodebin,
GstPad *pad,
gpointer data)
{
GstCaps *caps;
GstStructure *str;
GstPad *audiopad;
/* only link once */
audiopad = gst_element_get_static_pad (audio, "sink");
if (GST_PAD_IS_LINKED (audiopad)) {
g_object_unref (audiopad);
return;
}
/* check media type */
caps = gst_pad_query_caps (pad, NULL);
str = gst_caps_get_structure (caps, 0);
if (!g_strrstr (gst_structure_get_name (str), "audio")) {
gst_caps_unref (caps);
gst_object_unref (audiopad);
return;
}
gst_caps_unref (caps);
/* link'n'play */
gst_pad_link (pad, audiopad);
g_object_unref (audiopad);
}
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *src, *dec, *conv, *sink;
GstPad *audiopad;
GstBus *bus;
/* init GStreamer */
gst_init (&argc, &argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have input */
if (argc != 2) {
g_print ("Usage: %s <filename>\n", argv[0]);
return -1;
}
/* setup */
pipeline = gst_pipeline_new ("pipeline");
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_watch (bus, my_bus_callback, loop);
gst_object_unref (bus);
src = gst_element_factory_make ("filesrc", "source");
g_object_set (G_OBJECT (src), "location", argv[1], NULL);
dec = gst_element_factory_make ("decodebin", "decoder");
g_signal_connect (dec, "pad-added", G_CALLBACK (cb_newpad), NULL);
gst_bin_add_many (GST_BIN (pipeline), src, dec, NULL);
gst_element_link (src, dec);
/* create audio output */
audio = gst_bin_new ("audiobin");
conv = gst_element_factory_make ("audioconvert", "aconv");
audiopad = gst_element_get_static_pad (conv, "sink");
sink = gst_element_factory_make ("alsasink", "sink");
gst_bin_add_many (GST_BIN (audio), conv, sink, NULL);
gst_element_link (conv, sink);
gst_element_add_pad (audio,
gst_ghost_pad_new ("sink", audiopad));
gst_object_unref (audiopad);
gst_bin_add (GST_BIN (pipeline), audio);
/* run */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
g_main_loop_run (loop);
/* cleanup */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
return 0;
}
]]>
<!-- example-end decodebin.c c --></programlisting>
<para>
Decodebin, similar to playbin, supports the following features:
</para>
<itemizedlist>
<listitem>
<para>
Can decode an unlimited number of contained streams to decoded
output pads.
</para>
</listitem>
<listitem>
<para>
Is handled as a <classname>GstElement</classname> in all ways,
including tag or error forwarding and state handling (see the
sketch after this list).
</para>
</listitem>
</itemizedlist>
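<para>
For instance, tags found by the decoders that decodebin plugs end up as
GST_MESSAGE_TAG messages on the pipeline bus. A minimal sketch of a case
that could be added to the switch in
<function>my_bus_callback ()</function> above:
</para>
<programlisting>
<![CDATA[
    case GST_MESSAGE_TAG: {
      GstTagList *tags = NULL;

      /* tags found by decodebin's decoders arrive as bus messages */
      gst_message_parse_tag (message, &tags);
      /* ... inspect the tag list with gst_tag_list_get_string () etc. ... */
      gst_tag_list_unref (tags);
      break;
    }
]]>
</programlisting>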
<para>
Although decodebin is a good autoplugger, there are many
things that it does not do and is not intended to do:
</para>
<itemizedlist>
<listitem>
<para>
Taking care of input streams with a known media type (e.g. a DVD,
an audio CD and the like).
</para>
</listitem>
<listitem>
<para>
Selection of streams (e.g. which audio track to play in case of
multi-language media streams).
</para>
</listitem>
<listitem>
<para>
Overlaying subtitles over a decoded video stream.
</para>
</listitem>
</itemizedlist>
<para>
Decodebin can be easily tested on the commandline, e.g. by using the
command <command>gst-launch-1.0 filesrc location=file.ogg ! decodebin
! audioconvert ! audioresample ! autoaudiosink</command>.
</para>
</sect1>
<sect1 id="section-components-uridecodebin">
<title>URIDecodebin</title>
<para>
The uridecodebin element is very similar to decodebin, except that it
also automatically plugs a source element based on the protocol of the
URI it is given.
</para>
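<para>
For instance, the decodebin example above could be adapted by replacing
the filesrc and decodebin elements with a single uridecodebin. A minimal
sketch, assuming the same <function>cb_newpad ()</function> callback and
pipeline setup as before:
</para>
<programlisting>
/* a sketch: uridecodebin replaces the filesrc + decodebin combination */
GstElement *dec;

dec = gst_element_factory_make ("uridecodebin", "source");
g_object_set (G_OBJECT (dec), "uri", "http://www.example.org/stream.ogg", NULL);
g_signal_connect (dec, "pad-added", G_CALLBACK (cb_newpad), NULL);
gst_bin_add (GST_BIN (pipeline), dec);
</programlisting>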
<para>
Uridecodebin will also automatically insert buffering elements when
the URI points to a slow network source. The buffering element will post
BUFFERING messages that the application needs to handle, as explained
in <xref linkend="chapter-buffering"/>.
The following properties can be used to configure the buffering method
(a short configuration sketch follows the list):
</para>
<itemizedlist>
<listitem>
<para>
The <quote>buffer-size</quote> property allows you to configure a maximum size in
bytes for the buffer element.
</para>
</listitem>
<listitem>
<para>
The <quote>buffer-duration</quote> property allows you to configure a maximum
duration for the buffer element. The duration will be estimated based on
the bitrate of the network.
</para>
</listitem>
<listitem>
<para>
With the <quote>download</quote> property you can enable the download
buffering method as described in <xref linkend="section-buffering-download"/>.
Setting this option to TRUE will only enable download buffering
for selected formats such as QuickTime, Flash video, AVI and
WebM.
</para>
</listitem>
<listitem>
<para>
You can also enable buffering on the parsed/demuxed data with the
<quote>use-buffering</quote> property. This is useful to enable buffering
on slower random-access media such as a network file server.
</para>
</listitem>
</itemizedlist>
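<para>
A minimal configuration sketch, assuming <quote>dec</quote> is a
uridecodebin instance and <quote>pipeline</quote> is the surrounding
pipeline (the buffer limits are arbitrary example values):
</para>
<programlisting>
<![CDATA[
/* keep at most 2 MB or 5 seconds of data buffered and enable the
 * download buffering method */
g_object_set (G_OBJECT (dec),
    "buffer-size", 2 * 1024 * 1024,               /* bytes */
    "buffer-duration", (gint64) (5 * GST_SECOND), /* nanoseconds */
    "download", TRUE,
    NULL);

/* a matching case for the bus callback: pause while buffering and
 * resume when done (see the buffering chapter for a complete strategy) */
    case GST_MESSAGE_BUFFERING: {
      gint percent;

      gst_message_parse_buffering (message, &percent);
      gst_element_set_state (pipeline,
          percent < 100 ? GST_STATE_PAUSED : GST_STATE_PLAYING);
      break;
    }
]]>
</programlisting>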
<para>
URIDecodebin can be easily tested on the commandline, e.g. by using the
command <command>gst-launch-1.0 uridecodebin uri=file:///file.ogg !
audioconvert ! audioresample ! autoaudiosink</command>.
</para>
</sect1>
<sect1 id="section-components-playsink">
<title>Playsink</title>
<para>
The playsink element is a powerful sink element. It has request pads
for raw decoded audio, video and text, and it will configure itself to
play the media streams. It has the following features:
</para>
<itemizedlist>
<listitem>
<para>
It exposes the GstStreamVolume, GstVideoOverlay, GstNavigation and
GstColorBalance interfaces and automatically plugs software
elements to implement them when needed.
</para>
</listitem>
<listitem>
<para>
It will automatically plug conversion elements.
</para>
</listitem>
<listitem>
<para>
Can optionally render visualizations when there is no video input.
</para>
</listitem>
<listitem>
<para>
Configurable sink elements.
</para>
</listitem>
<listitem>
<para>
Configurable audio/video sync offset to fine-tune synchronization
in badly muxed files.
</para>
</listitem>
<listitem>
<para>
Support for taking a snapshot of the last video frame.
</para>
</listitem>
</itemizedlist>
<para>
Below is an example of how you can use playsink. We use a uridecodebin
element to decode the input into raw audio and video streams, which we then
link to the playsink request pads. We only link the first audio and video
pads; you could use an input-selector element to link all pads.
</para>
<programlisting>
<!-- example-begin playsink.c a -->
<![CDATA[
#include <gst/gst.h>
]]>
<!-- example-end playsink.c a -->
[.. my_bus_callback goes here ..]
<!-- example-begin playsink.c b -->
<!--
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* remove message from the queue */
return TRUE;
}
-->
<!-- example-end playsink.c b -->
<!-- example-begin playsink.c c -->
<![CDATA[
GstElement *pipeline, *sink;
static void
cb_pad_added (GstElement *dec,
GstPad *pad,
gpointer data)
{
GstCaps *caps;
GstStructure *str;
const gchar *name;
GstPadTemplate *templ;
GstElementClass *klass;
/* check media type */
caps = gst_pad_query_caps (pad, NULL);
str = gst_caps_get_structure (caps, 0);
name = gst_structure_get_name (str);
klass = GST_ELEMENT_GET_CLASS (sink);
if (g_str_has_prefix (name, "audio")) {
templ = gst_element_class_get_pad_template (klass, "audio_sink");
} else if (g_str_has_prefix (name, "video")) {
templ = gst_element_class_get_pad_template (klass, "video_sink");
} else if (g_str_has_prefix (name, "text")) {
templ = gst_element_class_get_pad_template (klass, "text_sink");
} else {
templ = NULL;
}
/* the caps were only needed to inspect the media type name */
gst_caps_unref (caps);
if (templ) {
GstPad *sinkpad;
sinkpad = gst_element_request_pad (sink, templ, NULL, NULL);
if (!gst_pad_is_linked (sinkpad))
gst_pad_link (pad, sinkpad);
gst_object_unref (sinkpad);
}
}
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *dec;
GstBus *bus;
/* init GStreamer */
gst_init (&argc, &argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have input */
if (argc != 2) {
g_print ("Usage: %s <uri>\n", argv[0]);
return -1;
}
/* setup */
pipeline = gst_pipeline_new ("pipeline");
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_watch (bus, my_bus_callback, loop);
gst_object_unref (bus);
dec = gst_element_factory_make ("uridecodebin", "source");
g_object_set (G_OBJECT (dec), "uri", argv[1], NULL);
g_signal_connect (dec, "pad-added", G_CALLBACK (cb_pad_added), NULL);
/* create audio output */
sink = gst_element_factory_make ("playsink", "sink");
gst_util_set_object_arg (G_OBJECT (sink), "flags",
"soft-colorbalance+soft-volume+vis+text+audio+video");
gst_bin_add_many (GST_BIN (pipeline), dec, sink, NULL);
/* run */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
g_main_loop_run (loop);
/* cleanup */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
return 0;
}
]]>
<!-- example-end playsink.c c -->
</programlisting>
<para>
This example will render audio and/or video depending on the media you
give it. Try it on an audio-only file and you will see that
it shows visualizations. You can change the visualization at runtime by
changing the <quote>vis-plugin</quote> property, as in the sketch below.
</para>
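<para>
A minimal sketch of switching the visualization at runtime, assuming
<quote>sink</quote> is the playsink instance from the example above (the
<quote>goom</quote> element is just one possible visualization):
</para>
<programlisting>
/* replace the current visualization with the goom element */
GstElement *vis = gst_element_factory_make ("goom", NULL);
g_object_set (G_OBJECT (sink), "vis-plugin", vis, NULL);
</programlisting>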
</sect1>
</chapter>