| <chapter id="chapter-helloworld"> |
| <title>Your first application</title> |
| <para> |
| This chapter will summarize everything you've learned in the previous |
| chapters. It describes all aspects of a simple &GStreamer; application, |
| including initializing libraries, creating elements, packing elements |
| together in a pipeline and playing this pipeline. By doing all this, |
| you will be able to build a simple Ogg/Vorbis audio player. |
| </para> |
| |
| <sect1 id="section-helloworld"> |
| <title>Hello world</title> |
| <para> |
| We're going to create a simple first application: a command-line |
| Ogg/Vorbis audio player. For this, we will use only standard |
| &GStreamer; components. The player will read a file specified on |
| the command-line. Let's get started! |
| </para> |
| <para> |
| We've learned, in <xref linkend="chapter-init"/>, that the first thing |
| to do in your application is to initialize &GStreamer; by calling |
| <function>gst_init ()</function>. Also, make sure that the application |
| includes <filename>gst/gst.h</filename> so all function names and |
| objects are properly defined. Use <function>#include |
| <gst/gst.h></function> to do that. |
| </para> |
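| <para> |
| As a minimal sketch, the skeleton of such an application looks like |
| this; the element and pipeline code developed in the rest of this |
| chapter goes where the comment is: |
| </para> |
| <programlisting> |
| #include <gst/gst.h> |
| |
| int |
| main (int argc, char *argv[]) |
| { |
|   /* initialize the GStreamer library, letting it parse its own |
|      command-line options */ |
|   gst_init (&argc, &argv); |
| |
|   /* ... create elements and build and run the pipeline here ... */ |
| |
|   return 0; |
| } |
| </programlisting> |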
| <para> |
| Next, you'll want to create the different elements using |
| <function>gst_element_factory_make ()</function>. For an Ogg/Vorbis |
| audio player, we'll need a source element that reads files from a |
| disk. &GStreamer; includes this element under the name |
| <quote>filesrc</quote>. Next, we'll need something to parse the |
| file and decode it into raw audio. &GStreamer; has two elements |
| for this: the first parses Ogg streams into elementary streams (video, |
| audio) and is called <quote>oggdemux</quote>. The second is a Vorbis |
| audio decoder; it's conveniently called <quote>vorbisdec</quote>. |
| Since <quote>oggdemux</quote> creates dynamic pads for each elementary |
| stream, you'll need to set a <quote>pad-added</quote> signal handler |
| on the <quote>oggdemux</quote> element, as you learned in |
| <xref linkend="section-pads-dynamic"/>, to link the Ogg demuxer and |
| the Vorbis decoder elements together. We'll also insert an |
| <quote>audioconvert</quote> element to convert the decoded audio into |
| a format the audio output can handle. Finally, we'll need an |
| audio output element; we will use <quote>autoaudiosink</quote>, which |
| automatically detects your audio device. |
| </para> |
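| <para> |
| As a sketch, creating these elements and hooking up the dynamic pad |
| looks like the fragment below; the element names and the |
| <function>on_pad_added ()</function> callback are the ones used in the |
| complete listing later in this section: |
| </para> |
| <programlisting> |
| GstElement *source, *demuxer, *decoder, *conv, *sink; |
| |
| source  = gst_element_factory_make ("filesrc", "file-source"); |
| demuxer = gst_element_factory_make ("oggdemux", "ogg-demuxer"); |
| decoder = gst_element_factory_make ("vorbisdec", "vorbis-decoder"); |
| conv    = gst_element_factory_make ("audioconvert", "converter"); |
| sink    = gst_element_factory_make ("autoaudiosink", "audio-output"); |
| |
| /* the demuxer's source pads only appear at run time, so we link the |
|    demuxer and the decoder from a "pad-added" signal handler */ |
| g_signal_connect (demuxer, "pad-added", G_CALLBACK (on_pad_added), decoder); |
| </programlisting> |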
| <para> |
| The last thing left to do is to add all elements into a container |
| element, a <classname>GstPipeline</classname>, and run this |
| pipeline until we've played the whole song. We've previously |
| learned how to add elements to a container bin in <xref |
| linkend="chapter-bins"/>, and we've learned about element states |
| in <xref linkend="section-elements-states"/>. We will also attach |
| a message handler to the pipeline bus so we can retrieve errors |
| and detect the end-of-stream. |
| </para> |
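| <para> |
| As a quick sketch, attaching the bus watch and starting playback |
| boils down to the following; <function>bus_call ()</function> and the |
| <classname>GMainLoop</classname> it quits are defined in the complete |
| program below: |
| </para> |
| <programlisting> |
| GstBus *bus; |
| |
| /* deliver bus messages (errors, end-of-stream, ...) to bus_call () |
|    while the GLib main loop is running */ |
| bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline)); |
| gst_bus_add_watch (bus, bus_call, loop); |
| gst_object_unref (bus); |
| |
| /* once everything is added and linked, start playing */ |
| gst_element_set_state (pipeline, GST_STATE_PLAYING); |
| </programlisting> |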
| <para> |
| Let's now put all the code together to get our very first audio |
| player: |
| </para> |
| <programlisting> |
| <!-- example-begin helloworld.c --> |
| #include <gst/gst.h> |
| #include <glib.h> |
| |
| |
| static gboolean |
| bus_call (GstBus *bus, |
| GstMessage *msg, |
| gpointer data) |
| { |
| GMainLoop *loop = (GMainLoop *) data; |
| |
| switch (GST_MESSAGE_TYPE (msg)) { |
| |
| case GST_MESSAGE_EOS: |
| g_print ("End of stream\n"); |
| g_main_loop_quit (loop); |
| break; |
| |
| case GST_MESSAGE_ERROR: { |
| gchar *debug; |
| GError *error; |
| |
| gst_message_parse_error (msg, &error, &debug); |
| g_free (debug); |
| |
| g_printerr ("Error: %s\n", error->message); |
| g_error_free (error); |
| |
| g_main_loop_quit (loop); |
| break; |
| } |
| default: |
| break; |
| } |
| |
| return TRUE; |
| } |
| |
| |
| static void |
| on_pad_added (GstElement *element, |
| GstPad *pad, |
| gpointer data) |
| { |
| GstPad *sinkpad; |
| GstElement *decoder = (GstElement *) data; |
| |
| /* We can now link this pad with the vorbis-decoder sink pad */ |
| g_print ("Dynamic pad created, linking demuxer/decoder\n"); |
| |
| sinkpad = gst_element_get_static_pad (decoder, "sink"); |
| |
| gst_pad_link (pad, sinkpad); |
| |
| gst_object_unref (sinkpad); |
| } |
| |
| |
| |
| int |
| main (int argc, |
| char *argv[]) |
| { |
| GMainLoop *loop; |
| |
| GstElement *pipeline, *source, *demuxer, *decoder, *conv, *sink; |
| GstBus *bus; |
| |
| /* Initialisation */ |
| gst_init (&argc, &argv); |
| |
| loop = g_main_loop_new (NULL, FALSE); |
| |
| |
| /* Check input arguments */ |
| if (argc != 2) { |
| g_printerr ("Usage: %s <Ogg/Vorbis filename>\n", argv[0]); |
| return -1; |
| } |
| |
| |
| /* Create gstreamer elements */ |
| pipeline = gst_pipeline_new ("audio-player"); |
| source = gst_element_factory_make ("filesrc", "file-source"); |
| demuxer = gst_element_factory_make ("oggdemux", "ogg-demuxer"); |
| decoder = gst_element_factory_make ("vorbisdec", "vorbis-decoder"); |
| conv = gst_element_factory_make ("audioconvert", "converter"); |
| sink = gst_element_factory_make ("autoaudiosink", "audio-output"); |
| |
| if (!pipeline || !source || !demuxer || !decoder || !conv || !sink) { |
| g_printerr ("One element could not be created. Exiting.\n"); |
| return -1; |
| } |
| |
| /* Set up the pipeline */ |
| |
| /* we set the input filename to the source element */ |
| g_object_set (G_OBJECT (source), "location", argv[1], NULL); |
| |
| /* we add a message handler */ |
| bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline)); |
| gst_bus_add_watch (bus, bus_call, loop); |
| gst_object_unref (bus); |
| |
| /* we add all elements into the pipeline */ |
| /* file-source | ogg-demuxer | vorbis-decoder | converter | audio-output */ |
| gst_bin_add_many (GST_BIN (pipeline), |
| source, demuxer, decoder, conv, sink, NULL); |
| |
| /* we link the elements together */ |
| /* file-source -> ogg-demuxer ~> vorbis-decoder -> converter -> audio-output */ |
| gst_element_link (source, demuxer); |
| gst_element_link_many (decoder, conv, sink, NULL); |
| g_signal_connect (demuxer, "pad-added", G_CALLBACK (on_pad_added), decoder); |
| |
| /* note that the demuxer will be linked to the decoder dynamically. |
| The reason is that Ogg may contain various streams (for example |
| audio and video). The source pad(s) will be created at run time |
| by the demuxer when it detects the number and nature of the streams. |
| Therefore we connect a callback function which will be executed |
| when the "pad-added" signal is emitted. */ |
| |
| |
| /* Set the pipeline to "playing" state */ |
| g_print ("Now playing: %s\n", argv[1]); |
| gst_element_set_state (pipeline, GST_STATE_PLAYING); |
| |
| |
| /* Iterate */ |
| g_print ("Running...\n"); |
| g_main_loop_run (loop); |
| |
| |
| /* Out of the main loop, clean up nicely */ |
| g_print ("Returned, stopping playback\n"); |
| gst_element_set_state (pipeline, GST_STATE_NULL); |
| |
| g_print ("Deleting pipeline\n"); |
| gst_object_unref (GST_OBJECT (pipeline)); |
| |
| return 0; |
| } |
| <!-- example-end helloworld.c --> |
| </programlisting> |
| <para> |
| We have now created a complete pipeline. We can visualise the |
| pipeline as follows: |
| </para> |
| |
| <figure float="1" id="section-hello-img"> |
| <title>The "hello world" pipeline</title> |
| <mediaobject> |
| <imageobject> |
| <imagedata scale="75" fileref="images/hello-world.&image;" format="&IMAGE;" /> |
| </imageobject> |
| </mediaobject> |
| </figure> |
| |
| </sect1> |
| |
| <sect1 id="section-helloworld-compilerun"> |
| <title>Compiling and Running helloworld.c</title> |
| <para> |
| To compile the helloworld example, use: <command>gcc -Wall |
| helloworld.c -o helloworld |
| $(pkg-config --cflags --libs gstreamer-&GST_MAJORMINOR;)</command>. |
| &GStreamer; makes use of <command>pkg-config</command> to get compiler |
| and linker flags needed to compile this application. |
| </para> |
| <para> |
| If you're running a non-standard installation (i.e. you've installed |
| &GStreamer; from source yourself instead of using pre-built packages), |
| make sure the <classname>PKG_CONFIG_PATH</classname> environment variable |
| is set to the correct location (<filename>$libdir/pkgconfig</filename>). |
| </para> |
| <para> |
| In the unlikely case that you are using an uninstalled &GStreamer; |
| setup (i.e. gst-uninstalled), you will need to use libtool to build the |
| hello world program, like this: <command>libtool --mode=link gcc -Wall |
| helloworld.c -o helloworld |
| $(pkg-config --cflags --libs gstreamer-&GST_MAJORMINOR;)</command>. |
| </para> |
| <para> |
| You can run this example application with <command>./helloworld |
| file.ogg</command>. Substitute <filename>file.ogg</filename> |
| with your favourite Ogg/Vorbis file. |
| </para> |
| </sect1> |
| |
| <sect1 id="section-hello-world-conclusion"> |
| <title>Conclusion</title> |
| <para> |
| This concludes our first example. As you see, setting up a pipeline |
| is very low-level but powerful. You will see later in this manual how |
| you can create a more powerful media player with even less effort |
| using higher-level interfaces. We will discuss all that in <xref |
| linkend="part-highlevel"/>. First, however, we will go more in-depth |
| into advanced &GStreamer; internals. |
| </para> |
| <para> |
| It should be clear from the example that we can very easily replace |
| the <quote>filesrc</quote> element with some other element that |
| reads data from a network, or some other data source element that |
| is better integrated with your desktop environment. Also, you can |
| use other decoders and parsers/demuxers to support other media types. You |
| can use another audio sink if you're not running Linux, but Mac OS X, |
| Windows or FreeBSD, or you can use a filesink to write audio |
| files to disk instead of playing them back. By using an audio card |
| source, you can even do audio capture instead of playback. All this |
| shows the reusability of &GStreamer; elements, which is its greatest |
| advantage. |
| </para> |
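| <para> |
| As a small illustration of that flexibility, the sketch below swaps |
| the audio sink of our example for a <quote>filesink</quote> element, |
| so the converted audio is written to disk instead of being played |
| back (the <filename>output.raw</filename> name is just an arbitrary |
| example): |
| </para> |
| <programlisting> |
| /* illustration only: this replaces the autoaudiosink creation in the |
|    example above; everything else stays the same */ |
| sink = gst_element_factory_make ("filesink", "file-output"); |
| g_object_set (G_OBJECT (sink), "location", "output.raw", NULL); |
| </programlisting> |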
| </sect1> |
| </chapter> |