
Python GStreamer Tutorial

Jens Persson
Version 0.10.3
February 13, 2009
Abstract
This tutorial aims at giving a brief introduction to the GStreamer multimedia framework.

Table of Contents
1. Introduction
2. Playbin
3. Pipeline
4. Src, sink, pad ... oh my!
5. Seeking
6. Capabilities
7. Videomixer
8. Webcam Viewer
A. ChangeLog
B. Third Party Tutorials

1. Introduction
This tutorial is meant to be a quick way to get to know more about GStreamer, but it will take some time to write because we don't know it
ourselves ... yet. We usually use GNU/Linux and GTK in the examples, but we try to keep the GUI code to an absolute minimum so it should not
get in the way. Just remember that GStreamer depends heavily on GLib, so you must make sure that the GLib main loop is running if you want to catch
messages on the bus. We take for granted that you are at least a fairly decent Python coder. For problems related to the Python language we redirect you
over to here. :D
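As a rough illustration of that last point (a minimal sketch, not one of the numbered examples; the element names are just common test elements and may differ on your system), this is what "keeping the GLib main loop running" looks like in practice:

import gobject
import pygst
pygst.require("0.10")
import gst

# build a trivial pipeline; parse_launch() is explained later in the tutorial
pipeline = gst.parse_launch("audiotestsrc ! autoaudiosink")

def on_message(bus, message):
    # bus callbacks are only delivered while a GLib main loop is running
    if message.type == gst.MESSAGE_ERROR:
        print message.parse_error()
        loop.quit()

bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", on_message)

pipeline.set_state(gst.STATE_PLAYING)
loop = gobject.MainLoop()
loop.run()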
As you may see this tutorial is far from done and we are always looking for new people to join this project. If you want to write a chapter, just do it and
submit it to one of the people with commit access to the source. You find us here. TIA
There is also some example code distributed with the PyGST source which you may browse here.
GStreamer comes with a few handy CLI programs that help you find and try out what you need in a fast and convenient way. With "gst-inspect"
you can list the elements which are shipped with the various plugin packages.
$ man gst-inspect-0.10

If you are looking for an element but you don't know its name you can combine it with grep. Getting the elements that handle e.g. MP3 is done like this:
$ gst-inspect-0.10 | grep mp3

Playbin is an autoplugger element which usually plays anything you throw at it, if you have the appropriate plugins installed.
$ gst-inspect-0.10 playbin

You can also run pipelines directly in a terminal with "gst-launch":


$ man gst-launch-0.10

For playing a file with playbin:


$ gst-launch-0.10 playbin uri=file:///path/to/file.mkv

It's also possible to link elements together with "!":
$ gst-launch-0.10 audiotestsrc ! alsasink

You may also build several independent streams in the same pipeline:


$ gst-launch-0.10 audiotestsrc ! alsasink videotestsrc ! xvimagesink

If you set the "name" property you can link to the same element more than once. Just put a "." after its name, e.g. with the oggmux here.
$ gst-launch-0.10 audiotestsrc ! vorbisenc ! oggmux name=mux ! filesink location=file.ogg videotestsrc ! theoraenc ! mux.
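As a side note (a rough sketch, not one of the numbered examples), the same pipeline description can be handed to gst.parse_launch() from Python, which is covered in chapter 4; this would record roughly five seconds of test audio and video to file.ogg:

import gobject
import pygst
pygst.require("0.10")
import gst

# the same description as the gst-launch line above, handed to parse_launch()
pipeline = gst.parse_launch(
    "audiotestsrc ! vorbisenc ! oggmux name=mux ! filesink location=file.ogg "
    "videotestsrc ! theoraenc ! mux.")
pipeline.set_state(gst.STATE_PLAYING)

loop = gobject.MainLoop()
gobject.timeout_add_seconds(5, loop.quit)  # stop after about five seconds
loop.run()
# a real recorder would send an EOS event and wait for it before shutting down
pipeline.set_state(gst.STATE_NULL)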

In the next chapter we will show you more examples with Playbin.

2. Playbin
A playbin is a high-level, automatic audio and video player. You create a playbin object with:
my_playbin = gst.element_factory_make("playbin2", "my-playbin")

To get information about a playbin run:


$ gst-inspect-0.10 playbin2

This figure shows how playbin is built internally. The "optional stuff" is anything that could be platform specific or that you may set with
properties.

                                      playbin2
         ____________________________________________________________________
        |                        ________________     _______________        |
        |                       |                |   |               |       |
        |                   ->--| optional stuff |->-| autoaudiosink |->-----|->- Audio Output
        |   ______________    / |________________|   |_______________|       |
        |  |              |---                                                |
uri ->--|->| uridecodebin |                                                   |
        |  |______________|---                                                |
        |                     \  ________________     _______________        |
        |                   ->--| optional stuff |->-| autovideosink |->-----|->- Video Output
        |                       |________________|   |_______________|       |
        |____________________________________________________________________|

The "uri" property should take any possible protocol supported by your Gstreamer plugins. One nice feature is that you may switch the sinks out for
your own bins as shown below. Playbin always tries to set up the best possible pipeline for your specific environment so if you don't need any special
features that are not implemented in playbin, it should in most cases just work "out of the box". Ok, time for a few examples.
This first example is just a simple audio player, insert a file with absolute path and it'll play.
Example 2.1

1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Audio-Player")
14 window.set_default_size(300, -1)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.entry = gtk.Entry()
19 vbox.pack_start(self.entry, False, True)
20 self.button = gtk.Button("Start")
21 self.button.connect("clicked", self.start_stop)
22 vbox.add(self.button)
23 window.show_all()
24
25 self.player = gst.element_factory_make("playbin2", "player")
26 fakesink = gst.element_factory_make("fakesink", "fakesink")
27 self.player.set_property("video-sink", fakesink)
28 bus = self.player.get_bus()
29 bus.add_signal_watch()
30 bus.connect("message", self.on_message)
31
32 def start_stop(self, w):
33 if self.button.get_label() == "Start":
34 filepath = self.entry.get_text()
35 if os.path.isfile(filepath):
36 self.button.set_label("Stop")
37 self.player.set_property("uri", "file://" + filepath)
38 self.player.set_state(gst.STATE_PLAYING)
39 else:
40 self.player.set_state(gst.STATE_NULL)
41 self.button.set_label("Start")
42
43 def on_message(self, bus, message):
44 t = message.type
45 if t == gst.MESSAGE_EOS:
46 self.player.set_state(gst.STATE_NULL)
47 self.button.set_label("Start")
48 elif t == gst.MESSAGE_ERROR:
49 self.player.set_state(gst.STATE_NULL)
50 err, debug = message.parse_error()
51 print "Error: %s" % err, debug
52 self.button.set_label("Start")
53
54 GTK_Main()
55 gtk.gdk.threads_init()
56 gtk.main()

A playbin plugs both audio and video streams automagically, so I've switched the video sink out for a fakesink element, which is GStreamer's answer to
/dev/null. If you want to enable video playback just comment out the following lines:
fakesink = gst.element_factory_make("fakesink", "my-fakesink")
self.player.set_property("video-sink", fakesink)

If you want to show the video output in a specified window you'll have to use the enable_sync_message_emission() method on the bus. Here is an
example with the video window embedded in the program.
Example 2.2
1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Video-Player")
14 window.set_default_size(500, 400)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 hbox = gtk.HBox()
19 vbox.pack_start(hbox, False)
20 self.entry = gtk.Entry()
21 hbox.add(self.entry)
22 self.button = gtk.Button("Start")
23 hbox.pack_start(self.button, False)
24 self.button.connect("clicked", self.start_stop)
25 self.movie_window = gtk.DrawingArea()
26 vbox.add(self.movie_window)
27 window.show_all()
28
29 self.player = gst.element_factory_make("playbin2", "player")
30 bus = self.player.get_bus()
31 bus.add_signal_watch()
32 bus.enable_sync_message_emission()
33 bus.connect("message", self.on_message)
34 bus.connect("sync-message::element", self.on_sync_message)
35
36 def start_stop(self, w):
37 if self.button.get_label() == "Start":
38 filepath = self.entry.get_text()
39 if os.path.isfile(filepath):
40 self.button.set_label("Stop")
41 self.player.set_property("uri", "file://" + filepath)
42 self.player.set_state(gst.STATE_PLAYING)
43 else:
44 self.player.set_state(gst.STATE_NULL)
45 self.button.set_label("Start")
46
47 def on_message(self, bus, message):
48 t = message.type
49 if t == gst.MESSAGE_EOS:
50 self.player.set_state(gst.STATE_NULL)
51 self.button.set_label("Start")
52 elif t == gst.MESSAGE_ERROR:
53 self.player.set_state(gst.STATE_NULL)
54 err, debug = message.parse_error()
55 print "Error: %s" % err, debug
56 self.button.set_label("Start")
57
58 def on_sync_message(self, bus, message):
59 if message.structure is None:
60 return
61 message_name = message.structure.get_name()
62 if message_name == "prepare-xwindow-id":
63 imagesink = message.src
64 imagesink.set_property("force-aspect-ratio", True)
65 gtk.gdk.threads_enter()
66 imagesink.set_xwindow_id(self.movie_window.window.xid)
67 gtk.gdk.threads_leave()
68
69 GTK_Main()
70 gtk.gdk.threads_init()
71 gtk.main()

And just to make things a little more complicated, you can switch the playbin's video sink to a gst.Bin with a gst.GhostPad on it. Here's an example with
a timeoverlay.
bin = gst.Bin("my-bin")
timeoverlay = gst.element_factory_make("timeoverlay")
bin.add(timeoverlay)
pad = timeoverlay.get_pad("video_sink")
ghostpad = gst.GhostPad("sink", pad)
bin.add_pad(ghostpad)
videosink = gst.element_factory_make("autovideosink")
bin.add(videosink)
gst.element_link_many(timeoverlay, videosink)
self.player.set_property("video-sink", bin)

Add that code to the example above and you'll get a timeoverlay too. We'll talk more about ghost pads later.
By popular request we add a CLI example which plays music; just run it with:
$ python cliplayer.py /path/to/file1.mp3 /path/to/file2.ogg

Example 2.3
1 #!/usr/bin/env python
2
3 import sys, os, time, thread
4 import glib, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class CLI_Main:
10
11 def __init__(self):
12 self.player = gst.element_factory_make("playbin2", "player")
13 fakesink = gst.element_factory_make("fakesink", "fakesink")
14 self.player.set_property("video-sink", fakesink)
15 bus = self.player.get_bus()
16 bus.add_signal_watch()
17 bus.connect("message", self.on_message)
18
19 def on_message(self, bus, message):
20 t = message.type
21 if t == gst.MESSAGE_EOS:
22 self.player.set_state(gst.STATE_NULL)
23 self.playmode = False
24 elif t == gst.MESSAGE_ERROR:
25 self.player.set_state(gst.STATE_NULL)
26 err, debug = message.parse_error()
27 print "Error: %s" % err, debug
28 self.playmode = False
29
30 def start(self):
31 for filepath in sys.argv[1:]:
32 if os.path.isfile(filepath):
33 self.playmode = True
34 self.player.set_property("uri", "file://" + filepath)
35 self.player.set_state(gst.STATE_PLAYING)
36 while self.playmode:
37 time.sleep(1)
38
39 time.sleep(1)
40 loop.quit()
41
42 mainclass = CLI_Main()
43 thread.start_new_thread(mainclass.start, ())
44 gobject.threads_init()
45 loop = glib.MainLoop()
46 loop.run()

A playbin implements a gst.Pipeline element and that's what the next chapter is going to tell you more about.

3. Pipeline
A gst.Pipeline is a toplevel bin with its own bus and clock. If your program only contains one bin-like object, this is what you're looking for. You create
a pipeline object with:
my_pipeline = gst.Pipeline("my-pipeline")

A pipeline is just a "container" where you can put other objects. When everything is in place and the file to play is specified, you just set the
pipeline's state to gst.STATE_PLAYING and there should be multimedia coming out of it.
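As a minimal sketch of that idea (test elements only, so no file is involved, and inside a program that keeps a GLib main loop running like the examples below), building and starting a pipeline by hand looks roughly like this:

pipeline = gst.Pipeline("tone")
src = gst.element_factory_make("audiotestsrc", "src")
sink = gst.element_factory_make("autoaudiosink", "sink")
pipeline.add(src, sink)                # put the elements into the container
gst.element_link_many(src, sink)       # wire them together
pipeline.set_state(gst.STATE_PLAYING)  # and a test tone comes out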
In this first example I have taken the Audio-Player from the Playbin chapter and switched the playbin out for my own MP3-decoding pipeline.
You can also test-drive pipelines with a program called gst-launch directly in a shell. For example, the pipeline built in the code below would look like this:
$ gst-launch-0.10 filesrc location=file.mp3 ! mad ! audioconvert ! alsasink

or ASCII style:
gst.Pipeline
_____________________________________________________________________
| |
| _________ _______ ______________ __________ |
| | | | | | | | | |
file.mp3 ->-|---| filesrc |-->--| mad |-->--| audioconvert |-->--| alsasink |---|->- Audio Output
| |_________| |_______| |______________| |__________| |
| |
|_____________________________________________________________________|

and the source:


Example 3.1
1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("MP3-Player")
14 window.set_default_size(400, 200)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.entry = gtk.Entry()
19 vbox.pack_start(self.entry, False, True)
20 self.button = gtk.Button("Start")
21 self.button.connect("clicked", self.start_stop)
22 vbox.add(self.button)
23 window.show_all()
24
25 self.player = gst.Pipeline("player")
26 source = gst.element_factory_make("filesrc", "file-source")
27 decoder = gst.element_factory_make("mad", "mp3-decoder")
28 conv = gst.element_factory_make("audioconvert", "converter")
29 sink = gst.element_factory_make("alsasink", "alsa-output")
30
31 self.player.add(source, decoder, conv, sink)
32 gst.element_link_many(source, decoder, conv, sink)
33
34 bus = self.player.get_bus()
35 bus.add_signal_watch()
36 bus.connect("message", self.on_message)
37
38 def start_stop(self, w):
39 if self.button.get_label() == "Start":
40 filepath = self.entry.get_text()
41 if os.path.isfile(filepath):
42 self.button.set_label("Stop")
43 self.player.get_by_name("file-source").set_property("location", filepath)
44 self.player.set_state(gst.STATE_PLAYING)
45 else:
46 self.player.set_state(gst.STATE_NULL)
47 self.button.set_label("Start")
48
49 def on_message(self, bus, message):
50 t = message.type
51 if t == gst.MESSAGE_EOS:
52 self.player.set_state(gst.STATE_NULL)
53 self.button.set_label("Start")
54 elif t == gst.MESSAGE_ERROR:
55 self.player.set_state(gst.STATE_NULL)
56 self.button.set_label("Start")
57 err, debug = message.parse_error()
58 print "Error: %s" % err, debug
59
60 GTK_Main()
61 gtk.gdk.threads_init()
62 gtk.main()

The next example plays MPEG-2 videos. Some demuxers, such as mpegdemux, use dynamic pads which are created at runtime, and therefore you
can't link the demuxer to the next element in the pipeline before those pads exist. Watch out for the demuxer_callback()
method below.
THIS EXAMPLE IS NOT WORKING YET!!! You may submit a solution for it and we will announce a winner that gets, at your option, a date with
Richard M Stallman, Eric S Raymond or Scarlett Johansson. And before anyone asks, NO, you may only choose ONE of the above choices! TIA
UPDATE! The competition is over. Mike Auty fixed it with a few queues. He passed on the grand prize though saying he's too busy coding so no time
for dating. :D

Example 3.2
                                                           gst.Pipeline
              _________________________________________________________________________________________________________
             |                                   _______     _____     ______________     _______________              |
             |                                  |       |   |     |   |              |   |               |             |
             |                              ->--| queue |->-| mad |->-| audioconvert |->-| autoaudiosink |->-----------|->- Audio Output
             |   _________     ___________    / |_______|   |_____|   |______________|   |_______________|             |
             |  |         |   |           |---                                                                         |
file.mpg ->--|->| filesrc |->-| mpegdemux |                                                                            |
             |  |_________|   |___________|---                                                                         |
             |                                 \  _______     __________     __________________     _______________    |
             |                                  |         |  |          |   |                  |   |               |   |
             |                              ->--| queue |->-| mpeg2dec |->-| ffmpegcolorspace |->-| autovideosink |->--|->- Video Output
             |                                  |_______|   |__________|   |__________________|   |_______________|    |
             |_________________________________________________________________________________________________________|

1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Mpeg2-Player")
14 window.set_default_size(500, 400)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 hbox = gtk.HBox()
19 vbox.pack_start(hbox, False)
20 self.entry = gtk.Entry()
21 hbox.add(self.entry)
22 self.button = gtk.Button("Start")
23 hbox.pack_start(self.button, False)
24 self.button.connect("clicked", self.start_stop)
25 self.movie_window = gtk.DrawingArea()
26 vbox.add(self.movie_window)
27 window.show_all()
28
29 self.player = gst.Pipeline("player")
30 source = gst.element_factory_make("filesrc", "file-source")
31 demuxer = gst.element_factory_make("mpegdemux", "demuxer")
32 demuxer.connect("pad-added", self.demuxer_callback)
33 self.video_decoder = gst.element_factory_make("mpeg2dec", "video-decoder")
34 self.audio_decoder = gst.element_factory_make("mad", "audio-decoder")
35 audioconv = gst.element_factory_make("audioconvert", "converter")
36 audiosink = gst.element_factory_make("autoaudiosink", "audio-output")
37 videosink = gst.element_factory_make("autovideosink", "video-output")
38 self.queuea = gst.element_factory_make("queue", "queuea")
39 self.queuev = gst.element_factory_make("queue", "queuev")
40 colorspace = gst.element_factory_make("ffmpegcolorspace", "colorspace")
41
42 self.player.add(source, demuxer, self.video_decoder, self.audio_decoder, audioconv,
43 audiosink, videosink, self.queuea, self.queuev, colorspace)
44 gst.element_link_many(source, demuxer)
45 gst.element_link_many(self.queuev, self.video_decoder, colorspace, videosink)
46 gst.element_link_many(self.queuea, self.audio_decoder, audioconv, audiosink)
47
48 bus = self.player.get_bus()
49 bus.add_signal_watch()
50 bus.enable_sync_message_emission()
51 bus.connect("message", self.on_message)
52 bus.connect("sync-message::element", self.on_sync_message)
53
54 def start_stop(self, w):
55 if self.button.get_label() == "Start":
56 filepath = self.entry.get_text()
57 if os.path.isfile(filepath):
58 self.button.set_label("Stop")
59 self.player.get_by_name("file-source").set_property("location", filepath)
60 self.player.set_state(gst.STATE_PLAYING)
61 else:
62 self.player.set_state(gst.STATE_NULL)
63 self.button.set_label("Start")
64
65 def on_message(self, bus, message):
66 t = message.type
67 if t == gst.MESSAGE_EOS:
68 self.player.set_state(gst.STATE_NULL)
69 self.button.set_label("Start")
70 elif t == gst.MESSAGE_ERROR:
71 err, debug = message.parse_error()
72 print "Error: %s" % err, debug
73 self.player.set_state(gst.STATE_NULL)
74 self.button.set_label("Start")
75
76 def on_sync_message(self, bus, message):
77 if message.structure is None:
78 return
79 message_name = message.structure.get_name()
80 if message_name == "prepare-xwindow-id":
81 imagesink = message.src
82 imagesink.set_property("force-aspect-ratio", True)
83 gtk.gdk.threads_enter()
84 imagesink.set_xwindow_id(self.movie_window.window.xid)
85 gtk.gdk.threads_leave()
86
87 def demuxer_callback(self, demuxer, pad):
88 if pad.get_property("template").name_template == "video_%02d":
89 qv_pad = self.queuev.get_pad("sink")
90 pad.link(qv_pad)
91 elif pad.get_property("template").name_template == "audio_%02d":
92 qa_pad = self.queuea.get_pad("sink")
93 pad.link(qa_pad)
94
95 GTK_Main()
96 gtk.gdk.threads_init()
97 gtk.main()

The elements in a pipeline connect to each other with pads, and that's what the next chapter will tell you more about.

4. Src, sink, pad ... oh my!
Hehe, now this isn't as complicated as it may seem at first glance. A src is an object that is "sending" data and a sink is an object that is "receiving"
data. These objects connect to each other with pads. Pads can be either "src" or "sink". Most elements have both src and sink pads. For example, a mad element
looks something like the ASCII figure below:
                                   mad
     __________________________________________________________________
    |                                                                  |
    |   _______________     _____________________     ______________   |
    |  |               |   |                     |   |              |  |
--->---| pad name=sink |-->| internal stuff here |-->| pad name=src |--->---
    |  |_______________|   |_____________________|   |______________|  |
    |                                                                  |
    |__________________________________________________________________|

And as always, if you want to know more about an element, gst-inspect is your friend:
$ gst-inspect-0.10 mad

There are many different ways to link elements together. In example 3.1 we used the element_link_many() function. You can also make a complete, ready-to-go
pipeline with the parse_launch() function. The pipeline from example 3.1 would be done like this:
mp3_pipeline = gst.parse_launch("filesrc name=source ! mad name=decoder ! audioconvert name=conv ! alsasink name=sink")
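A parsed pipeline like this one still needs its filesrc location set before it can play; a small sketch (the path is just a placeholder):

source = mp3_pipeline.get_by_name("source")
source.set_property("location", "/path/to/file.mp3")  # placeholder path
mp3_pipeline.set_state(gst.STATE_PLAYING)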

You can of course also link pads manually with the link() method; a minimal sketch of that is shown after the figure below. Just make sure that you try to link a src pad to a sink pad. No rule though without
exceptions. :D A gst.GhostPad should be linked to a pad of the same kind as itself. We have already shown how a ghost pad works in the addition to
example 2.2. A gst.Bin can't link to other objects if you don't link a gst.GhostPad to an element inside the bin. In ASCII style, example 2.2 should look
something like this:

                                            gst.Bin
     _________________________________________________________________________________________
    |                                                                                         |
    |                                               timeoverlay                               |
    |                    ____________________________________________________________        |
    |   __________      |  _______________     _____________________     ______________ |    |
    |  |          |     | |               |   |                     |   |              || |   |
--->---| ghostpad |-->----| pad name=sink |---| internal stuff here |---| pad name=src |------>--
    |  |__________|     | |_______________|   |_____________________|   |______________|| |   |
    |                   |____________________________________________________________|       |
    |                                                                                         |
    |_________________________________________________________________________________________|

And the ghostpad above should be created as type "sink"!!!
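As the promised sketch of linking pads by hand (the element names decoder and conv are assumed from example 3.1 and serve purely as an illustration):

# grab the pads by name and link src to sink
src_pad = decoder.get_pad("src")
sink_pad = conv.get_pad("sink")
src_pad.link(sink_pad)  # returns gst.PAD_LINK_OK on success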


Some pads are not always available; they are only created when they are in use. Such pads are called "dynamic pads". The next example will show how
to use dynamically created pads with an oggdemux. The link between the demuxer and the decoder is created in the demuxer_callback() method,
which is called, via the "pad-added" signal, whenever a pad is created in the demuxer.
Example 4.1
1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Vorbis-Player")
14 window.set_default_size(500, 200)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.entry = gtk.Entry()
19 vbox.pack_start(self.entry, False)
20 self.button = gtk.Button("Start")
21 vbox.add(self.button)
22 self.button.connect("clicked", self.start_stop)
23 window.show_all()
24
25 self.player = gst.Pipeline("player")
26 source = gst.element_factory_make("filesrc", "file-source")
27 demuxer = gst.element_factory_make("oggdemux", "demuxer")
28 demuxer.connect("pad-added", self.demuxer_callback)
29 self.audio_decoder = gst.element_factory_make("vorbisdec", "vorbis-decoder")
30 audioconv = gst.element_factory_make("audioconvert", "converter")
31 audiosink = gst.element_factory_make("autoaudiosink", "audio-output")
32
33 self.player.add(source, demuxer, self.audio_decoder, audioconv, audiosink)
34 gst.element_link_many(source, demuxer)
35 gst.element_link_many(self.audio_decoder, audioconv, audiosink)
36
37 bus = self.player.get_bus()
38 bus.add_signal_watch()
39 bus.connect("message", self.on_message)
40
41 def start_stop(self, w):
42 if self.button.get_label() == "Start":
43 filepath = self.entry.get_text()
44 if os.path.isfile(filepath):
45 self.button.set_label("Stop")
46 self.player.get_by_name("file-source").set_property("location", filepath)
47 self.player.set_state(gst.STATE_PLAYING)
48 else:
49 self.player.set_state(gst.STATE_NULL)
50 self.button.set_label("Start")
51
52 def on_message(self, bus, message):
53 t = message.type
54 if t == gst.MESSAGE_EOS:
55 self.player.set_state(gst.STATE_NULL)
56 self.button.set_label("Start")
57 elif t == gst.MESSAGE_ERROR:
58 err, debug = message.parse_error()
59 print "Error: %s" % err, debug
60 self.player.set_state(gst.STATE_NULL)
61 self.button.set_label("Start")
62
63 def demuxer_callback(self, demuxer, pad):
64 adec_pad = self.audio_decoder.get_pad("sink")
65 pad.link(adec_pad)
66
67 GTK_Main()
68 gtk.gdk.threads_init()
69 gtk.main()

Now, after reading through these four chapters, you could use a break. Happy hacking and stay tuned for more interesting chapters to come.

5. Seeking
Seeking in GStreamer is done with the seek() and seek_simple() methods. To be able to seek you will also need a gst.Format to tell GStreamer what kind
of seek it should do. In the following example we will use the gst.FORMAT_TIME constant which will, as you may guess, do a time seek. :D We
will also use the query_duration() and query_position() methods to get the file length and how far the file has currently played. GStreamer uses
nanoseconds by default, so you have to adjust to that.
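As a bare-bones sketch of the raw calls (assuming a pipeline called player that is already playing), a position query and a jump to 30 seconds look like this; gst.SECOND is a nanosecond constant so you don't have to count the zeros yourself:

# both values come back in nanoseconds
position = player.query_position(gst.FORMAT_TIME, None)[0]
duration = player.query_duration(gst.FORMAT_TIME, None)[0]

# flushing seek to 30 seconds into the stream
player.seek_simple(gst.FORMAT_TIME, gst.SEEK_FLAG_FLUSH, 30 * gst.SECOND)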
In this next example we take the Vorbis-Player from example 4.1 and update it with some more stuff so it's able to seek and show duration and position.
Example 5.1

1 #!/usr/bin/env python
2
3 import sys, os, thread, time
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Vorbis-Player")
14 window.set_default_size(500, -1)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.entry = gtk.Entry()
19 vbox.pack_start(self.entry, False)
20 hbox = gtk.HBox()
21 vbox.add(hbox)
22 buttonbox = gtk.HButtonBox()
23 hbox.pack_start(buttonbox, False)
24 rewind_button = gtk.Button("Rewind")
25 rewind_button.connect("clicked", self.rewind_callback)
26 buttonbox.add(rewind_button)
27 self.button = gtk.Button("Start")
28 self.button.connect("clicked", self.start_stop)
29 buttonbox.add(self.button)
30 forward_button = gtk.Button("Forward")
31 forward_button.connect("clicked", self.forward_callback)
32 buttonbox.add(forward_button)
33 self.time_label = gtk.Label()
34 self.time_label.set_text("00:00 / 00:00")
35 hbox.add(self.time_label)
36 window.show_all()
37
38 self.player = gst.Pipeline("player")
39 source = gst.element_factory_make("filesrc", "file-source")
40 demuxer = gst.element_factory_make("oggdemux", "demuxer")
41 demuxer.connect("pad-added", self.demuxer_callback)
42 self.audio_decoder = gst.element_factory_make("vorbisdec", "vorbis-decoder")
43 audioconv = gst.element_factory_make("audioconvert", "converter")
44 audiosink = gst.element_factory_make("autoaudiosink", "audio-output")
45
46 self.player.add(source, demuxer, self.audio_decoder, audioconv, audiosink)
47 gst.element_link_many(source, demuxer)
48 gst.element_link_many(self.audio_decoder, audioconv, audiosink)
49
50 bus = self.player.get_bus()
51 bus.add_signal_watch()
52 bus.connect("message", self.on_message)
53
54 self.time_format = gst.Format(gst.FORMAT_TIME)
55
56 def start_stop(self, w):
57 if self.button.get_label() == "Start":
58 filepath = self.entry.get_text()
59 if os.path.isfile(filepath):
60 self.button.set_label("Stop")
61 self.player.get_by_name("file-source").set_property("location", filepath)
62 self.player.set_state(gst.STATE_PLAYING)
63 self.play_thread_id = thread.start_new_thread(self.play_thread, ())
64 else:
65 self.play_thread_id = None
66 self.player.set_state(gst.STATE_NULL)
67 self.button.set_label("Start")
68 self.time_label.set_text("00:00 / 00:00")
69
70 def play_thread(self):
71 play_thread_id = self.play_thread_id
72 gtk.gdk.threads_enter()
73 self.time_label.set_text("00:00 / 00:00")
74 gtk.gdk.threads_leave()
75
76 while play_thread_id == self.play_thread_id:
77 try:
78 time.sleep(0.2)
79 dur_int = self.player.query_duration(self.time_format, None)[0]
80 dur_str = self.convert_ns(dur_int)
81 gtk.gdk.threads_enter()
82 self.time_label.set_text("00:00 / " + dur_str)
83 gtk.gdk.threads_leave()
84 break
85 except:
86 pass
87
88 time.sleep(0.2)
89 while play_thread_id == self.play_thread_id:
90 pos_int = self.player.query_position(self.time_format, None)[0]
91 pos_str = self.convert_ns(pos_int)
92 if play_thread_id == self.play_thread_id:
93 gtk.gdk.threads_enter()
94 self.time_label.set_text(pos_str + " / " + dur_str)
95 gtk.gdk.threads_leave()
96 time.sleep(1)
97
98 def on_message(self, bus, message):
99 t = message.type
100 if t == gst.MESSAGE_EOS:
101 self.play_thread_id = None
102 self.player.set_state(gst.STATE_NULL)
103 self.button.set_label("Start")
104 self.time_label.set_text("00:00 / 00:00")
105 elif t == gst.MESSAGE_ERROR:
106 err, debug = message.parse_error()
107 print "Error: %s" % err, debug
108 self.play_thread_id = None
109 self.player.set_state(gst.STATE_NULL)
110 self.button.set_label("Start")
111 self.time_label.set_text("00:00 / 00:00")
112
113 def demuxer_callback(self, demuxer, pad):
114 adec_pad = self.audio_decoder.get_pad("sink")
115 pad.link(adec_pad)
116
117 def rewind_callback(self, w):
118 pos_int = self.player.query_position(self.time_format, None)[0]
119 seek_ns = pos_int - (10 * 1000000000)
120 self.player.seek_simple(self.time_format, gst.SEEK_FLAG_FLUSH, seek_ns)
121
122 def forward_callback(self, w):
123 pos_int = self.player.query_position(self.time_format, None)[0]
124 seek_ns = pos_int + (10 * 1000000000)
125 self.player.seek_simple(self.time_format, gst.SEEK_FLAG_FLUSH, seek_ns)
126
127 def convert_ns(self, time_int):
128 time_int = time_int / 1000000000
129 time_str = ""
130 if time_int >= 3600:
131 _hours = time_int / 3600
132 time_int = time_int - (_hours * 3600)
133 time_str = str(_hours) + ":"
134 if time_int >= 600:
135 _mins = time_int / 60
136 time_int = time_int - (_mins * 60)
137 time_str = time_str + str(_mins) + ":"
138 elif time_int >= 60:
139 _mins = time_int / 60
140 time_int = time_int - (_mins * 60)
141 time_str = time_str + "0" + str(_mins) + ":"
142 else:
143 time_str = time_str + "00:"
144 if time_int > 9:
145 time_str = time_str + str(time_int)
146 else:
147 time_str = time_str + "0" + str(time_int)
148
149 return time_str
150
151 GTK_Main()
152 gtk.gdk.threads_init()
153 gtk.main()

6. Capabilities
Capabilities, gst.Caps, are containers where you may store information that you may pass on to a gst.PadTemplate. When you set the pipeline state to
either playing or paused, the elements' pads negotiate which caps to use for the stream. Now, the following pipeline works perfectly:
$ gst-launch-0.10 videotestsrc ! video/x-raw-yuv, width=320, height=240 ! xvimagesink

But if you try to switch the xvimagesink out for a ximagesink you will notice that it doesn't work. That's because ximagesink can't handle
video/x-raw-yuv, so you must put an element BEFORE it in the pipeline that does the conversion.
$ gst-launch-0.10 videotestsrc ! video/x-raw-yuv, width=320, height=240 ! ffmpegcolorspace ! ximagesink

And as ximagesink does not support hardware scaling you have to throw in a videoscale element too if you want software scaling.
$ gst-launch-0.10 videotestsrc ! video/x-raw-yuv, width=320, height=240 ! videoscale ! ffmpegcolorspace ! ximagesink

To put the above examples in code you have to put the caps in a capsfilter element.
Example 6.1
1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Videotestsrc-Player")
14 window.set_default_size(300, -1)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.button = gtk.Button("Start")
19 self.button.connect("clicked", self.start_stop)
20 vbox.add(self.button)
21 window.show_all()
22
23 self.player = gst.Pipeline("player")
24 source = gst.element_factory_make("videotestsrc", "video-source")
25 sink = gst.element_factory_make("xvimagesink", "video-output")
26 caps = gst.Caps("video/x-raw-yuv, width=320, height=240")
27 filter = gst.element_factory_make("capsfilter", "filter")
28 filter.set_property("caps", caps)
29
30 self.player.add(source, filter, sink)
31 gst.element_link_many(source, filter, sink)
32
33 def start_stop(self, w):
34 if self.button.get_label() == "Start":
35 self.button.set_label("Stop")
36 self.player.set_state(gst.STATE_PLAYING)
37 else:
38 self.player.set_state(gst.STATE_NULL)
39 self.button.set_label("Start")
40
41 GTK_Main()
42 gtk.gdk.threads_init()
43 gtk.main()

A frequently asked question is how to find out what resolution a file has, and one way to do it is to check the caps on a decodebin element in the paused
state.
Example 6.2
1 #!/usr/bin/env python
2
3 import sys, os, time
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Resolutionchecker")
14 window.set_default_size(300, -1)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.entry = gtk.Entry()
19 vbox.pack_start(self.entry, False, True)
20 self.button = gtk.Button("Check")
21 self.button.connect("clicked", self.start_stop)
22 vbox.add(self.button)
23 window.show_all()
24
25 self.player = gst.Pipeline("player")
26 source = gst.element_factory_make("filesrc", "file-source")
27 decoder = gst.element_factory_make("decodebin", "decoder")
28 decoder.connect("new-decoded-pad", self.decoder_callback)
29 self.fakea = gst.element_factory_make("fakesink", "fakea")
30 self.fakev = gst.element_factory_make("fakesink", "fakev")
31
32 self.player.add(source, decoder, self.fakea, self.fakev)
33 gst.element_link_many(source, decoder)
34
35 bus = self.player.get_bus()
36 bus.add_signal_watch()
37 bus.connect("message", self.on_message)
38
39 def start_stop(self, w):
40 filepath = self.entry.get_text()
41 if os.path.isfile(filepath):
42 self.player.set_state(gst.STATE_NULL)
43 self.player.get_by_name("file-source").set_property("location", filepath)
44 self.player.set_state(gst.STATE_PAUSED)
45
46 def on_message(self, bus, message):
47 t = message.type
48 if t == gst.MESSAGE_STATE_CHANGED:
49 if message.parse_state_changed()[1] == gst.STATE_PAUSED:
50 for i in self.player.get_by_name("decoder").src_pads():
51 structure_name = i.get_caps()[0].get_name()
52 if structure_name.startswith("video") and len(str(i.get_caps()[0]["width"])) < 6:
53 print "Width:%s, Height:%s" %(i.get_caps()[0]["width"], i.get_caps()[0]["height"])
54 self.player.set_state(gst.STATE_NULL)
55 break
56 elif t == gst.MESSAGE_ERROR:
57 err, debug = message.parse_error()
58 print "Error: %s" % err, debug
59 self.player.set_state(gst.STATE_NULL)
60
61 def decoder_callback(self, decoder, pad, data):
62 structure_name = pad.get_caps()[0].get_name()
63 if structure_name.startswith("video"):
64 fv_pad = self.fakev.get_pad("sink")
65 pad.link(fv_pad)
66 elif structure_name.startswith("audio"):
67 fa_pad = self.fakea.get_pad("sink")
68 pad.link(fa_pad)
69
70 GTK_Main()
71 gtk.gdk.threads_init()
72 gtk.main()

Note: The examples in this tutorial have grown a bit lately and it's not easy to show working GStreamer stuff in only a few lines, but I'll try as
hard as I can to do just that. Maybe we will have to keep the code in separate files that you can look at if you find anything interesting. The examples
share much code too, and as the number of chapters grows so does the reader's knowledge, so eventually a few lines of code will make sense on their own
instead of full runnable examples. Well, time will tell how far we get.
In this example we will use the playbin from example 2.2 and switch its video-sink out for our own homemade bin, stuffed with some goodies. Now,
let's say that you run a TV station and you want to have your logo in the top right corner of the screen. For that you can use a textoverlay, but for the
font to be the exact same size on the screen no matter what resolution the source has, you have to specify a width so everything is scaled
according to that.
Example 6.3
1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Video-Player")
14 window.set_default_size(500, 400)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 hbox = gtk.HBox()
19 vbox.pack_start(hbox, False)
20 self.entry = gtk.Entry()
21 hbox.add(self.entry)
22 self.button = gtk.Button("Start")
23 hbox.pack_start(self.button, False)
24 self.button.connect("clicked", self.start_stop)
25 self.movie_window = gtk.DrawingArea()
26 vbox.add(self.movie_window)
27 window.show_all()
28
29 self.player = gst.element_factory_make("playbin", "player")
30 self.bin = gst.Bin("my-bin")
31 videoscale = gst.element_factory_make("videoscale")
32 videoscale.set_property("method", 1)
33 pad = videoscale.get_pad("sink")
34 ghostpad = gst.GhostPad("sink", pad)
35 self.bin.add_pad(ghostpad)
36 caps = gst.Caps("video/x-raw-yuv, width=720")
37 filter = gst.element_factory_make("capsfilter", "filter")
38 filter.set_property("caps", caps)
39 textoverlay = gst.element_factory_make('textoverlay')
40 textoverlay.set_property("text", "GNUTV")
41 textoverlay.set_property("font-desc", "normal 14")
42 textoverlay.set_property("halign", "right")
43 textoverlay.set_property("valign", "top")
44 conv = gst.element_factory_make ("ffmpegcolorspace", "conv")
45 videosink = gst.element_factory_make("autovideosink")
46
47 self.bin.add(videoscale, filter, textoverlay, conv, videosink)
48 gst.element_link_many(videoscale, filter, textoverlay, conv, videosink)
49 self.player.set_property("video-sink", self.bin)
50
51 bus = self.player.get_bus()
52 bus.add_signal_watch()
53 bus.enable_sync_message_emission()
54 bus.connect("message", self.on_message)
55 bus.connect("sync-message::element", self.on_sync_message)
56
57 def start_stop(self, w):
58 if self.button.get_label() == "Start":
59 filepath = self.entry.get_text()
60 if os.path.exists(filepath):
61 self.button.set_label("Stop")
62 self.player.set_property("uri", "file://" + filepath)
63 self.player.set_state(gst.STATE_PLAYING)
64 else:
65 self.player.set_state(gst.STATE_NULL)
66 self.button.set_label("Start")
67
68 def on_message(self, bus, message):
69 t = message.type
70 if t == gst.MESSAGE_EOS:
71 self.player.set_state(gst.STATE_NULL)
72 self.button.set_label("Start")
73 elif t == gst.MESSAGE_ERROR:
74 self.player.set_state(gst.STATE_NULL)
75 self.button.set_label("Start")
76 err, debug = message.parse_error()
77 print "Error: %s" % err, debug
78
79 def on_sync_message(self, bus, message):
80 if message.structure is None:
81 return
82 message_name = message.structure.get_name()
83 if message_name == "prepare-xwindow-id":
84 imagesink = message.src
85 imagesink.set_property("force-aspect-ratio", True)
86 imagesink.set_xwindow_id(self.movie_window.window.xid)
87
88 GTK_Main()
89 gtk.gdk.threads_init()
90 gtk.main()

7. Videomixer
The videomixer element makes it possible to mix different video streams together. Here is a CLI example:
$ gst-launch-0.10 filesrc location=tvlogo.png ! pngdec ! alphacolor ! ffmpegcolorspace ! \
      videobox border-alpha=0 alpha=0.5 top=-20 left=-200 ! videomixer name=mix ! ffmpegcolorspace ! \
      autovideosink videotestsrc ! video/x-raw-yuv, width=320, height=240 ! mix.

You have to make a PNG image (100x100 px) to be able to run it. With the videobox element you can move the image around and add more alpha
channels.
NEW COMPETITION! I can't figure out how to make a transparent alpha channel for the PNG in the pipeline above so if anyone makes it work the
prize from example 3.2 will be given to this winner instead as the other one passed. TIA
UPDATE! An anonymous dude solved the problem with an alphacolor element, many thanks man.
In the next example we take the now working Mpeg2-Player from example 3.2 and extend it with the stuff shown above.
Example 7.1
1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Mpeg2-Player")
14 window.set_default_size(500, 400)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 hbox = gtk.HBox()
19 vbox.pack_start(hbox, False)
20 self.entry = gtk.Entry()
21 hbox.add(self.entry)
22 self.button = gtk.Button("Start")
23 hbox.pack_start(self.button, False)
24 self.button.connect("clicked", self.start_stop)
25 self.movie_window = gtk.DrawingArea()
26 vbox.add(self.movie_window)
27 window.show_all()
28
29 self.player = gst.Pipeline("player")
30 source = gst.element_factory_make("filesrc", "file-source")
31 demuxer = gst.element_factory_make("mpegdemux", "demuxer")
32 demuxer.connect("pad-added", self.demuxer_callback)
33 self.video_decoder = gst.element_factory_make("mpeg2dec", "video-decoder")
34 png_decoder = gst.element_factory_make("pngdec", "png-decoder")
35 png_source = gst.element_factory_make("filesrc", "png-source")
36 png_source.set_property("location", "tvlogo.png")
37 mixer = gst.element_factory_make("videomixer", "mixer")
38 self.audio_decoder = gst.element_factory_make("mad", "audio-decoder")
39 audioconv = gst.element_factory_make("audioconvert", "converter")
40 audiosink = gst.element_factory_make("autoaudiosink", "audio-output")
41 videosink = gst.element_factory_make("autovideosink", "video-output")
42 self.queuea = gst.element_factory_make("queue", "queuea")
43 self.queuev = gst.element_factory_make("queue", "queuev")
44 ffmpeg1 = gst.element_factory_make("ffmpegcolorspace", "ffmpeg1")
45 ffmpeg2 = gst.element_factory_make("ffmpegcolorspace", "ffmpeg2")
46 ffmpeg3 = gst.element_factory_make("ffmpegcolorspace", "ffmpeg3")
47 videobox = gst.element_factory_make("videobox", "videobox")
48 alphacolor = gst.element_factory_make("alphacolor", "alphacolor")
49
50 self.player.add(source, demuxer, self.video_decoder, png_decoder, png_source, mixer,
51 self.audio_decoder, audioconv, audiosink, videosink, self.queuea, self.queuev,
52 ffmpeg1, ffmpeg2, ffmpeg3, videobox, alphacolor)
53 gst.element_link_many(source, demuxer)
54 gst.element_link_many(self.queuev, self.video_decoder, ffmpeg1, mixer, ffmpeg2, videosink)
55 gst.element_link_many(png_source, png_decoder, alphacolor, ffmpeg3, videobox, mixer)
56 gst.element_link_many(self.queuea, self.audio_decoder, audioconv, audiosink)
57
58 bus = self.player.get_bus()
59 bus.add_signal_watch()
60 bus.enable_sync_message_emission()
61 bus.connect("message", self.on_message)
62 bus.connect("sync-message::element", self.on_sync_message)
63
64 videobox.set_property("border-alpha", 0)
65 videobox.set_property("alpha", 0.5)
66 videobox.set_property("left", -10)
67 videobox.set_property("top", -10)
68
69 def start_stop(self, w):
70 if self.button.get_label() == "Start":
71 filepath = self.entry.get_text()
72 if os.path.isfile(filepath):
73 self.button.set_label("Stop")
74 self.player.get_by_name("file-source").set_property("location", filepath)
75 self.player.set_state(gst.STATE_PLAYING)
76 else:
77 self.player.set_state(gst.STATE_NULL)
78 self.button.set_label("Start")
79
80 def on_message(self, bus, message):
81 t = message.type
82 if t == gst.MESSAGE_EOS:
83 self.player.set_state(gst.STATE_NULL)
84 self.button.set_label("Start")
85 elif t == gst.MESSAGE_ERROR:
86 err, debug = message.parse_error()
87 print "Error: %s" % err, debug
88 self.player.set_state(gst.STATE_NULL)
89 self.button.set_label("Start")
90
91 def on_sync_message(self, bus, message):
92 if message.structure is None:
93 return
94 message_name = message.structure.get_name()
95 if message_name == "prepare-xwindow-id":
96 imagesink = message.src
97 imagesink.set_property("force-aspect-ratio", True)
98 imagesink.set_xwindow_id(self.movie_window.window.xid)
99
100 def demuxer_callback(self, demuxer, pad):
101 if pad.get_property("template").name_template == "video_%02d":
102 queuev_pad = self.queuev.get_pad("sink")
103 pad.link(queuev_pad)
104 elif pad.get_property("template").name_template == "audio_%02d":
105 queuea_pad = self.queuea.get_pad("sink")
106 pad.link(queuea_pad)
107
108 GTK_Main()
109 gtk.gdk.threads_init()
110 gtk.main()

8. Webcam Viewer
A simple webcam example I found on the net, more text later.
Example 8.1

1 #!/usr/bin/env python
2
3 import sys, os
4 import pygtk, gtk, gobject
5 import pygst
6 pygst.require("0.10")
7 import gst
8
9 class GTK_Main:
10
11 def __init__(self):
12 window = gtk.Window(gtk.WINDOW_TOPLEVEL)
13 window.set_title("Webcam-Viewer")
14 window.set_default_size(500, 400)
15 window.connect("destroy", gtk.main_quit, "WM destroy")
16 vbox = gtk.VBox()
17 window.add(vbox)
18 self.movie_window = gtk.DrawingArea()
19 vbox.add(self.movie_window)
20 hbox = gtk.HBox()
21 vbox.pack_start(hbox, False)
22 hbox.set_border_width(10)
23 hbox.pack_start(gtk.Label())
24 self.button = gtk.Button("Start")
25 self.button.connect("clicked", self.start_stop)
26 hbox.pack_start(self.button, False)
27 self.button2 = gtk.Button("Quit")
28 self.button2.connect("clicked", self.exit)
29 hbox.pack_start(self.button2, False)
30 hbox.add(gtk.Label())
31 window.show_all()
32
33 # Set up the gstreamer pipeline
34 self.player = gst.parse_launch ("v4l2src ! autovideosink")
35
36 bus = self.player.get_bus()
37 bus.add_signal_watch()
38 bus.enable_sync_message_emission()
39 bus.connect("message", self.on_message)
40 bus.connect("sync-message::element", self.on_sync_message)
41
42 def start_stop(self, w):
43 if self.button.get_label() == "Start":
44 self.button.set_label("Stop")
45 self.player.set_state(gst.STATE_PLAYING)
46 else:
47 self.player.set_state(gst.STATE_NULL)
48 self.button.set_label("Start")
49
50 def exit(self, widget, data=None):
51 gtk.main_quit()
52
53 def on_message(self, bus, message):
54 t = message.type
55 if t == gst.MESSAGE_EOS:
56 self.player.set_state(gst.STATE_NULL)
57 self.button.set_label("Start")
58 elif t == gst.MESSAGE_ERROR:
59 err, debug = message.parse_error()
60 print "Error: %s" % err, debug
61 self.player.set_state(gst.STATE_NULL)
62 self.button.set_label("Start")
63
64 def on_sync_message(self, bus, message):
65 if message.structure is None:
66 return
67 message_name = message.structure.get_name()
68 if message_name == "prepare-xwindow-id":
69 # Assign the viewport
70 imagesink = message.src
71 imagesink.set_property("force-aspect-ratio", True)
72 imagesink.set_xwindow_id(self.movie_window.window.xid)
73
74 GTK_Main()
75 gtk.gdk.threads_init()
76 gtk.main()

Appendix A. ChangeLog

2009-02-13 Jens Persson

* Added Webcam-Viewer chapter


* Added syntax highlighting with Pygments
* Some more examples and other fixes

2008-03-10 Jens Persson

* Make it validate as HTML-4.01


* Added Videomixer chapter
* Added Mike Auty as the winner in the first competition
* Various other fixes

2006-12-29 Jens Persson

* Added Capabilities chapter


* Various other fixes

2006-12-19 Jens Persson

* Added Seeking chapter

2006-12-17 Jens Persson

* Added chapter about sinks and stuff

* Various other fixes

2006-12-16 Jens Persson

* Added Pipeline chapter
* Updated introduction

2006-12-15 Jens Persson

* Initial release

Appendix B. Third Party Tutorials

PiTiVi developers PyGST tutorial wiki.

