OUTDATED
--------


problem
-------

Since gstreamer uses a hierarchical pipeline layout, individual elements
can be nested inside N levels of containers (bins). Elements can also
produce information that is interesting to the user app or to their parent.

Consider the mp3parse element, which could detect id3 tags in the stream.
One way to let the app know about those tags is by emitting a signal. The
problem with this signal is that the app has to perform a g_signal_connect
on this specific element. That is not always possible or feasible, because
the app might not know about the mp3parse element at all (e.g. in an
autoplugged pipeline or a compound object). The app could introspect each
element in the pipeline and look for known properties/signals to connect
to, but that looks a bit ugly IMO.

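To make the problem concrete: connecting to such a signal requires a
handle on the element and knowledge of its signal. In the sketch below
the "mp3parse" element name, the "got-id3-tag" signal and the
got_id3_tag_cb callback are made up, purely to illustrate the point:

  /* the app has to know that an mp3parse element exists in the pipeline
   * and which signal it emits -- both are assumptions here */
  GstElement *mp3parse;

  mp3parse = gst_bin_get_by_name (GST_BIN (pipeline), "mp3parse");
  g_signal_connect (G_OBJECT (mp3parse), "got-id3-tag",
                    G_CALLBACK (got_id3_tag_cb), NULL);

This only works when the app can get at the element directly, which is
exactly what we cannot assume for autoplugged pipelines.
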
Signal proxying is also not very feasible because signals are tied to
class instances.

let's take the following use case:

 - the user autoplugs an mpeg1 pipeline

 - the autoplugged pipeline most likely contains an mpegdemuxer, an mp3
   decoder, an mpegdecoder, etc.

 - the mpegdemuxer knows the (average) bitrate of the stream

 - the mpegdecoder knows the framerate of the stream

 - the mp3 decoder has some neat stuff too (bitrate, layer, etc.)

How are we going to get all those properties to the app? Each element
could fire a signal with the data. If the app were able to connect to
every signal of every element, this would work, somewhat.


Requirements
------------

The application can listen to an arbitrary bin in the pipeline to collect
information about that bin's children. The app can listen on the top-level
bin to collect all of the elements' messages.

The data sent out by the elements must not be limited to a fixed set of
messages; it must be extensible.


proposed solution
-----------------

We propose another way of propagating these element messages to the
application.

An element can send a message to its parent using
gst_element_send_message (element, message). The message would be of type
GstMessage and would be similar to a GstEvent (maybe even the same type).

The message would contain GstProps, which can hold anything (a string, an
int, a range, etc.). It would also contain the originator of the message.

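As a rough sketch of what the sending side could look like (the
gst_message_new constructor does not exist yet; its name and arguments
are just one way the proposal could be filled in):

  static void
  mpeg_demux_found_bitrate (GstElement *element, gint bitrate)
  {
    GstMessage *message;

    /* a message carrying GstProps plus its originator (assumed constructor) */
    message = gst_message_new (GST_OBJECT (element),
        gst_props_new ("bitrate", GST_PROPS_INT (bitrate), NULL));

    /* hand it to the parent; from there it bubbles up the pipeline */
    gst_element_send_message (element, message);
  }
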
The parent can simply accept the message (and do something with it), or
the default handler just forwards the message to its own parent, and so on.

The message would bubble up the pipeline. When an element doesn't have
a parent, the message is converted to a GSignal. The signal ("message")
would just forward the message to any listening apps.

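A minimal sketch of that default behaviour, assuming the element can reach
its parent with gst_object_get_parent and that the toplevel object provides
the "message" signal described above:

  static void
  gst_element_send_message_default (GstElement *element, GstMessage *message)
  {
    GstObject *parent = gst_object_get_parent (GST_OBJECT (element));

    if (parent != NULL) {
      /* not at the top yet: keep bubbling the message upwards */
      gst_element_send_message (GST_ELEMENT (parent), message);
    } else {
      /* no parent: convert the message into the "message" signal */
      g_signal_emit_by_name (G_OBJECT (element), "message", message);
    }
  }
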
The app can then use the originator field of the message to find out
where it came from, possibly using the element factory's klass field to
find out what type of plugin created the message.

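On the application side this boils down to a single signal connect on the
toplevel bin. The accessors used below (gst_message_get_origin and
gst_element_factory_get_klass) are only placeholders for whatever API the
GstMessage type ends up with:

  static void
  message_cb (GstElement *bin, GstMessage *message, gpointer user_data)
  {
    /* placeholder accessor for the originator stored in the message */
    GstElement *origin = gst_message_get_origin (message);

    g_print ("message from %s (klass %s)\n",
        gst_element_get_name (origin),
        gst_element_factory_get_klass (gst_element_get_factory (origin)));
  }

  ...

  g_signal_connect (G_OBJECT (pipeline), "message",
                    G_CALLBACK (message_cb), NULL);
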
For an autoplugged mpeg1 pipeline the following messages could be
signalled to the app:

element klass        element name  property          value

stream/mpeg/demuxer: (mpegdemux)   "bitrate",        GST_PROPS_INT (1000000)
stream/mpeg/demuxer: (mpegdemux)   "type",           GST_PROPS_STRING ("mpeg1")
video/mpeg/decoder:  (mpeg2dec)    "type",           GST_PROPS_STRING ("mpeg1")
video/mpeg/decoder:  (mpeg2dec)    "frame_rate",     GST_PROPS_INT (25)
video/mpeg/decoder:  (mpeg2dec)    "size",           GST_PROPS_LIST (
                                                       GST_PROPS_INT (320),
                                                       GST_PROPS_INT (200)
                                                     )
audio/mp3/decoder:   (mad)         "layer",          GST_PROPS_INT (2)
audio/mp3/decoder:   (mad)         "bitrate",        GST_PROPS_INT (128000)
audio/mp3/decoder:   (mad)         "channels",       GST_PROPS_INT (2)

other possibilities:

video/render/X:      (xvideosink)  "frames_dropped", GST_PROPS_INT (4)
video/render/X:      (xvideosink)  "frames_shown",   GST_PROPS_INT (254)
video/mpeg/decoder:  (mpeg2dec)    "frames_dropped", GST_PROPS_INT (2)
video/avi/demuxer:   (avidemux)    "codec",          GST_PROPS_FOURCC ("DIVX")

or

video/mpeg/decoder:  (mpeg2dec)    "state_changed",  GST_PROPS_INT (GST_STATE_PAUSED)

or even

audio/render/oss:    (osssink)     "master_clock",   GST_PROPS_OBJECT (osssink_clock)
....

or even even:

input/file/filesrc:  (filesrc)     "here_i_am",      GST_PROPS_STRING ("alive and kicking")


With standard naming conventions for the element klass type and the
message ids, the player can easily create an info dialog to show various
properties of the stream.

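A sketch of how an info dialog could pick up one of these messages; the
gst_message_get_klass, gst_message_get_props, gst_props_get_int and
info_dialog_add_row names are all made up for this example:

  static void
  info_message_cb (GstElement *bin, GstMessage *message, gpointer dialog)
  {
    gint bitrate;

    /* only react to messages coming from demuxers, as classified by klass */
    if (strcmp (gst_message_get_klass (message), "stream/mpeg/demuxer") != 0)
      return;

    if (gst_props_get_int (gst_message_get_props (message), "bitrate", &bitrate))
      info_dialog_add_row (dialog, "bitrate", bitrate);   /* app specific */
  }
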
The benefits are that we don't need to define N-thousand methods on
elements, the messages can be anything, and we don't have to use the
heavyweight GObject signals in the core library.


what about?
-----------

- Threads? Do we queue events and let the top half collect the messages,
  or do we send them to the app in the thread context?

- Do we need a similar system for core functionality (clocks, states,
  ...) or do we define methods for those?