page.title=Audio Terminology
@jd:body

<!--
    Copyright 2015 The Android Open Source Project

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->
<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>

<p>
This glossary of audio-related terminology includes widely used generic terms
and Android-specific terms.
</p>

<h2 id="genericTerm">Generic Terms</h2>

<p>
Generic audio-related terms have their conventional meanings.
</p>

<h3 id="digitalAudioTerms">Digital Audio</h3>
<p>
Digital audio terms relate to handling sound using audio signals encoded
in digital form. For details, refer to
<a href="http://en.wikipedia.org/wiki/Digital_audio">Digital Audio</a>.
</p>

<dl>

<dt>acoustics</dt>
<dd>
Study of the mechanical properties of sound, such as how the physical
placement of transducers (speakers, microphones, etc.) on a device affects
perceived audio quality.
</dd>

<dt>attenuation</dt>
<dd>
Multiplicative factor less than or equal to 1.0, applied to an audio signal
to decrease the signal level. Compare to <em>gain</em>.
</dd>

<dt>audiophile</dt>
<dd>
Person concerned with a superior music reproduction experience, especially
one willing to make substantial tradeoffs (expense, component size, room
design, etc.) for sound quality. For details, refer to
<a href="http://en.wikipedia.org/wiki/Audiophile">audiophile</a>.
</dd>

<dt>bits per sample or bit depth</dt>
<dd>
Number of bits of information per sample.
</dd>

<dt>channel</dt>
<dd>
Single stream of audio information, usually corresponding to one location of
recording or playback.
</dd>

<dt>downmixing</dt>
<dd>
Decrease the number of channels, such as from stereo to mono or from 5.1 to
stereo. Accomplished by dropping channels, mixing channels, or more advanced
signal processing. Simple mixing without attenuation or limiting has the
potential for overflow and clipping. Compare to <em>upmixing</em>.
</dd>
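
<dd>
<p>
As a rough illustration of the clipping risk mentioned above (not a platform
API; the method name is hypothetical), averaging the two channels keeps a
stereo-to-mono downmix within 16-bit range:
</p>
<pre>
// Illustrative sketch: downmix interleaved 16-bit stereo PCM to mono.
// Averaging (0.5 attenuation per channel) cannot overflow 16-bit range;
// summing without attenuation could clip.
static short[] downmixStereoToMono(short[] interleavedStereo) {
    short[] mono = new short[interleavedStereo.length / 2];
    for (int i = 0; i &lt; mono.length; i++) {
        int left = interleavedStereo[2 * i];
        int right = interleavedStereo[2 * i + 1];
        mono[i] = (short) ((left + right) / 2);
    }
    return mono;
}
</pre>
</dd>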

<dt>DSD</dt>
<dd>
Direct Stream Digital. Proprietary audio encoding based on
<a href="http://en.wikipedia.org/wiki/Pulse-density_modulation">pulse-density
modulation</a>. While Pulse Code Modulation (PCM) encodes a waveform as a
sequence of individual audio samples of multiple bits, DSD encodes a waveform as
a sequence of bits at a very high sample rate (without the concept of samples).
Both PCM and DSD represent multiple channels by independent sequences. DSD is
better suited to content distribution than as an internal representation for
processing, as it can be difficult to apply traditional digital signal processing
(DSP) algorithms to DSD. DSD is used in
<a href="http://en.wikipedia.org/wiki/Super_Audio_CD">Super Audio CD (SACD)</a>
and in DSD over PCM (DoP) for USB. For details, refer to
<a href="http://en.wikipedia.org/wiki/Direct_Stream_Digital">Direct Stream
Digital</a>.
</dd>

<dt>duck</dt>
<dd>
Temporarily reduce the volume of a stream when another stream becomes active.
For example, if music is playing when a notification arrives, the music ducks
while the notification plays. Compare to <em>mute</em>.
</dd>

<dt>FIFO</dt>
<dd>
First In, First Out. Hardware module or software data structure that implements
<a href="http://en.wikipedia.org/wiki/FIFO">First In, First Out</a>
queueing of data. In an audio context, the data stored in the queue are
typically audio frames. A FIFO can be implemented by a
<a href="http://en.wikipedia.org/wiki/Circular_buffer">circular buffer</a>.
</dd>
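
<dd>
<p>
As an illustrative sketch only (not AudioFlinger's implementation), a bounded
FIFO of mono 16-bit frames backed by a circular buffer might look like this:
</p>
<pre>
// Illustrative bounded FIFO of frames backed by a circular buffer.
class FrameFifo {
    private final short[] buffer;
    private int readIndex = 0;
    private int writeIndex = 0;
    private int count = 0;

    FrameFifo(int capacityFrames) {
        buffer = new short[capacityFrames];
    }

    // Returns false rather than blocking when full (risk of overrun).
    synchronized boolean write(short frame) {
        if (count == buffer.length) return false;
        buffer[writeIndex] = frame;
        writeIndex = (writeIndex + 1) % buffer.length;
        count++;
        return true;
    }

    // Returns null rather than blocking when empty (risk of underrun).
    synchronized Short read() {
        if (count == 0) return null;
        short frame = buffer[readIndex];
        readIndex = (readIndex + 1) % buffer.length;
        count--;
        return frame;
    }
}
</pre>
</dd>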

<dt>frame</dt>
<dd>
Set of samples, one per channel, at a point in time.
</dd>

<dt>frames per buffer</dt>
<dd>
Number of frames handed from one module to the next at one time. The audio HAL
interface uses the concept of frames per buffer.
</dd>
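
<dd>
<p>
As a worked example (the values are illustrative, not required by the HAL),
the buffer size in bytes and the time represented by one buffer follow directly
from the definitions of frame, bit depth, and sample rate:
</p>
<pre>
// Illustrative arithmetic for 16-bit stereo PCM at 48000 Hz
// with 192 frames per buffer.
int framesPerBuffer = 192;
int channelCount = 2;           // stereo
int bytesPerSample = 2;         // 16-bit PCM
int sampleRateHz = 48000;

int bufferSizeBytes = framesPerBuffer * channelCount * bytesPerSample;  // 768 bytes
double bufferDurationMs = 1000.0 * framesPerBuffer / sampleRateHz;      // 4 ms
</pre>
</dd>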

<dt>gain</dt>
<dd>
Multiplicative factor greater than or equal to 1.0, applied to an audio signal
to increase the signal level. Compare to <em>attenuation</em>.
</dd>

<dt>HD audio</dt>
<dd>
High-Definition audio. Synonym for high-resolution audio (but different from
Intel High Definition Audio).
</dd>

<dt>Hz</dt>
<dd>
Units for sample rate or frame rate.
</dd>

<dt>high-resolution audio</dt>
<dd>
Representation with greater bit depth and sample rate than CDs (stereo 16-bit
PCM at 44.1 kHz) and without lossy data compression. Equivalent to HD audio.
For details, refer to
<a href="http://en.wikipedia.org/wiki/High-resolution_audio">high-resolution
audio</a>.
</dd>

<dt>latency</dt>
<dd>
Time delay as a signal passes through a system.
</dd>

<dt>lossless</dt>
<dd>
A <a href="http://en.wikipedia.org/wiki/Lossless_compression">lossless data
compression algorithm</a> that preserves bit accuracy across encoding and
decoding, where the result of decoding previously encoded data is equivalent
to the original data. Examples of lossless audio content distribution formats
include <a href="http://en.wikipedia.org/wiki/Compact_disc">CDs</a>, PCM within
<a href="http://en.wikipedia.org/wiki/WAV">WAV</a>, and
<a href="http://en.wikipedia.org/wiki/FLAC">FLAC</a>.
The authoring process may reduce the bit depth or sample rate from that of the
<a href="http://en.wikipedia.org/wiki/Audio_mastering">masters</a>; distribution
formats that preserve the resolution and bit accuracy of masters are the subject
of high-resolution audio.
</dd>

<dt>lossy</dt>
<dd>
A <a href="http://en.wikipedia.org/wiki/Lossy_compression">lossy data
compression algorithm</a> that attempts to preserve the most important features
of media across encoding and decoding, where the result of decoding previously
encoded data is perceptually similar to the original data but not identical.
Examples of lossy audio compression algorithms include MP3 and AAC. As analog
values are from a continuous domain and digital values are discrete, ADC and DAC
are lossy conversions with respect to amplitude. See also <em>transparency</em>.
</dd>

<dt>mono</dt>
<dd>
One channel.
</dd>

<dt>multichannel</dt>
<dd>
See <em>surround sound</em>. In strict terms, <em>stereo</em> is more than one
channel and could be considered multichannel; however, such usage is confusing
and thus avoided.
</dd>

<dt>mute</dt>
<dd>
Temporarily force volume to be zero, independently of the usual volume controls.
</dd>

<dt>overrun</dt>
<dd>
Audible <a href="http://en.wikipedia.org/wiki/Glitch">glitch</a> caused by
failure to accept supplied data in sufficient time. For details, refer to
<a href="http://en.wikipedia.org/wiki/Buffer_underrun">buffer underrun</a>.
Compare to <em>underrun</em>.
</dd>

<dt>panning</dt>
<dd>
Direct a signal to a desired position within a stereo or multichannel field.
</dd>

<dt>PCM</dt>
<dd>
Pulse Code Modulation. Most common low-level encoding of digital audio. The
audio signal is sampled at a regular interval, called the sample rate, then
quantized to discrete values within a particular range depending on the bit
depth. For example, for 16-bit PCM the sample values are integers between
-32768 and +32767.
</dd>
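
<dd>
<p>
For example, a floating-point sample in the range [-1.0, 1.0] can be converted
to 16-bit PCM by scaling and clamping (an illustrative sketch, not a platform
API):
</p>
<pre>
// Illustrative conversion of a float sample in [-1.0, 1.0] to 16-bit PCM.
// Clamping prevents out-of-range values from wrapping around.
static short floatToPcm16(float sample) {
    int value = Math.round(sample * 32767.0f);
    if (value &gt; 32767) value = 32767;
    if (value &lt; -32768) value = -32768;
    return (short) value;
}
</pre>
</dd>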

<dt>ramp</dt>
<dd>
Gradually increase or decrease the level of a particular audio parameter, such
as the volume or the strength of an effect. A volume ramp is commonly applied
when pausing and resuming music to avoid a hard audible transition.
</dd>

<dt>sample</dt>
<dd>
Number representing the audio value for a single channel at a point in time.
</dd>

<dt>sample rate or frame rate</dt>
<dd>
Number of frames per second. While <em>frame rate</em> is more accurate,
<em>sample rate</em> is conventionally used to mean frame rate.
</dd>

<dt>sonification</dt>
<dd>
Use of sound to express feedback or information, such as touch sounds and
keyboard sounds.
</dd>

<dt>stereo</dt>
<dd>
Two channels.
</dd>

<dt>stereo widening</dt>
<dd>
Effect applied to a stereo signal to make another stereo signal that sounds
fuller and richer. The effect can also be applied to a mono signal, where it is
a type of upmixing.
</dd>

<dt>surround sound</dt>
<dd>
Techniques for increasing the ability of a listener to perceive sound position
beyond stereo left and right.
</dd>

<dt>transparency</dt>
<dd>
Ideal result of lossy data compression. Lossy data conversion is transparent if
it is perceptually indistinguishable from the original by a human subject. For
details, refer to
<a href="http://en.wikipedia.org/wiki/Transparency_%28data_compression%29">Transparency</a>.
</dd>

<dt>underrun</dt>
<dd>
Audible <a href="http://en.wikipedia.org/wiki/Glitch">glitch</a> caused by
failure to supply needed data in sufficient time. For details, refer to
<a href="http://en.wikipedia.org/wiki/Buffer_underrun">buffer underrun</a>.
Compare to <em>overrun</em>.
</dd>

<dt>upmixing</dt>
<dd>
Increase the number of channels, such as from mono to stereo or from stereo to
surround sound. Accomplished by duplication, panning, or more advanced signal
processing. Compare to <em>downmixing</em>.
</dd>

<dt>virtualizer</dt>
<dd>
Effect that attempts to spatialize audio channels, such as trying to simulate
more speakers or give the illusion that sound sources have position.
</dd>

<dt>volume</dt>
<dd>
Loudness, the subjective strength of an audio signal.
</dd>

</dl>

<h3 id="interDeviceTerms">Inter-device interconnect</h3>

<p>
Inter-device interconnection technologies connect audio and video components
between devices and are readily visible at the external connectors. The HAL
implementer and end user should be aware of these terms.
</p>

<dl>

<dt>Bluetooth</dt>
<dd>
Short-range wireless technology. For details on the audio-related
<a href="http://en.wikipedia.org/wiki/Bluetooth_profile">Bluetooth profiles</a>
and
<a href="http://en.wikipedia.org/wiki/Bluetooth_protocols">Bluetooth protocols</a>,
refer to <a href="http://en.wikipedia.org/wiki/Bluetooth_profile#Advanced_Audio_Distribution_Profile_.28A2DP.29">A2DP</a> for
music, <a href="http://en.wikipedia.org/wiki/Bluetooth_protocols#Synchronous_connection-oriented_.28SCO.29_link">SCO</a> for telephony, and <a href="http://en.wikipedia.org/wiki/List_of_Bluetooth_profiles#Audio.2FVideo_Remote_Control_Profile_.28AVRCP.29">Audio/Video Remote Control Profile (AVRCP)</a>.
</dd>

<dt>DisplayPort</dt>
<dd>
Digital display interface by the Video Electronics Standards Association (VESA).
</dd>

<dt>dongle</dt>
<dd>
A <a href="https://en.wikipedia.org/wiki/Dongle">dongle</a>
is a small gadget, especially one that hangs off another device.
</dd>

<dt>HDMI</dt>
<dd>
High-Definition Multimedia Interface. Interface for transferring audio and
video data. For mobile devices, a micro-HDMI (type D) or MHL connector is used.
</dd>

<dt>Intel HDA</dt>
<dd>
Intel High Definition Audio (do not confuse with generic <em>high-definition
audio</em> or <em>high-resolution audio</em>). Specification for a front-panel
connector. For details, refer to
<a href="http://en.wikipedia.org/wiki/Intel_High_Definition_Audio">Intel High
Definition Audio</a>.
</dd>

<dt>interface</dt>
<dd>
An <a href="https://en.wikipedia.org/wiki/Interface_(computing)">interface</a>
converts a signal from one representation to another. Common interfaces
include a USB audio interface and a MIDI interface.
</dd>

<dt>line level</dt>
<dd>
<a href="http://en.wikipedia.org/wiki/Line_level">Line level</a> is the strength
of an analog audio signal that passes between audio components, not transducers.
</dd>

<dt>MHL</dt>
<dd>
Mobile High-Definition Link. Mobile audio/video interface, often over a
micro-USB connector.
</dd>

<dt>phone connector</dt>
<dd>
Mini or sub-mini component that connects a device to wired headphones, a
headset, or a line-level amplifier.
</dd>

<dt>SlimPort</dt>
<dd>
Adapter from micro-USB to HDMI.
</dd>

<dt>S/PDIF</dt>
<dd>
Sony/Philips Digital Interface Format. Interconnect for uncompressed PCM. For
details, refer to <a href="http://en.wikipedia.org/wiki/S/PDIF">S/PDIF</a>.
S/PDIF is the consumer-grade variant of <a href="https://en.wikipedia.org/wiki/AES3">AES3</a>.
</dd>

<dt>Thunderbolt</dt>
<dd>
Multimedia interface that competes with USB and HDMI for connecting to high-end
peripherals. For details, refer to <a href="http://en.wikipedia.org/wiki/Thunderbolt_%28interface%29">Thunderbolt</a>.
</dd>

<dt>TOSLINK</dt>
<dd>
<a href="https://en.wikipedia.org/wiki/TOSLINK">TOSLINK</a> is an optical audio cable
used with <em>S/PDIF</em>.
</dd>

<dt>USB</dt>
<dd>
Universal Serial Bus. For details, refer to
<a href="http://en.wikipedia.org/wiki/USB">USB</a>.
</dd>

</dl>

<h3 id="intraDeviceTerms">Intra-device interconnect</h3>

<p>
Intra-device interconnection technologies connect internal audio components
within a given device and are not visible without disassembling the device. The
HAL implementer may need to be aware of these, but not the end user. For details
on intra-device interconnections, refer to the following articles:
</p>
<ul>
<li><a href="http://en.wikipedia.org/wiki/General-purpose_input/output">GPIO</a></li>
<li><a href="http://en.wikipedia.org/wiki/I%C2%B2C">I²C</a>, for control channel</li>
<li><a href="http://en.wikipedia.org/wiki/I%C2%B2S">I²S</a>, for audio data, simpler than SLIMbus</li>
<li><a href="http://en.wikipedia.org/wiki/McASP">McASP</a></li>
<li><a href="http://en.wikipedia.org/wiki/SLIMbus">SLIMbus</a></li>
<li><a href="http://en.wikipedia.org/wiki/Serial_Peripheral_Interface_Bus">SPI</a></li>
<li><a href="http://en.wikipedia.org/wiki/AC%2797">AC'97</a></li>
<li><a href="http://en.wikipedia.org/wiki/Intel_High_Definition_Audio">Intel HDA</a></li>
<li><a href="http://mipi.org/specifications/soundwire">SoundWire</a></li>
</ul>

<p>
In
<a href="http://www.alsa-project.org/main/index.php/ASoC">ALSA System on Chip (ASoC)</a>,
these are collectively called
<a href="https://www.kernel.org/doc/Documentation/sound/alsa/soc/DAI.txt">Digital Audio Interfaces</a>
(DAI).
</p>

<h3 id="signalTerms">Audio Signal Path</h3>

<p>
Audio signal path terms relate to the signal path that audio data follows from
an application to the transducer, or vice versa.
</p>

<dl>

<dt>ADC</dt>
<dd>
Analog-to-digital converter. Module that converts an analog signal (continuous
in time and amplitude) to a digital signal (discrete in time and amplitude).
Conceptually, an ADC consists of a periodic sample-and-hold followed by a
quantizer, although it does not have to be implemented that way. An ADC is
usually preceded by a low-pass filter to remove any high-frequency components
that are not representable using the desired sample rate. For details, refer to
<a href="http://en.wikipedia.org/wiki/Analog-to-digital_converter">Analog-to-digital
converter</a>.
</dd>

<dt>AP</dt>
<dd>
Application processor. Main general-purpose computer on a mobile device.
</dd>

<dt>codec</dt>
<dd>
Coder-decoder. Module that encodes and/or decodes an audio signal from one
representation to another (typically analog to PCM or PCM to analog). In strict
terms, <em>codec</em> is reserved for modules that both encode and decode but
can be used loosely to refer to only one of these. For details, refer to
<a href="http://en.wikipedia.org/wiki/Audio_codec">Audio codec</a>.
</dd>

<dt>DAC</dt>
<dd>
Digital-to-analog converter. Module that converts a digital signal (discrete in
time and amplitude) to an analog signal (continuous in time and amplitude).
Often followed by a low-pass filter to remove high-frequency components
introduced by digital quantization. For details, refer to
<a href="http://en.wikipedia.org/wiki/Digital-to-analog_converter">Digital-to-analog
converter</a>.
</dd>

<dt>DSP</dt>
<dd>
Digital Signal Processor. Optional component typically located after the
application processor (for output) or before the application processor (for
input). Its primary purpose is to off-load the application processor and provide
signal processing features at a lower power cost.
</dd>

<dt>PDM</dt>
<dd>
Pulse-density modulation. Form of modulation used to represent an analog signal
by a digital signal, where the relative density of 1s versus 0s indicates the
signal level. Commonly used by digital-to-analog converters. For details, refer
to <a href="http://en.wikipedia.org/wiki/Pulse-density_modulation">Pulse-density
modulation</a>.
</dd>

<dt>PWM</dt>
<dd>
Pulse-width modulation. Form of modulation used to represent an analog signal by
a digital signal, where the relative width of a digital pulse indicates the
signal level. Commonly used by analog-to-digital converters. For details, refer
to <a href="http://en.wikipedia.org/wiki/Pulse-width_modulation">Pulse-width
modulation</a>.
</dd>

<dt>transducer</dt>
<dd>
Converts variations in physical real-world quantities to electrical signals. In
audio, the physical quantity is sound pressure, and the transducers are the
loudspeaker and microphone. For details, refer to
<a href="http://en.wikipedia.org/wiki/Transducer">Transducer</a>.
</dd>

</dl>

<h3 id="srcTerms">Sample Rate Conversion</h3>
<p>
Sample rate conversion terms relate to the process of converting from one
sample rate to another; a sketch of a simple resampler appears after this list.
</p>

<dl>

<dt>downsample</dt>
<dd>Resample, where sink sample rate &lt; source sample rate.</dd>

<dt>Nyquist frequency</dt>
<dd>
Maximum frequency component that can be represented by a discretized signal at
1/2 of a given sample rate. For example, the human hearing range extends to
approximately 20 kHz, so a digital audio signal must have a sample rate of at
least 40 kHz to represent that range. In practice, sample rates of 44.1 kHz and
48 kHz are commonly used, with Nyquist frequencies of 22.05 kHz and 24 kHz
respectively. For details, refer to
<a href="http://en.wikipedia.org/wiki/Nyquist_frequency">Nyquist frequency</a>
and
<a href="http://en.wikipedia.org/wiki/Hearing_range">Hearing range</a>.
</dd>

<dt>resampler</dt>
<dd>Synonym for sample rate converter.</dd>

<dt>resampling</dt>
<dd>Process of converting sample rate.</dd>

<dt>sample rate converter</dt>
<dd>Module that resamples.</dd>

<dt>sink</dt>
<dd>Output of a resampler.</dd>

<dt>source</dt>
<dd>Input to a resampler.</dd>

<dt>upsample</dt>
<dd>Resample, where sink sample rate &gt; source sample rate.</dd>

</dl>
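
<p>
As a rough illustration only (a production resampler typically uses
higher-quality filters than this), linear interpolation shows the basic
relationship between source and sink:
</p>
<pre>
// Illustrative linear-interpolation resampler for a mono float signal.
static float[] resampleLinear(float[] source, int sourceRateHz, int sinkRateHz) {
    int sinkLength = (int) ((long) source.length * sinkRateHz / sourceRateHz);
    float[] sink = new float[sinkLength];
    for (int i = 0; i &lt; sinkLength; i++) {
        double sourcePos = (double) i * sourceRateHz / sinkRateHz;
        int index = (int) sourcePos;
        double frac = sourcePos - index;
        float a = source[index];
        float b = (index + 1 &lt; source.length) ? source[index + 1] : a;
        sink[i] = (float) (a + frac * (b - a));
    }
    return sink;
}
</pre>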

<h2 id="androidSpecificTerms">Android-Specific Terms</h2>

<p>
Android-specific terms include terms used only in the Android audio framework
and generic terms that have special meaning within Android.
</p>

<dl>

<dt>ALSA</dt>
<dd>
Advanced Linux Sound Architecture. An audio framework for Linux that has also
influenced other systems. For a generic definition, refer to
<a href="http://en.wikipedia.org/wiki/Advanced_Linux_Sound_Architecture">ALSA</a>.
In Android, ALSA refers to the kernel audio framework and drivers and not to the
user-mode API. See also <em>tinyalsa</em>.
</dd>

<dt>audio device</dt>
<dd>
Audio I/O endpoint backed by a HAL implementation.
</dd>

<dt>AudioEffect</dt>
<dd>
API and implementation framework for output (post-processing) effects and input
(pre-processing) effects. The API is defined at
<a href="http://developer.android.com/reference/android/media/audiofx/AudioEffect.html">android.media.audiofx.AudioEffect</a>.
</dd>

<dt>AudioFlinger</dt>
<dd>
Android sound server implementation. AudioFlinger runs within the mediaserver
process. For a generic definition, refer to
<a href="http://en.wikipedia.org/wiki/Sound_server">Sound server</a>.
</dd>

<dt>audio focus</dt>
<dd>
Set of APIs for managing audio interactions across multiple independent apps.
For details, see <a href="http://developer.android.com/training/managing-audio/audio-focus.html">Managing Audio Focus</a> and the focus-related methods and constants of
<a href="http://developer.android.com/reference/android/media/AudioManager.html">android.media.AudioManager</a>.
</dd>
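
<dd>
<p>
For example, a music app typically requests focus before starting playback and
ducks or pauses when focus changes. A minimal sketch using the
android.media.AudioManager methods of this documentation's era (the context
variable and the listener bodies are assumptions):
</p>
<pre>
// Illustrative audio focus request before starting music playback.
AudioManager audioManager =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

AudioManager.OnAudioFocusChangeListener listener =
        new AudioManager.OnAudioFocusChangeListener() {
    @Override
    public void onAudioFocusChange(int focusChange) {
        if (focusChange == AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK) {
            // Duck: temporarily lower the playback volume (app-specific).
        } else if (focusChange == AudioManager.AUDIOFOCUS_LOSS) {
            // Stop or pause playback and abandon focus (app-specific).
        }
    }
};

int result = audioManager.requestAudioFocus(listener,
        AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN);
if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    // Start playback.
}
</pre>
</dd>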

<dt>AudioMixer</dt>
<dd>
Module in AudioFlinger responsible for combining multiple tracks and applying
attenuation (volume) and effects. For a generic definition, refer to
<a href="http://en.wikipedia.org/wiki/Audio_mixing_(recorded_music)">Audio mixing (recorded music)</a> (which discusses a mixer as a hardware device or software application, rather
than a software module within a system).
</dd>

<dt>audio policy</dt>
<dd>
Service responsible for all actions that require a policy decision to be made
first, such as opening a new I/O stream, re-routing after a change, and stream
volume management.
</dd>

<dt>AudioRecord</dt>
<dd>
Primary low-level client API for receiving data from an audio input device such
as a microphone. The data is usually in PCM format. The API is defined at
<a href="http://developer.android.com/reference/android/media/AudioRecord.html">android.media.AudioRecord</a>.
</dd>
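
<dd>
<p>
A minimal capture sketch using android.media.AudioRecord (the sample rate and
buffer sizes are illustrative; recording requires the RECORD_AUDIO permission):
</p>
<pre>
// Illustrative capture of 16-bit mono PCM from the microphone.
int sampleRateHz = 48000;
int minBufferBytes = AudioRecord.getMinBufferSize(sampleRateHz,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
        sampleRateHz, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, minBufferBytes * 2);

short[] buffer = new short[sampleRateHz / 10];  // 100 ms of mono frames
record.startRecording();
int samplesRead = record.read(buffer, 0, buffer.length);
record.stop();
record.release();
</pre>
</dd>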

<dt>AudioResampler</dt>
<dd>
Module in AudioFlinger responsible for <a href="src.html">sample rate conversion</a>.
</dd>

<dt>audio source</dt>
<dd>
An enumeration of constants that indicates the desired use case for capturing
audio input. For details, see <a href="http://developer.android.com/reference/android/media/MediaRecorder.AudioSource.html">audio source</a>. In API level 21 and higher,
<a href="attributes.html">audio attributes</a> are preferred.
</dd>

<dt>AudioTrack</dt>
<dd>
Primary low-level client API for sending data to an audio output device such as
a speaker. The data is usually in PCM format. The API is defined at
<a href="http://developer.android.com/reference/android/media/AudioTrack.html">android.media.AudioTrack</a>.
</dd>
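
<dd>
<p>
A minimal playback sketch using android.media.AudioTrack in streaming mode
(the sample rate, buffer size, and PCM source are illustrative):
</p>
<pre>
// Illustrative playback of 16-bit mono PCM through an AudioTrack.
int sampleRateHz = 48000;
int minBufferBytes = AudioTrack.getMinBufferSize(sampleRateHz,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateHz,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBufferBytes * 2, AudioTrack.MODE_STREAM);

short[] pcm = new short[sampleRateHz];  // one second of frames, filled elsewhere
track.play();
track.write(pcm, 0, pcm.length);        // blocks until the data is queued
track.stop();
track.release();
</pre>
</dd>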

<dt>audio_utils</dt>
<dd>
Audio utility library, largely independent of the Android platform, for features
such as PCM format conversion, WAV file I/O, and
<a href="avoiding_pi.html#nonBlockingAlgorithms">non-blocking FIFO</a>.
</dd>

<dt>client</dt>
<dd>
Usually an application or app client. However, an AudioFlinger client can be a
thread running within the mediaserver system process, such as when playing media
decoded by a MediaPlayer object.
</dd>

<dt>HAL</dt>
<dd>
Hardware Abstraction Layer. HAL is a generic term in Android; in audio, it is a
layer between AudioFlinger and the kernel device driver with a C API (which
replaces the C++ libaudio).
</dd>

<dt>FastCapture</dt>
<dd>
Thread within AudioFlinger that sends audio data to lower-latency fast tracks
and drives the input device when configured for reduced latency.
</dd>

<dt>FastMixer</dt>
<dd>
Thread within AudioFlinger that receives and mixes audio data from lower-latency
fast tracks and drives the primary output device when configured for reduced
latency.
</dd>

<dt>fast track</dt>
<dd>
AudioTrack or AudioRecord client with lower latency but fewer features on some
devices and routes.
</dd>

<dt>MediaPlayer</dt>
<dd>
Higher-level client API than AudioTrack. Plays encoded content or content that
includes multimedia audio and video tracks.
</dd>

<dt>media.log</dt>
<dd>
AudioFlinger debugging feature, available in custom builds only, that logs audio
events to a circular buffer from which they can be dumped retroactively when
needed.
</dd>

<dt>mediaserver</dt>
<dd>
Android system process that contains media-related services, including
AudioFlinger.
</dd>

<dt>NBAIO</dt>
<dd>
Non-blocking audio input/output. Abstraction for AudioFlinger ports. The term
can be misleading, as some implementations of the NBAIO API support blocking. The
key implementations of NBAIO are for different types of pipes.
</dd>

<dt>normal mixer</dt>
<dd>
Thread within AudioFlinger that services most full-featured AudioTrack clients.
It directly drives an output device or feeds its sub-mix into FastMixer via a
pipe.
</dd>

<dt>OpenSL ES</dt>
<dd>
Audio API standard by
<a href="http://www.khronos.org/">The Khronos Group</a>. Android versions since
API level 9 support a native audio API that is based on a subset of
<a href="http://www.khronos.org/opensles/">OpenSL ES 1.0.1</a>.
</dd>

<dt>silent mode</dt>
<dd>
User-settable feature to mute the phone ringer and notifications without
affecting media playback (music, videos, games) or alarms.
</dd>

<dt>SoundPool</dt>
<dd>
Higher-level client API than AudioTrack. Plays sampled audio clips. Useful for
triggering UI feedback, game sounds, etc. The API is defined at
<a href="http://developer.android.com/reference/android/media/SoundPool.html">android.media.SoundPool</a>.
</dd>
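
<dd>
<p>
A minimal sketch of loading and playing a clip with android.media.SoundPool,
using the constructor of this documentation's era (the context variable and the
R.raw.click resource are hypothetical):
</p>
<pre>
// Illustrative use of SoundPool for a short UI sound.
SoundPool soundPool = new SoundPool(4 /* maxStreams */,
        AudioManager.STREAM_MUSIC, 0 /* srcQuality, unused */);
final int clickId = soundPool.load(context, R.raw.click, 1 /* priority */);

soundPool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
    @Override
    public void onLoadComplete(SoundPool pool, int sampleId, int status) {
        if (status == 0) {
            // leftVolume, rightVolume, priority, loop count, playback rate
            pool.play(clickId, 1.0f, 1.0f, 1, 0, 1.0f);
        }
    }
});
</pre>
</dd>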

<dt>Stagefright</dt>
<dd>
See <a href="{@docRoot}devices/media.html">Media</a>.
</dd>

<dt>StateQueue</dt>
<dd>
Module within AudioFlinger responsible for synchronizing state among threads.
Whereas NBAIO is used to pass data, StateQueue is used to pass control
information.
</dd>

<dt>strategy</dt>
<dd>
Group of stream types with similar behavior. Used by the audio policy service.
</dd>

<dt>stream type</dt>
<dd>
Enumeration that expresses a use case for audio output. The audio policy
implementation uses the stream type, along with other parameters, to determine
volume and routing decisions. For a list of stream types, see
<a href="http://developer.android.com/reference/android/media/AudioManager.html">android.media.AudioManager</a>.
</dd>

<dt>tee sink</dt>
<dd>
See <a href="debugging.html#teeSink">Audio Debugging</a>.
</dd>

<dt>tinyalsa</dt>
<dd>
Small user-mode API above the ALSA kernel interface, with a BSD license.
Recommended for HAL implementations.
</dd>

<dt>ToneGenerator</dt>
<dd>
Higher-level client API than AudioTrack. Plays dual-tone multi-frequency (DTMF)
signals. For details, refer to
<a href="http://en.wikipedia.org/wiki/Dual-tone_multi-frequency_signaling">Dual-tone
multi-frequency signaling</a> and the API definition at
<a href="http://developer.android.com/reference/android/media/ToneGenerator.html">android.media.ToneGenerator</a>.
</dd>
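
<dd>
<p>
A minimal sketch of playing one DTMF digit with android.media.ToneGenerator:
</p>
<pre>
// Illustrative playback of the DTMF tone for digit 5 at 80% volume.
ToneGenerator toneGenerator =
        new ToneGenerator(AudioManager.STREAM_DTMF, 80 /* volume, 0-100 */);
toneGenerator.startTone(ToneGenerator.TONE_DTMF_5, 200 /* durationMs */);
// Later, when finished with the generator:
toneGenerator.release();
</pre>
</dd>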

<dt>track</dt>
<dd>
Audio stream. Controlled by the AudioTrack or AudioRecord API.
</dd>

<dt>volume attenuation curve</dt>
<dd>
Device-specific mapping from a generic volume index to a specific attenuation
factor for a given output.
</dd>

<dt>volume index</dt>
<dd>
Unitless integer that expresses the desired relative volume of a stream. The
volume-related APIs of
<a href="http://developer.android.com/reference/android/media/AudioManager.html">android.media.AudioManager</a>
operate in volume indices rather than absolute attenuation factors.
</dd>

</dl>