page.title=Audio
@jd:body

<!--
    Copyright 2010 The Android Open Source Project

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->
<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>

<p>
  Android's audio HAL connects the higher-level, audio-specific
  framework APIs in <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a>
  to the underlying audio driver and hardware.
</p>

<p>
  The following figure and list describe how audio functionality is implemented and the
  source code involved at each level:
</p>
<p>
  <img src="images/audio_hal.png">
</p>
<dl>
  <dt>
    Application framework
  </dt>
  <dd>
    At the application framework level is the app code, which uses the
    <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a>
    APIs to interact with the audio hardware. Internally, this code calls corresponding JNI glue
    classes to access the native code that interacts with the audio hardware.
  </dd>
  <dt>
    JNI
  </dt>
  <dd>
    The JNI code associated with <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a> is located in the
    <code>frameworks/base/core/jni/</code> and <code>frameworks/base/media/jni</code> directories.
    This code calls the lower-level native code to obtain access to the audio hardware.
  </dd>
  <dt>
    Native framework
  </dt>
  <dd>
    The native framework is defined in <code>frameworks/av/media/libmedia</code> and provides a
    native equivalent to the <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a> package. The native framework calls the Binder
    IPC proxies to obtain access to the audio-specific services of the media server.
  </dd>
  <dt>
    Binder IPC
  </dt>
  <dd>
    The Binder IPC proxies facilitate communication across process boundaries. They are located in
    the <code>frameworks/av/media/libmedia</code> directory and begin with the letter "I".
  </dd>
  <dt>
    Media Server
  </dt>
  <dd>
    The audio services in the media server, located in
    <code>frameworks/av/services/audioflinger</code>, are the actual code that interacts with your
    HAL implementations.
  </dd>
  <dt>
    HAL
  </dt>
  <dd>
    The hardware abstraction layer defines the standard interface that the audio services call into
    and that you must implement for your audio hardware to function correctly. The audio HAL
    interfaces are located in <code>hardware/libhardware/include/hardware</code>.
  </dd>
  <dt>
    Kernel Driver
  </dt>
  <dd>
    The audio driver interacts with the hardware and your implementation of the HAL. You can
    use ALSA, OSS, or a custom driver of your own at this level. The HAL is driver-agnostic.
    <p>
      <strong>Note:</strong> If you choose ALSA, we recommend using <code>external/tinyalsa</code>
      for the user portion of the driver because of its compatible licensing (the standard
      user-mode library is GPL-licensed).
    </p>
  </dd>
</dl>
<h2 id="implementing">
  Implementing the HAL
</h2>
<p>
  The audio HAL is composed of three different interfaces that you must implement:
</p>
<ul>
  <li>
    <code>hardware/libhardware/include/hardware/audio.h</code> - represents the main functions of
    an audio device.
  </li>
  <li>
    <code>hardware/libhardware/include/hardware/audio_policy.h</code> - represents the audio policy
    manager, which handles audio routing and volume control policies.
  </li>
  <li>
    <code>hardware/libhardware/include/hardware/audio_effect.h</code> - represents effects that can
    be applied to audio, such as downmixing, echo, or noise suppression.
  </li>
</ul>
<p>See the implementation for the Galaxy Nexus at <code>device/samsung/tuna/audio</code> for an example.</p>
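<p>
  Each HAL is loaded as a module through the entry points declared in
  <code>hardware/libhardware/include/hardware/hardware.h</code>. The following is a minimal
  sketch (not a complete implementation) of the boilerplate an <code>audio.h</code>
  implementation typically provides: a <code>HAL_MODULE_INFO_SYM</code> module definition and an
  open function that allocates the <code>audio_hw_device</code>. The stubbed logic here is
  illustrative only; a real HAL, such as <code>device/samsung/tuna/audio/audio_hw.c</code>,
  fills in all of the device and stream callbacks.
</p>
<pre>
/* Illustrative sketch of the audio HAL module boilerplate. */
#include &lt;errno.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;string.h&gt;

#include &lt;hardware/hardware.h&gt;
#include &lt;hardware/audio.h&gt;

static int adev_close(hw_device_t *device)
{
    free(device);
    return 0;
}

static int adev_open(const hw_module_t *module, const char *name,
                     hw_device_t **device)
{
    struct audio_hw_device *adev;

    if (strcmp(name, AUDIO_HARDWARE_INTERFACE) != 0)
        return -EINVAL;

    adev = calloc(1, sizeof(*adev));
    if (adev == NULL)
        return -ENOMEM;

    adev->common.tag = HARDWARE_DEVICE_TAG;
    adev->common.version = AUDIO_DEVICE_API_VERSION_CURRENT;
    adev->common.module = (struct hw_module_t *)module;
    adev->common.close = adev_close;
    /* Assign the remaining audio_hw_device function pointers here:
     * open_output_stream, open_input_stream, set_parameters, and so on. */

    *device = &amp;adev->common;
    return 0;
}

static struct hw_module_methods_t hal_module_methods = {
    .open = adev_open,
};

struct audio_module HAL_MODULE_INFO_SYM = {
    .common = {
        .tag = HARDWARE_MODULE_TAG,
        .module_api_version = AUDIO_MODULE_API_VERSION_0_1,
        .hal_api_version = HARDWARE_HAL_API_VERSION,
        .id = AUDIO_HARDWARE_MODULE_ID,
        .name = "Example audio HAL",
        .author = "The Android Open Source Project",
        .methods = &amp;hal_module_methods,
    },
};
</pre>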

<p>In addition to implementing the HAL, you need to create a
  <code>device/&lt;company_name&gt;/&lt;device_name&gt;/audio/audio_policy.conf</code> file
  that declares the audio devices present on your product. For an example, see the file for
  the Galaxy Nexus audio hardware in <code>device/samsung/tuna/audio/audio_policy.conf</code>.
  Also, see the <code>system/core/include/system/audio.h</code> and
  <code>system/core/include/system/audio_policy.h</code> header files for a reference to the
  properties that you can define.
</p>
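<p>
  The overall shape of an <code>audio_policy.conf</code> file is a <code>global_configuration</code>
  block that lists the attached and default devices, followed by an <code>audio_hw_modules</code>
  block that describes the outputs and inputs of each audio HAL module. The following minimal
  sketch is illustrative only; the devices, sampling rates, channel masks, and formats shown are
  placeholders, so list the ones your hardware actually supports:
</p>
<pre>
global_configuration {
  attached_output_devices AUDIO_DEVICE_OUT_SPEAKER
  default_output_device AUDIO_DEVICE_OUT_SPEAKER
  attached_input_devices AUDIO_DEVICE_IN_BUILTIN_MIC
}

audio_hw_modules {
  primary {
    outputs {
      primary {
        sampling_rates 44100
        channel_masks AUDIO_CHANNEL_OUT_STEREO
        formats AUDIO_FORMAT_PCM_16_BIT
        devices AUDIO_DEVICE_OUT_SPEAKER|AUDIO_DEVICE_OUT_WIRED_HEADPHONE
        flags AUDIO_OUTPUT_FLAG_PRIMARY
      }
    }
    inputs {
      primary {
        sampling_rates 8000|16000|44100
        channel_masks AUDIO_CHANNEL_IN_MONO|AUDIO_CHANNEL_IN_STEREO
        formats AUDIO_FORMAT_PCM_16_BIT
        devices AUDIO_DEVICE_IN_BUILTIN_MIC|AUDIO_DEVICE_IN_WIRED_HEADSET
      }
    }
  }
}
</pre>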
<h3 id="multichannel">Multi-channel support</h3>
<p>If your hardware and driver support multichannel audio via HDMI, you can output the audio stream
  directly to the audio hardware. This bypasses the AudioFlinger mixer, so the stream is not
  downmixed to two channels.</p>

<p>
  The audio HAL must expose whether an output stream profile supports multichannel audio.
  If the HAL exposes this capability, the default policy manager allows multichannel playback over
  HDMI.</p>
<p>For more implementation details, see <code>device/samsung/tuna/audio/audio_hw.c</code> in the Jelly Bean release.</p>

<p>
  To specify that your product contains a multichannel audio output, edit the <code>audio_policy.conf</code> file to describe the multichannel
  output for your product. The following example from the Galaxy Nexus shows a "dynamic" channel mask, which means the audio policy manager
  queries the actual channel masks supported by the HDMI sink after connection. You can also specify a static channel mask such as <code>AUDIO_CHANNEL_OUT_5POINT1</code>.
  </p>
<pre>
audio_hw_modules {
  primary {
    outputs {
        ...
        hdmi {
          sampling_rates 44100|48000
          channel_masks dynamic
          formats AUDIO_FORMAT_PCM_16_BIT
          devices AUDIO_DEVICE_OUT_AUX_DIGITAL
          flags AUDIO_OUTPUT_FLAG_DIRECT
        }
        ...
    }
    ...
  }
  ...
}
</pre>
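<p>
  When a "dynamic" channel mask is declared, the audio policy manager discovers the supported
  masks by querying the output stream's parameters with the
  <code>AUDIO_PARAMETER_STREAM_SUP_CHANNELS</code> key defined in <code>hardware/audio.h</code>.
  The following fragment is a hedged sketch of how a HAL's <code>get_parameters</code> callback
  might answer that query; <code>query_hdmi_channel_masks()</code> is a hypothetical helper that
  stands in for however your platform reads the sink's capabilities (for example, from EDID):
</p>
<pre>
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;string.h&gt;

#include &lt;hardware/audio.h&gt;

/* Hypothetical helper: returns a string such as
 * "AUDIO_CHANNEL_OUT_STEREO|AUDIO_CHANNEL_OUT_5POINT1" based on the
 * capabilities reported by the connected HDMI sink. */
extern const char *query_hdmi_channel_masks(void);

static char *out_get_parameters(const struct audio_stream *stream, const char *keys)
{
    (void)stream;

    /* The policy manager asks for "sup_channels" on direct outputs with a
     * dynamic channel mask; reply with the masks the sink supports. */
    if (strstr(keys, AUDIO_PARAMETER_STREAM_SUP_CHANNELS) != NULL) {
        const char *masks = query_hdmi_channel_masks();
        size_t len = strlen(AUDIO_PARAMETER_STREAM_SUP_CHANNELS) + 1 + strlen(masks) + 1;
        char *reply = malloc(len);
        if (reply != NULL)
            snprintf(reply, len, "%s=%s", AUDIO_PARAMETER_STREAM_SUP_CHANNELS, masks);
        return reply;
    }

    /* The caller frees the returned string, so always return allocated memory. */
    return strdup("");
}
</pre>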


<p>If your product does not support multichannel audio, AudioFlinger's mixer automatically
  downmixes the content to stereo before sending it to the audio device.</p>

<h3 id="codecs">Media Codecs</h3>

<p>Ensure that the audio codecs that your hardware and drivers support are properly declared for your product. See
  <a href="media.html#expose">Exposing Codecs to the Framework</a> for information on how to do this.
</p>

<h2 id="configuring">
  Configuring the Shared Library
</h2>
<p>
  You need to package the HAL implementation into a shared library and copy it to the
  appropriate location by creating an <code>Android.mk</code> file:
</p>
<ol>
  <li>Create a <code>device/&lt;company_name&gt;/&lt;device_name&gt;/audio</code> directory
  to contain your library's source files.
  </li>
  <li>Create an <code>Android.mk</code> file to build the shared library. Ensure that the
  Makefile contains the following line:
<pre>
LOCAL_MODULE := audio.primary.&lt;device_name&gt;
</pre>
    <p>
      Notice that your library must be named <code>audio.primary.&lt;device_name&gt;.so</code> so
      that Android can correctly load the library. The "<code>primary</code>" portion of this
      filename indicates that this shared library is for the primary audio hardware located on the
      device. The module names <code>audio.a2dp.&lt;device_name&gt;</code> and
      <code>audio.usb.&lt;device_name&gt;</code> are also available for Bluetooth and USB audio
      interfaces. Here is an example of an <code>Android.mk</code> from the Galaxy
      Nexus audio hardware:
    </p>
    <pre>
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE := audio.primary.tuna
LOCAL_MODULE_PATH := $(TARGET_OUT_SHARED_LIBRARIES)/hw
LOCAL_SRC_FILES := audio_hw.c ril_interface.c
LOCAL_C_INCLUDES += \
        external/tinyalsa/include \
        $(call include-path-for, audio-utils) \
        $(call include-path-for, audio-effects)
LOCAL_SHARED_LIBRARIES := liblog libcutils libtinyalsa libaudioutils libdl
LOCAL_MODULE_TAGS := optional

include $(BUILD_SHARED_LIBRARY)
</pre>
  </li>
  <li>If your product supports low latency audio as specified by the Android CDD, copy the
  corresponding XML feature file into your product. For example, in your product's
  <code>device/&lt;company_name&gt;/&lt;device_name&gt;/device.mk</code>
  Makefile:
    <pre>
PRODUCT_COPY_FILES := ...

PRODUCT_COPY_FILES += \
        frameworks/native/data/etc/android.hardware.audio.low_latency.xml:system/etc/permissions/android.hardware.audio.low_latency.xml
</pre>
  </li>

  <li>In your product's <code>device/&lt;company_name&gt;/&lt;device_name&gt;/device.mk</code>
  Makefile, copy the <code>audio_policy.conf</code> file that you created earlier to the
  <code>system/etc/</code> directory. For example:
    <pre>
PRODUCT_COPY_FILES += \
        device/samsung/tuna/audio/audio_policy.conf:system/etc/audio_policy.conf
</pre>
  </li>
  <li>Declare the shared modules of your audio HAL that are required by your product in the product's
    <code>device/&lt;company_name&gt;/&lt;device_name&gt;/device.mk</code> Makefile. For example, the
  Galaxy Nexus requires the primary and Bluetooth audio HAL modules:
<pre>
PRODUCT_PACKAGES += \
        audio.primary.tuna \
        audio.a2dp.default
</pre>
  </li>
</ol>

<h2 id="preprocessing">Audio preprocessing effects</h2>
<p>
The Android platform provides audio effects on supported devices through the
<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">audiofx</a>
package, which developers can access. For example, the Nexus 10 supports the following pre-processing effects:</p>
<ul>
  <li><a href="http://developer.android.com/reference/android/media/audiofx/AcousticEchoCanceler.html">Acoustic Echo Cancellation</a></li>
  <li><a href="http://developer.android.com/reference/android/media/audiofx/AutomaticGainControl.html">Automatic Gain Control</a></li>
  <li><a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">Noise Suppression</a></li>
</ul>


<p>Pre-processing effects are always paired with the use case in which the pre-processing is requested. In Android
app development, a use case is referred to as an <code>AudioSource</code>; app developers
request the <code>AudioSource</code> abstraction rather than a specific audio hardware device.
The Android Audio Policy Manager maps an <code>AudioSource</code> to the actual hardware with <code>AudioPolicyManagerBase::getDeviceForInputSource(int
inputSource)</code>. In Android 4.2, the following sources are exposed to developers:
</p>
<ul>
<li><code>android.media.MediaRecorder.AudioSource.CAMCORDER</code></li>
<li><code>android.media.MediaRecorder.AudioSource.VOICE_COMMUNICATION</code></li>
<li><code>android.media.MediaRecorder.AudioSource.VOICE_CALL</code></li>
<li><code>android.media.MediaRecorder.AudioSource.VOICE_DOWNLINK</code></li>
<li><code>android.media.MediaRecorder.AudioSource.VOICE_UPLINK</code></li>
<li><code>android.media.MediaRecorder.AudioSource.VOICE_RECOGNITION</code></li>
<li><code>android.media.MediaRecorder.AudioSource.MIC</code></li>
<li><code>android.media.MediaRecorder.AudioSource.DEFAULT</code></li>
</ul>

<p>The default pre-processing effects that are applied for each <code>AudioSource</code> are
specified in the <code>/system/etc/audio_effects.conf</code> file. To specify
your own default effects for every <code>AudioSource</code>, create a <code>/system/vendor/etc/audio_effects.conf</code> file
and specify the pre-processing effects that you need to turn on. For an example,
see the implementation for the Nexus 10 in <code>device/samsung/manta/audio_effects.conf</code>.</p>

<p class="warning"><strong>Warning:</strong> For the <code>VOICE_RECOGNITION</code> use case, do not enable
the noise suppression pre-processing effect. It should not be turned on by default when recording from this audio source,
and you should not enable it in your own <code>audio_effects.conf</code> file. Turning on the effect by default causes the device to fail
the <a href="/compatibility/index.html">compatibility requirement</a>,
regardless of whether it was enabled by default through the configuration file or by the audio HAL implementation's default behavior.</p>

<p>The following example enables pre-processing for the VoIP <code>AudioSource</code> and Camcorder <code>AudioSource</code>
(<code>aec</code> is acoustic echo cancellation, <code>ns</code> is noise suppression, and <code>agc</code> is automatic gain control).
By declaring the <code>AudioSource</code> configuration in this manner, the framework automatically requests that the audio HAL use those effects.</p>

<pre>
pre_processing {
   voice_communication {
       aec {}
       ns {}
   }
   camcorder {
       agc {}
   }
}
</pre>

<h3 id="tuning">Source tuning</h3>
<p>For <code>AudioSource</code> tuning, there are no explicit requirements on audio gain or audio processing,
with the exception of voice recognition (<code>VOICE_RECOGNITION</code>).</p>

<p>The requirements for voice recognition are:</p>

<ul>
<li>"Flat" frequency response (+/- 3 dB) from 100 Hz to 4 kHz</li>
<li>Close-talk configuration: 90 dB SPL reads an RMS of 2500 (16-bit samples)</li>
<li>Level tracks linearly from -18 dB to +12 dB relative to 90 dB SPL (see the example after this list)</li>
<li>THD &lt; 1% (90 dB SPL in the 100 Hz to 4 kHz range)</li>
<li>8 kHz sampling rate (anti-aliasing)</li>
<li>Effects and pre-processing must be disabled by default</li>
</ul>
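<p>
  One way to read the level-tracking requirement is as a calibration curve: with 90 dB SPL mapped
  to an RMS of 2500, the RMS expected at another level is 2500 &times; 10<sup>(SPL - 90)/20</sup>.
  The short program below is illustrative only and simply prints the values implied at the
  extremes of the -18 dB to +12 dB range:
</p>
<pre>
/* Illustrative check of the voice recognition level-tracking requirement. */
#include &lt;math.h&gt;
#include &lt;stdio.h&gt;

/* With 90 dB SPL calibrated to an RMS of 2500 (16-bit samples), the
 * expected RMS at another level follows 2500 * 10^((spl - 90) / 20). */
static double expected_rms(double spl_db)
{
    return 2500.0 * pow(10.0, (spl_db - 90.0) / 20.0);
}

int main(void)
{
    printf("72 dB SPL  -> RMS %.0f\n", expected_rms(72.0));   /* about 315  */
    printf("90 dB SPL  -> RMS %.0f\n", expected_rms(90.0));   /* 2500       */
    printf("102 dB SPL -> RMS %.0f\n", expected_rms(102.0));  /* about 9953 */
    return 0;
}
</pre>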

<p>Examples of tuning different effects for different sources are:</p>

<ul>
  <li>Noise Suppressor
    <ul>
      <li>Tuned for wind noise suppression for <code>CAMCORDER</code></li>
      <li>Tuned for stationary noise suppression for <code>VOICE_COMMUNICATION</code></li>
    </ul>
  </li>
  <li>Automatic Gain Control
    <ul>
      <li>Tuned for close-talk for <code>VOICE_COMMUNICATION</code> and the main phone mic</li>
      <li>Tuned for far-talk for <code>CAMCORDER</code></li>
    </ul>
  </li>
</ul>

<h3 id="more">More information</h3>
<p>For more information, see:</p>
<ul>
<li>Android documentation for the <a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">audiofx
package</a></li>
<li>Android documentation for the <a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">Noise Suppression audio effect</a></li>
<li><code>device/samsung/manta/audio_effects.conf</code> file for the Nexus 10</li>
</ul>