<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<html xmlns="http://www.w3.org/1999/xhtml">

<head>
<title>OpenSL ES for Android</title>
</head>

<body>

<h1 id="title">OpenSL ES for Android</h1>

<p>
This article describes the Android native audio APIs based on the
Khronos Group OpenSL ES&#8482; 1.0.1 standard.
</p>
<p>
Unless otherwise noted,
all features are available at Android API level 9 (Android platform
version 2.3) and higher.
Some features are only available at Android API level 14 (Android
platform version 4.0) and higher; these are noted.
</p>
<p>
OpenSL ES provides a C language interface that is also callable from C++. It
exposes features similar to the audio portions of these Android Java APIs:
</p>
<ul>
<li><a href="http://developer.android.com/reference/android/media/MediaPlayer.html">
android.media.MediaPlayer</a>
</li>
<li><a href="http://developer.android.com/reference/android/media/MediaRecorder.html">
android.media.MediaRecorder</a>
</li>
</ul>

<p>
As with all of the Android Native Development Kit (NDK), the primary
purpose of OpenSL ES for Android is to facilitate the implementation
of shared libraries to be called using the Java Native
Interface (JNI).  The NDK is not intended for writing pure C/C++
applications.  That said, OpenSL ES is a full-featured API, and you
should be able to accomplish most of your audio
needs using only this API, without up-calls to code running in the Android runtime.
</p>

<p>
Note: though based on OpenSL ES, the Android native audio API
is <i>not</i> a conforming implementation of any OpenSL ES 1.0.1
profile (game, music, or phone), because Android does not
implement all of the features required by any one of the profiles.
Any known cases where Android behaves differently than the specification
are described in the section "Android extensions" below.
</p>

<h2 id="gettingStarted">Getting started</h2>

<h3 id="exampleCode">Example code</h3>

<h4 id="recommended">Recommended</h4>

<p>
Supported and tested example code, usable as a model
for your own code, is located in the NDK folder
<code>platforms/android-9/samples/native-audio/</code> and in the
<a href="https://github.com/googlesamples/android-ndk/tree/master/audio-echo">audio-echo</a>
and
<a href="https://github.com/googlesamples/android-ndk/tree/master/native-audio">native-audio</a>
folders of the repository
<a href="https://github.com/googlesamples/android-ndk">https://github.com/googlesamples/android-ndk</a>.
</p>

<h4 id="notRecommended">Not recommended</h4>

<p>
The OpenSL ES 1.0.1 specification contains example code in the
appendices (see the section "References" below for the link to the
specification).  However, the examples in Appendix B: Sample Code
and Appendix C: Use Case Sample Code use features
not supported by Android. Some examples also contain
typographical errors, or use APIs that are likely to change.
Refer to them with caution;
though the code may be helpful in understanding the full OpenSL ES
standard, it should not be used as is with Android.
</p>

<h3 id="adding">Adding OpenSL ES to your application source code</h3>

<p>
OpenSL ES is a C API, but it is callable from both C and C++ code.
</p>
<p>
At a minimum, add the following line to your code:
</p>
<pre>
#include &lt;SLES/OpenSLES.h&gt;
</pre>

<p>
If you use Android extensions, also include this header:
</p>
<pre>
#include &lt;SLES/OpenSLES_Android.h&gt;
</pre>
<p>
which automatically includes these headers as well (you don't need to
include them yourself; they are listed here as an aid in learning the API):
</p>
<pre>
#include &lt;SLES/OpenSLES_AndroidConfiguration.h&gt;
#include &lt;SLES/OpenSLES_AndroidMetadata.h&gt;
</pre>
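<p>
Throughout these headers, an interface is a pointer to a pointer to a table of
function pointers, so every call dereferences the interface and passes it back
as the first argument. The following toy fragment illustrates the shape of that
idiom; the <code>Toy</code> type and function names are illustrative stand-ins,
not declarations from the NDK headers:
</p>

```c
#include <assert.h>

/* Toy stand-in for the OpenSL ES calling convention.  Real interface
 * types such as SLObjectItf are pointers to a pointer to a table of
 * function pointers; the names below are illustrative only. */
typedef const struct ToyItf_ **ToyItf;

struct ToyItf_ {
    int (*GetState)(ToyItf self);
};

static int toy_get_state(ToyItf self) {
    (void) self;
    return 1;  /* stands in for a state such as "realized" */
}

static const struct ToyItf_ g_vtable = { toy_get_state };

/* Build an interface and call through it the OpenSL ES way. */
static int demo_call(void) {
    const struct ToyItf_ *obj = &g_vtable;
    ToyItf itf = &obj;
    /* The idiom: dereference the interface, then pass it back
     * as the first argument. */
    return (*itf)->GetState(itf);
}
```

<p>
Every real call, for example <code>(*engineObject)-&gt;Realize(engineObject,
SL_BOOLEAN_FALSE)</code>, follows this same pattern.
</p>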

<h3 id="makefile">Makefile</h3>

<p>
Modify your Android.mk as follows:
</p>
<pre>
LOCAL_LDLIBS += -lOpenSLES
</pre>

<h3 id="audioContent">Audio content</h3>

<p>
There are many ways to package audio content for your
application, including:
</p>

<dl>

<dt>Resources</dt>
<dd>
If you place your audio files in the <code>res/raw/</code> folder,
they can be accessed easily by the associated APIs for
<a href="http://developer.android.com/reference/android/content/res/Resources.html">
Resources</a>.  However, there is no direct native access to resources,
so you will need to write Java programming language code to copy them out before use.
</dd>

<dt>Assets</dt>
<dd>
If you place your audio files in the <code>assets/</code> folder,
they will be directly accessible by the Android native asset manager
APIs.  See the header files <code>android/asset_manager.h</code>
and <code>android/asset_manager_jni.h</code> for more information
on these APIs.  The example code
located in the NDK folder
<code>platforms/android-9/samples/native-audio/</code> uses these
native asset manager APIs in conjunction with the Android file
descriptor data locator.
</dd>

<dt>Network</dt>
<dd>
You can use the URI data locator to play audio content directly from the
network. However, be sure to read the section "Security and permissions" below.
</dd>

<dt>Local filesystem</dt>
<dd>
The URI data locator supports the <code>file:</code> scheme for local files,
provided the files are accessible by the application.
Note that the Android security framework restricts file access via
the Linux user ID and group ID mechanism.
</dd>

<dt>Recorded</dt>
<dd>Your application can record audio data from the microphone input,
store this content, and then play it back later.
The example code uses this method for the "Playback" clip.
</dd>

<dt>Compiled and linked inline</dt>
<dd>
You can link your audio content directly into the shared library,
and then play it using an audio player with a buffer queue data locator.  This is most
suitable for short PCM format clips.  The example code uses this
technique for the "Hello" and "Android" clips. The PCM data was
converted to hex strings using a <code>bin2c</code> tool (not supplied).
</dd>

<dt>Real-time synthesis</dt>
<dd>
Your application can synthesize PCM data on the fly and then play it
using an audio player with a buffer queue data locator.  This is a
relatively advanced technique, and the details of audio synthesis
are beyond the scope of this article.
</dd>

</dl>

<p>
Finding or creating useful audio content for your application is
beyond the scope of this article, but see the "References" section
below for some suggested web search terms.
</p>
<p>
It is your responsibility to ensure that you are legally
permitted to play or record content, and be aware that there may be
privacy considerations for recording content.
</p>

<h3 id="debugging">Debugging</h3>

<p>
For robustness, we recommend that you examine the <code>SLresult</code>
value that is returned by most APIs. Use of <code>assert</code>
vs. more advanced error handling logic is a matter of coding style
and the particular API; see the Wikipedia article on
<a href="http://en.wikipedia.org/wiki/Assertion_(computing)">assert</a>
for more information. In the supplied example, we have used <code>assert</code>
for "impossible" conditions that would indicate a coding error, and
explicit error handling for others that are more likely to occur
in production.
</p>
<p>
Many API errors result in a log entry, in addition to a non-zero
result code. These log entries provide additional detail that can
be especially useful for the more complex APIs such as
<code>Engine::CreateAudioPlayer</code>.
</p>
<p>
Use <a href="http://developer.android.com/guide/developing/tools/adb.html">
adb logcat</a>, the
<a href="http://developer.android.com/tools/help/adt.html">
Eclipse ADT plugin</a> LogCat pane, or
<a href="http://developer.android.com/guide/developing/tools/ddms.html#logcat">
ddms logcat</a> to see the log.
</p>
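<p>
The split between <code>assert</code> for impossible conditions and explicit
handling for expected failures can be sketched as follows. The
<code>SLresult</code> values are reproduced from the 1.0.1 header so the
fragment stands alone, and <code>handle_result</code> is a hypothetical helper,
not an NDK function:
</p>

```c
#include <assert.h>
#include <stdio.h>

/* Result codes as defined in <SLES/OpenSLES.h>; reproduced here so
 * the fragment stands alone. */
typedef unsigned int SLresult;
#define SL_RESULT_SUCCESS           ((SLresult) 0x00000000)
#define SL_RESULT_PARAMETER_INVALID ((SLresult) 0x00000002)

/* Hypothetical error-handling helper for failures that can occur in
 * production: log the result and let the caller decide how to
 * recover, rather than aborting.  Returns 0 on success. */
static int handle_result(SLresult r) {
    if (r == SL_RESULT_SUCCESS)
        return 0;
    fprintf(stderr, "OpenSL ES call failed: result %u\n", r);
    return -1;  /* caller chooses a recovery strategy */
}
```

<p>
By contrast, a failure that can only indicate a coding error, such as
<code>GetInterface</code> failing for an interface that was requested at object
creation, is a reasonable candidate for
<code>assert(SL_RESULT_SUCCESS == result)</code>.
</p>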

<h2 id="supportedFeatures">Supported features from OpenSL ES 1.0.1</h2>

<p>
This section summarizes the available features. In some
cases there are limitations, which are described in the
"Limitations" sub-section below.
</p>

<h3 id="globalEntry">Global entry points</h3>

<p>
Supported global entry points:
</p>
<ul>
<li><code>slCreateEngine</code>
</li>
<li><code>slQueryNumSupportedEngineInterfaces</code>
</li>
<li><code>slQuerySupportedEngineInterfaces</code>
</li>
</ul>

<h3 id="objectsInterfaces">Objects and interfaces</h3>

<p>
The following figure indicates the objects and interfaces supported by
Android's OpenSL ES implementation.  A green cell means the feature
is supported.
</p>

<p>
<img src="chart1.png" alt="Supported objects and interfaces" />
</p>

<h3 id="limitations">Limitations</h3>

<p>
This section details limitations with respect to the supported
objects and interfaces from the previous section.
</p>

<h4 id="bufferQueueDataLocator">Buffer queue data locator</h4>

<p>
An audio player or recorder with a buffer queue data locator supports
the PCM data format only.
</p>

<h4 id="deviceDataLocator">Device data locator</h4>

<p>
The only supported use of an I/O device data locator is when it is
specified as the data source for <code>Engine::CreateAudioRecorder</code>.
It should be initialized using these values, as shown in the example:
</p>
<pre>
SLDataLocator_IODevice loc_dev =
  {SL_DATALOCATOR_IODEVICE, SL_IODEVICE_AUDIOINPUT,
  SL_DEFAULTDEVICEID_AUDIOINPUT, NULL};
</pre>

<h4 id="dynamicInterfaceManagementSupported">Dynamic interface management</h4>

<p>
<code>RemoveInterface</code> and <code>ResumeInterface</code> are not supported.
</p>

<h4 id="effectCombinations">Effect combinations</h4>

<p>
It is meaningless to have both environmental reverb and preset
reverb on the same output mix.
</p>
<p>
The platform may ignore effect requests if it estimates that the
CPU load would be too high.
</p>

<h4 id="effectSend">Effect send</h4>

<p>
<code>SetSendLevel</code> supports a single send level per audio player.
</p>

<h4 id="environmentalReverb">Environmental reverb</h4>

<p>
Environmental reverb does not support the <code>reflectionsDelay</code>,
<code>reflectionsLevel</code>, or <code>reverbDelay</code> fields of
<code>struct SLEnvironmentalReverbSettings</code>.
</p>

<h4 id="mimeDataFormat">MIME data format</h4>

<p>
The MIME data format can be used with the URI data locator only, and only
for an audio player (not an audio recorder).
</p>
<p>
The Android implementation of OpenSL ES requires that <code>mimeType</code>
be initialized to either <code>NULL</code> or a valid UTF-8 string,
and that <code>containerType</code> be initialized to a valid value.
In the absence of other considerations, such as portability to other
implementations, or a content format that cannot be identified by its header,
we recommend that you
set the <code>mimeType</code> to <code>NULL</code> and <code>containerType</code>
to <code>SL_CONTAINERTYPE_UNSPECIFIED</code>.
</p>
<p>
Supported formats include WAV PCM, WAV alaw, WAV ulaw, MP3, Ogg
Vorbis, AAC LC, HE-AACv1 (aacPlus), HE-AACv2 (enhanced aacPlus),
AMR, and FLAC (provided these are supported by the overall platform;
AAC formats must be located within an MP4 or ADTS container).
MIDI is not supported.
WMA is not part of the open source release, and its compatibility
with Android OpenSL ES has not been verified.
</p>
<p>
The Android implementation of OpenSL ES does not support direct
playback of DRM or encrypted content; if you want to play such content,
your application will need to convert it to cleartext before playing,
and enforce any DRM restrictions itself.
</p>

<h4 id="object">Object</h4>

<p>
<code>Resume</code>, <code>RegisterCallback</code>,
<code>AbortAsyncOperation</code>, <code>SetPriority</code>,
<code>GetPriority</code>, and <code>SetLossOfControlInterfaces</code>
are not supported.
</p>

<h4 id="pcmDataFormat">PCM data format</h4>

<p>
The PCM data format can be used with buffer queues only. Supported PCM
playback configurations are 8-bit unsigned or 16-bit signed, mono
or stereo, little endian byte ordering, and these sample rates:
8000, 11025, 12000, 16000, 22050, 24000, 32000, 44100, or 48000 Hz.
For recording, the supported configurations are device-dependent,
although 16000 Hz mono 16-bit signed is usually available.
</p>
<p>
Note that the field <code>samplesPerSec</code> is actually in
units of milliHz, despite the misleading name. To avoid accidentally
using the wrong value, you should initialize this field using one
of the symbolic constants defined for this purpose, such as
<code>SL_SAMPLINGRATE_44_1</code>.
</p>
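<p>
To illustrate the pitfall, the fragment below reproduces two of the
sampling-rate constants from <code>SLES/OpenSLES.h</code> (the values shown
match the 1.0.1 header) together with a hypothetical conversion helper; the
constants are 1000 times the conventional rate in Hz:
</p>

```c
#include <assert.h>

/* Sampling-rate constants from <SLES/OpenSLES.h>, reproduced here so
 * the fragment stands alone; note that the values are in milliHz. */
typedef unsigned int SLuint32;
#define SL_SAMPLINGRATE_44_1 ((SLuint32) 44100000)
#define SL_SAMPLINGRATE_48   ((SLuint32) 48000000)

/* Hypothetical helper: convert a conventional rate in Hz to the
 * milliHz units expected in SLDataFormat_PCM.samplesPerSec. */
static SLuint32 hz_to_millihz(SLuint32 hz) {
    return hz * 1000u;
}
```

<p>
Passing a bare <code>44100</code> would request a 44.1 Hz stream, which is why
the symbolic constants are preferred.
</p>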
<p>
For API level 21 and above, see the section "Floating-point data" below.
</p>

<h4 id="playbackRate">Playback rate</h4>

<p>
<b>Note:</b>
An OpenSL ES <em>playback rate</em> indicates the speed at which an
object presents data, expressed in
<a href="https://en.wikipedia.org/wiki/Per_mille">per mille</a>
units.  A <em>rate range</em> is a closed interval expressing the
possible playback rates.
</p>
<p>
The supported playback rate range(s) and capabilities may vary depending
on the platform version and implementation, and so should be determined
at runtime by querying with <code>PlaybackRate::GetRateRange</code>
or <code>PlaybackRate::GetCapabilitiesOfRate</code>.
</p>
<p>
That said, some guidance on typical rate ranges may be useful:
in Android 2.3 a single playback rate range from 500 per mille to 2000 per mille
inclusive is typically supported, with property
<code>SL_RATEPROP_NOPITCHCORAUDIO</code>.
In Android 4.0 the same rate range is typically supported for a data source
in PCM format, and a unity rate range of 1000 per mille to 1000 per mille for other formats.
</p>
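<p>
Per mille arithmetic is simple but easy to get backwards. The hypothetical
helpers below, which are not part of the OpenSL ES API, convert a per mille
rate to a speed factor and clamp a requested rate into a supported range, such
as the typical 500 to 2000 range that <code>PlaybackRate::GetRateRange</code>
would report:
</p>

```c
#include <assert.h>

/* A per mille playback rate of 1000 means normal speed; 500 is half
 * speed and 2000 is double speed.  These helpers are illustrative,
 * not part of the OpenSL ES API. */
static double per_mille_to_factor(int ratePerMille) {
    return ratePerMille / 1000.0;
}

/* Clamp a requested rate into a supported range, e.g. the typical
 * Android range of 500 to 2000 per mille obtained from GetRateRange. */
static int clamp_rate(int ratePerMille, int minRate, int maxRate) {
    if (ratePerMille < minRate) return minRate;
    if (ratePerMille > maxRate) return maxRate;
    return ratePerMille;
}
```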

<h4 id="record">Record</h4>

<p>
The <code>SL_RECORDEVENT_HEADATLIMIT</code> and
<code>SL_RECORDEVENT_HEADMOVING</code> events are not supported.
</p>

<h4 id="seek">Seek</h4>

<p>
<code>SetLoop</code> can loop only the whole file, not a portion of it;
the <code>startPos</code> parameter should be zero and the
<code>endPos</code> parameter should be <code>SL_TIME_UNKNOWN</code>.
</p>
<h4 id="uriDataLocator">URI data locator</h4>

<p>
The URI data locator can be used with the MIME data format only, and
only for an audio player (not an audio recorder). Supported schemes
are <code>http:</code> and <code>file:</code>.
A missing scheme defaults to the <code>file:</code> scheme. Other
schemes such as <code>https:</code>, <code>ftp:</code>, and
<code>content:</code> are not supported.
<code>rtsp:</code> is not verified.
</p>

<h3 id="dataStructures">Data structures</h3>

<p>
Android supports these OpenSL ES 1.0.1 data structures:
</p>
<ul>
<li><code>SLDataFormat_MIME</code>
</li>
<li><code>SLDataFormat_PCM</code>
</li>
<li><code>SLDataLocator_BufferQueue</code>
</li>
<li><code>SLDataLocator_IODevice</code>
</li>
<li><code>SLDataLocator_OutputMix</code>
</li>
<li><code>SLDataLocator_URI</code>
</li>
<li><code>SLDataSink</code>
</li>
<li><code>SLDataSource</code>
</li>
<li><code>SLEngineOption</code>
</li>
<li><code>SLEnvironmentalReverbSettings</code>
</li>
<li><code>SLInterfaceID</code>
</li>
</ul>

<h3 id="platformConfiguration">Platform configuration</h3>

<p>
OpenSL ES for Android is designed for multi-threaded applications,
and is thread-safe.
</p>
<p>
OpenSL ES for Android supports a single engine per application, and
up to 32 objects per engine. Available device memory and CPU may further
restrict the usable number of objects.
</p>
<p>
<code>slCreateEngine</code> recognizes, but ignores, these engine options:
</p>
<ul>
<li><code>SL_ENGINEOPTION_THREADSAFE</code>
</li>
<li><code>SL_ENGINEOPTION_LOSSOFCONTROL</code>
</li>
</ul>

<p>
OpenMAX AL and OpenSL ES may be used together in the same application.
In this case, there is internally a single shared engine object,
and the 32 object limit is shared between OpenMAX AL and OpenSL ES.
The application should first create both engines, then use both engines,
and finally destroy both engines.  The implementation maintains a
reference count on the shared engine, so that it is correctly destroyed
on the second destroy.
</p>

<h2 id="planningFutureVersions">Planning for future versions of OpenSL ES</h2>

<p>
The Android native audio APIs are based on Khronos
Group OpenSL ES 1.0.1 (see the section "References" below).
Khronos has released
a revised version 1.1 of the standard. The revised version
includes new features, clarifications, corrections of
typographical errors, and some incompatibilities. Most of the expected
incompatibilities are relatively minor, or are in areas of OpenSL ES
not supported by Android. However, even a small change
can be significant for an application developer, so it is important
to prepare for this.
</p>
<p>
The Android team is committed to preserving future API binary
compatibility for developers to the extent feasible. It is our
intention to continue to support future binary compatibility of the
1.0.1-based API, even as we add support for later versions of the
standard. An application developed with this version should
work on future versions of the Android platform, provided that
you follow the guidelines listed in the section "Planning for
binary compatibility" below.
</p>
<p>
Note that future source compatibility will <i>not</i> be a goal. That is,
if you upgrade to a newer version of the NDK, you may need to modify
your application source code to conform to the new API. We expect
that most such changes will be minor; see details below.
</p>

<h3 id="planningBinary">Planning for binary compatibility</h3>

<p>
We recommend that your application follow these guidelines
to improve future binary compatibility:
</p>
<ul>
<li>
Use only the documented subset of Android-supported features from
OpenSL ES 1.0.1.
</li>
<li>
Do not depend on a particular result code for an unsuccessful
operation; be prepared to deal with a different result code.
</li>
<li>
Application callback handlers generally run in a restricted context,
and should be written to perform their work quickly and then return
as soon as possible. Do not perform complex operations within a callback
handler. For example, within a buffer queue completion callback,
you can enqueue another buffer, but do not create an audio player.
</li>
<li>
Callback handlers should be prepared to be called more or less
frequently and to receive additional event types, and should ignore
event types that they do not recognize. Callbacks that are configured
with an event mask of enabled event types should be prepared to be
called with multiple event type bits set simultaneously.
Use "&amp;" to test for each event bit rather than a switch on the value.
</li>
<li>
Use prefetch status and callbacks as a general indication of progress, but do
not depend on specific hard-coded fill levels or callback sequence.
The meaning of the prefetch status fill level, and the behavior for
errors that are detected during prefetch, may change.
</li>
<li>
See the section "Buffer queue behavior" below.
</li>
</ul>

<h3 id="planningSource">Planning for source compatibility</h3>

<p>
As mentioned, source code incompatibilities are expected in the next
version of OpenSL ES from the Khronos Group. Likely areas of change include:
</p>

<ul>
<li>The buffer queue interface is expected to have significant changes,
especially in the areas of <code>BufferQueue::Enqueue</code>, the parameter
list for <code>slBufferQueueCallback</code>,
and the name of the field <code>SLBufferQueueState.playIndex</code>.
We recommend that your application code use Android simple buffer
queues instead, because we do not plan to change that API.
In the example code supplied with the NDK, we have used
Android simple buffer queues for playback for this reason.
(We also use Android simple buffer queues for recording and decode to PCM, but
that is because standard OpenSL ES 1.0.1 does not support record or decode to
a buffer queue data sink.)
</li>
<li>Addition of <code>const</code> to input parameters passed by reference,
and to <code>SLchar *</code> struct fields used as input values.
This should not require any changes to your code.
</li>
<li>Substitution of unsigned types for some parameters that are
currently signed.  You may need to change a parameter type from
<code>SLint32</code> to <code>SLuint32</code> or similar, or add a cast.
</li>
<li><code>Equalizer::GetPresetName</code> will copy the string to
application memory instead of returning a pointer to implementation
memory. This will be a significant change, so we recommend that you
either avoid calling this method, or isolate your use of it.
</li>
<li>Additional fields in struct types. For output parameters, these
new fields can be ignored, but for input parameters the new fields
will need to be initialized. Fortunately, these are all expected to
be in areas not supported by Android.
</li>
<li>Interface
<a href="http://en.wikipedia.org/wiki/Globally_unique_identifier">
GUIDs</a> will change. Refer to interfaces by symbolic name rather than GUID
to avoid a dependency.
</li>
<li><code>SLchar</code> will change from <code>unsigned char</code>
to <code>char</code>. This primarily affects the URI data locator
and MIME data format.
</li>
<li><code>SLDataFormat_MIME.mimeType</code> will be renamed to <code>pMimeType</code>,
and <code>SLDataLocator_URI.URI</code> will be renamed to <code>pURI</code>.
We recommend that you initialize the <code>SLDataFormat_MIME</code>
and <code>SLDataLocator_URI</code>
data structures using a brace-enclosed comma-separated list of values,
rather than by field name, to isolate your code from this change.
This technique is used in the example code.
</li>
<li><code>SL_DATAFORMAT_PCM</code> does not permit the application
to specify the representation of the data as signed integer, unsigned
integer, or floating-point. The Android implementation assumes that
8-bit data is unsigned integer and 16-bit is signed integer.  In
addition, the field <code>samplesPerSec</code> is a misnomer, as
the actual units are milliHz. These issues are expected to be
addressed in the next OpenSL ES version, which will introduce a new
extended PCM data format that permits the application to explicitly
specify the representation, and corrects the field name.  As this
will be a new data format, and the current PCM data format will
still be available (though deprecated), it should not require any
immediate changes to your code.
</li>
</ul>

<h2 id="androidExtensions">Android extensions</h2>

<p>
The API for Android extensions is defined in <code>SLES/OpenSLES_Android.h</code>
and the header files that it includes.
Consult that file for details on these extensions. Unless otherwise
noted, all interfaces are "explicit".
</p>
<p>
Note that use of these extensions will limit your application's
portability to other OpenSL ES implementations. If this is a concern,
we advise that you avoid using them, or isolate your use of them
with <code>#ifdef</code>.
</p>
<p>
The following figure shows which Android-specific interfaces and
data locators are available for each object type.
</p>

<p>
<img src="chart2.png" alt="Android extensions" />
</p>

<h3 id="androidConfiguration">Android configuration interface</h3>

<p>
The Android configuration interface provides a means to set
platform-specific parameters for objects. Unlike other OpenSL ES
1.0.1 interfaces, the Android configuration interface is available
prior to object realization. This permits the object to be configured
and then realized. The header file <code>SLES/OpenSLES_AndroidConfiguration.h</code>
documents the available configuration keys and values:
</p>
<ul>
<li>stream type for audio players (default <code>SL_ANDROID_STREAM_MEDIA</code>)
</li>
<li>record profile for audio recorders (default <code>SL_ANDROID_RECORDING_PRESET_GENERIC</code>)
</li>
</ul>
<p>
Here is an example code fragment that sets the Android audio stream type on an audio player:
</p>
<pre>
// CreateAudioPlayer and specify SL_IID_ANDROIDCONFIGURATION
// in the required interface ID array. Do not realize player yet.
// ...
SLAndroidConfigurationItf playerConfig;
result = (*playerObject)-&gt;GetInterface(playerObject,
    SL_IID_ANDROIDCONFIGURATION, &amp;playerConfig);
assert(SL_RESULT_SUCCESS == result);
SLint32 streamType = SL_ANDROID_STREAM_ALARM;
result = (*playerConfig)-&gt;SetConfiguration(playerConfig,
    SL_ANDROID_KEY_STREAM_TYPE, &amp;streamType, sizeof(SLint32));
assert(SL_RESULT_SUCCESS == result);
// ...
// Now realize the player here.
</pre>
<p>
Similar code can be used to configure the preset for an audio recorder.
</p>
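<p>
A sketch of the analogous recorder configuration follows. To keep the fragment
self-contained outside the NDK, it mocks <code>SetConfiguration</code> with a
hypothetical <code>recorder_config</code> store; the key and preset names match
<code>SLES/OpenSLES_AndroidConfiguration.h</code>, but in real code you would
call <code>(*recorderConfig)-&gt;SetConfiguration(...)</code> before realizing
the recorder, exactly as in the player example above:
</p>

```c
#include <assert.h>
#include <string.h>

/* Key and preset names as in <SLES/OpenSLES_AndroidConfiguration.h>;
 * the numeric preset value is reproduced from that header. */
typedef unsigned int SLuint32;
#define SL_ANDROID_KEY_RECORDING_PRESET "androidRecordingPreset"
#define SL_ANDROID_RECORDING_PRESET_VOICE_RECOGNITION ((SLuint32) 0x00000003)

/* Hypothetical stand-in for the recorder's configuration store. */
struct recorder_config {
    char key[64];
    SLuint32 value;
};

/* Mock of SetConfiguration: records the key/value pair, as the real
 * interface does internally before the recorder is realized. */
static void set_configuration(struct recorder_config *cfg,
                              const char *key, SLuint32 value) {
    strncpy(cfg->key, key, sizeof(cfg->key) - 1);
    cfg->key[sizeof(cfg->key) - 1] = '\0';
    cfg->value = value;
}
```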

<h3 id="androidEffects">Android effects interfaces</h3>

<p>
The Android effect, effect send, and effect capabilities interfaces provide
a generic mechanism for an application to query and use device-specific
audio effects. A device manufacturer should document any available
device-specific audio effects.
</p>
<p>
Portable applications should use the OpenSL ES 1.0.1 APIs
for audio effects instead of the Android effect extensions.
</p>

<h3 id="androidFile">Android file descriptor data locator</h3>

<p>
The Android file descriptor data locator permits the source for an
audio player to be specified as an open file descriptor with read
access. The data format must be MIME.
</p>
<p>
This is especially useful in conjunction with the native asset manager.
</p>

<h3 id="androidSimple">Android simple buffer queue data locator and interface</h3>

<p>
The Android simple buffer queue data locator and interface are
identical to the OpenSL ES 1.0.1 buffer queue locator and interface,
except that Android simple buffer queues may be used with both audio
players and audio recorders, and are limited to the PCM data format.
(OpenSL ES 1.0.1 buffer queues are for audio players only, and are not
restricted to the PCM data format.)
</p>
<p>
For recording, the application should enqueue empty buffers. Upon
notification of completion via a registered callback, the filled
buffer is available for the application to read.
</p>
<p>
For playback there is no difference. But for future source code
compatibility, we suggest that applications use Android simple
buffer queues instead of OpenSL ES 1.0.1 buffer queues.
</p>

<h3 id="dynamicInterfaceObjectCreation">Dynamic interfaces at object creation</h3>

<p>
For convenience, the Android implementation of OpenSL ES 1.0.1
permits dynamic interfaces to be specified at object creation time,
as an alternative to adding these interfaces after object creation
with <code>DynamicInterfaceManagement::AddInterface</code>.
</p>

<h3 id="bufferQueueBehavior">Buffer queue behavior</h3>

<p>
The OpenSL ES 1.0.1 specification requires that "On transition to
the <code>SL_PLAYSTATE_STOPPED</code> state the play cursor is
returned to the beginning of the currently playing buffer." The
Android implementation does not necessarily conform to this
requirement. For Android, it is unspecified whether a transition
to <code>SL_PLAYSTATE_STOPPED</code> operates as described, or
leaves the play cursor unchanged.
</p>
<p>
We recommend that you do not rely on either behavior; after a
transition to <code>SL_PLAYSTATE_STOPPED</code>, you should explicitly
call <code>BufferQueue::Clear</code>. This will place the buffer
queue into a known state.
</p>
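<p>
The same recommendation applies to the application's own bookkeeping. The
hypothetical tracker below, which is not an OpenSL ES type, models the state an
application might keep alongside the queue; the point is that its clear
operation resets everything to a known state, mirroring the
<code>BufferQueue::Clear</code> call made after stopping:
</p>

```c
#include <assert.h>

/* Hypothetical application-side mirror of a buffer queue's state.
 * After a transition to SL_PLAYSTATE_STOPPED, call
 * BufferQueue::Clear on the real queue and bq_clear() on this
 * mirror, so both are in a known (empty) state. */
struct bq_mirror {
    unsigned enqueued;    /* buffers currently queued */
    unsigned play_index;  /* next buffer expected to play */
};

static void bq_enqueue(struct bq_mirror *m) {
    m->enqueued++;
}

/* Reset to the known post-Clear state. */
static void bq_clear(struct bq_mirror *m) {
    m->enqueued = 0;
    m->play_index = 0;
}
```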
775<p>
776A corollary is that it is unspecified whether buffer queue callbacks
777are called upon transition to <code>SL_PLAYSTATE_STOPPED</code> or by
778<code>BufferQueue::Clear</code>.
779We recommend that you do not rely on either behavior; be prepared
780to receive a callback in these cases, but also do not depend on
781receiving one.
782</p>
783<p>
784It is expected that a future version of OpenSL ES will clarify these
785issues. However, upgrading to that version would result in source
786code incompatibilities (see section "Planning for source compatibility"
787above).
788</p>
789
790<h3 id="reportingExtensions">Reporting of extensions</h3>
791
792<p>
793<code>Engine::QueryNumSupportedExtensions</code>,
794<code>Engine::QuerySupportedExtension</code>,
795<code>Engine::IsExtensionSupported</code> report these extensions:
796</p>
797<ul>
798<li><code>ANDROID_SDK_LEVEL_#</code>
799where # is the platform API level, 9 or higher
800</li>
801</ul>
802
803<h3 id="decodeAudio">Decode audio to PCM</h3>
804
805<p>
806This section describes a deprecated Android-specific extension to OpenSL ES 1.0.1
807for decoding an encoded stream to PCM without immediate playback.
808The table below gives recommendations for use of this extension and alternatives.
809</p>
810<table>
811<tr>
812  <th>API level</th>
813  <th>Extension<br /> is available</th>
814  <th>Extension<br /> is recommended</th>
815  <th>Alternatives</th>
816</tr>
817<tr>
818  <td>13<br /> and below</td>
819  <td>no</td>
820  <td>N/A</td>
821  <td>open source codec with suitable license</td>
822</tr>
823<tr>
824  <td>14 to 15</td>
825  <td>yes</td>
826  <td>no</td>
827  <td>open source codec with suitable license</td>
828</tr>
829<tr>
830  <td>16 to 20</td>
831  <td>yes</td>
832  <td>no</td>
833  <td>
834    <a href="http://developer.android.com/reference/android/media/MediaCodec.html">android.media.MediaCodec</a><br />
835    or open source codec with suitable license
836  </td>
837</tr>
838<tr>
839  <td>21<br /> and above</td>
840  <td>yes</td>
841  <td>no</td>
842  <td>
843    NDK MediaCodec in &lt;media/NdkMedia*.h&gt;<br />
844    or <a href="http://developer.android.com/reference/android/media/MediaCodec.html">android.media.MediaCodec</a><br />
845    or open source codec with suitable license
846  </td>
847</tr>
848</table>

<p>
A standard audio player plays back to an audio device, and the data sink
is specified as an output mix.
However, as an Android extension, an audio player instead
acts as a decoder if the data source is specified as a URI or Android
file descriptor data locator with MIME data format, and the data sink is
an Android simple buffer queue data locator with PCM data format.
</p>
<p>
This feature is primarily intended for games to pre-load their
audio assets when changing to a new game level, similar to
<code>android.media.SoundPool</code>.
</p>
<p>
The application should initially enqueue a set of empty buffers to the Android simple
buffer queue. These buffers will be filled with the decoded PCM data.  The Android simple
buffer queue callback is invoked after each buffer is filled. The
callback handler should process the PCM data, re-enqueue the
now-empty buffer, and then return.  The application is responsible for
keeping track of decoded buffers; the callback parameter list does not include
sufficient information to indicate which buffer was filled or which buffer to enqueue next.
</p>
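<p>
Because a buffer queue completes buffers in the same FIFO order in which they
were enqueued, one simple bookkeeping scheme is a fixed ring of buffers
advanced by a round-robin index. A minimal sketch in plain C (the
<code>BufferRing</code> type, its sizes, and the helper name are illustrative,
not part of the API):
</p>

```c
#include <stddef.h>
#include <stdint.h>

#define NUM_BUFS 4
#define BUF_FRAMES 1024

/* One slot per enqueued buffer; because the queue is FIFO, slots are
 * completed in the same order in which they were enqueued. */
typedef struct {
    int16_t data[NUM_BUFS][BUF_FRAMES];
    size_t next;   /* index of the buffer the next callback completes */
} BufferRing;

/* Call once per buffer queue callback: returns the buffer that was just
 * filled, and advances the ring so the slot can be re-enqueued. */
static int16_t *ring_on_filled(BufferRing *r) {
    int16_t *filled = r->data[r->next];
    r->next = (r->next + 1) % NUM_BUFS;
    return filled;
}
```

<p>
The callback handler would process the returned buffer, then re-enqueue the
same pointer as the next empty buffer.
</p>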
<p>
The end of stream is determined implicitly by the data source.
At the end of stream a <code>SL_PLAYEVENT_HEADATEND</code> event is
delivered. The Android simple buffer queue callback will no longer
be called after all consumed data is decoded.
</p>
<p>
The sink's PCM data format typically matches that of the encoded data source
with respect to sample rate, channel count, and bit depth. However, the platform
implementation is permitted to decode to a different sample rate, channel count, or bit depth.
There is a provision to detect the actual PCM format; see section "Determining
the format of decoded PCM data via metadata" below.
</p>
<p>
Decode to PCM supports pause and initial seek.  Volume control, effects,
looping, and playback rate are not supported.
</p>
<p>
Depending on the platform implementation, decoding may require resources
that cannot be left idle.  Therefore it is not recommended to starve the
decoder by failing to provide a sufficient number of empty PCM buffers,
e.g. by returning from the Android simple buffer queue callback without
enqueueing another empty buffer.  The result of decoder starvation is
unspecified; the implementation may choose to drop the decoded
PCM data, pause the decoding process, or in severe cases terminate
the decoder.
</p>

<h3 id="decodeStreaming">Decode streaming ADTS AAC to PCM</h3>

<p>
Note: this feature is available at API level 14 and higher.
</p>
<p>
An audio player acts as a streaming decoder if the data source is an
Android buffer queue data locator with MIME data format, and the data
sink is an Android simple buffer queue data locator with PCM data format.
The MIME data format should be configured as:
</p>
<dl>
<dt>container</dt>
<dd><code>SL_CONTAINERTYPE_RAW</code>
</dd>
<dt>MIME type string
</dt>
<dd><code>"audio/vnd.android.aac-adts"</code> (macro <code>SL_ANDROID_MIME_AACADTS</code>)
</dd>
</dl>
<p>
This feature is primarily intended for streaming media applications that
deal with AAC audio, but need to apply custom processing of the audio
prior to playback.  Most applications that need to decode audio to PCM
should use the method of the previous section "Decode audio to PCM",
as it is simpler and handles more audio formats.  The technique described
here is a more specialized approach, to be used only if both of these
conditions apply:
</p>
<ul>
<li>the compressed audio source is a stream of AAC frames contained by ADTS headers
</li>
<li>the application manages this stream; that is, the data is <i>not</i> located within
a network resource identified by URI or within a local file identified by file descriptor.
</li>
</ul>
<p>
The application should initially enqueue a set of filled buffers to the Android buffer queue.
Each buffer contains one or more complete ADTS AAC frames.
The Android buffer queue callback is invoked after each buffer is emptied.
The callback handler should re-fill and re-enqueue the buffer, and then return.
The application need not keep track of encoded buffers; the callback parameter
list does include sufficient information to indicate which buffer to enqueue next.
The end of stream is explicitly marked by enqueuing an EOS item.
After EOS, no more enqueues are permitted.
</p>
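<p>
Because each buffer must hold whole frames, the application needs to locate
ADTS frame boundaries in its stream. The total frame length is carried in the
ADTS header itself; a sketch of the parse in plain C (the helper name is
illustrative, not part of the API):
</p>

```c
#include <stddef.h>
#include <stdint.h>

/* Return the total length in bytes (header included) of the ADTS frame
 * starting at p, or 0 if p does not begin with an ADTS syncword or fewer
 * than the 7 header bytes are available. */
static size_t adts_frame_length(const uint8_t *p, size_t avail) {
    if (avail < 7)
        return 0;
    /* Syncword: 12 one-bits across the first two bytes. */
    if (p[0] != 0xFF || (p[1] & 0xF0) != 0xF0)
        return 0;
    /* frame_length: 13 bits spread over bytes 3..5. */
    return ((size_t)(p[3] & 0x03) << 11) |
           ((size_t)p[4] << 3) |
           ((size_t)(p[5] & 0xE0) >> 5);
}
```

<p>
An application can walk its stream with such a helper to copy one or more
complete frames into each buffer before enqueuing it.
</p>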
<p>
It is not recommended to starve the decoder by failing to provide full
ADTS AAC buffers, e.g. by returning from the Android buffer queue callback
without enqueueing another full buffer.  The result of decoder starvation
is unspecified.
</p>
<p>
In all respects except for the data source, the streaming decode method is similar
to that of the previous section:
</p>
<ul>
<li>initially enqueue a set of empty buffers to the Android simple buffer queue
</li>
<li>the Android simple buffer queue callback is invoked after each buffer is filled with PCM data;
the callback handler should process the PCM data and then re-enqueue another empty buffer
</li>
<li>the <code>SL_PLAYEVENT_HEADATEND</code> event is delivered at end of stream
</li>
<li>the actual PCM format should be detected using metadata rather than by making an assumption
</li>
<li>the same limitations apply with respect to volume control, effects, etc.
</li>
<li>starvation for lack of empty PCM buffers is not recommended
</li>
</ul>
<p>
Despite the similarity in names, an Android buffer queue is <i>not</i>
the same as an Android simple buffer queue.  The streaming decoder
uses both kinds of buffer queues: an Android buffer queue for the ADTS
AAC data source, and an Android simple buffer queue for the PCM data
sink.  The Android simple buffer queue API is described in this document
in section "Android simple buffer queue data locator and interface".
The Android buffer queue API is described in the Android native media
API documentation, located in <code>$NDK_ROOT/docs/Additional_library_docs/openmaxal/index.html</code>.
</p>

<h3 id="determiningFormat">Determining the format of decoded PCM data via metadata</h3>

<p>
The metadata extraction interface <code>SLMetadataExtractionItf</code>
is a standard OpenSL ES 1.0.1 interface, not an Android extension.
However, the particular metadata keys that
indicate the actual format of decoded PCM data are specific to Android,
and are defined in header <code>SLES/OpenSLES_AndroidMetadata.h</code>.
</p>
<p>
The metadata key indices are available immediately after
<code>Object::Realize</code>, but the associated values are not
available until after the first encoded data has been decoded.  A good
practice is to query for the key indices in the main thread after Realize,
and to read the PCM format metadata values in the Android simple
buffer queue callback handler the first time it is called.
</p>
<p>
The OpenSL ES 1.0.1 metadata extraction interface
<code>SLMetadataExtractionItf</code> is admittedly cumbersome, as it
requires a multi-step process to first determine key indices and then
to get the key values.  Consult the example code for snippets showing
how to work with this interface.
</p>
<p>
Metadata key names are stable, but the key indices are not documented
and are subject to change.  An application should not assume that indices
are persistent across different execution runs, and should not assume that
indices are shared for different object instances within the same run.
</p>

<h3 id="floatingPoint">Floating-point data</h3>

<p>
As of API level 21, data can be supplied to an AudioPlayer in
single-precision floating-point format.
</p>
<p>
Example code fragment, to be used during the <code>Engine::CreateAudioPlayer</code> process:
</p>
<pre>
#include &lt;SLES/OpenSLES_Android.h&gt;
...
SLAndroidDataFormat_PCM_EX pcm;
pcm.formatType = SL_ANDROID_DATAFORMAT_PCM_EX;
pcm.numChannels = 2;
pcm.sampleRate = SL_SAMPLINGRATE_44_1;
pcm.bitsPerSample = 32;
pcm.containerSize = 32;
pcm.channelMask = SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT;
pcm.endianness = SL_BYTEORDER_LITTLEENDIAN;
pcm.representation = SL_ANDROID_PCM_REPRESENTATION_FLOAT;
...
SLDataSource audiosrc;
audiosrc.pLocator = ...
audiosrc.pFormat = &amp;pcm;
</pre>
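<p>
When migrating existing 16-bit assets to the floating-point path, samples must
be rescaled to the nominal full-scale range. A sketch using the common 1/32768
convention (the helper name is illustrative; the scale factor is a convention,
not mandated by the API):
</p>

```c
#include <stddef.h>
#include <stdint.h>

/* Convert interleaved signed 16-bit PCM to 32-bit float, so that
 * INT16_MIN maps exactly to -1.0f and positive full scale approaches 1.0f. */
static void pcm16_to_float(const int16_t *in, float *out, size_t count) {
    for (size_t i = 0; i < count; ++i)
        out[i] = (float)in[i] / 32768.0f;
}
```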

<h2 id="programmingNotes">Programming notes</h2>

<p>
These notes supplement the OpenSL ES 1.0.1 specification,
available in the "References" section below.
</p>

<h3 id="objectsInitialization">Objects and interface initialization</h3>

<p>
Two aspects of the OpenSL ES programming model that may be unfamiliar
to new developers are the distinction between objects and interfaces,
and the initialization sequence.
</p>
<p>
Briefly, an OpenSL ES object is similar to the object concept
in programming languages such as Java and C++, except an OpenSL ES
object is <i>only</i> visible via its associated interfaces. This
includes the initial interface for all objects, called
<code>SLObjectItf</code>.  There is no handle for an object itself,
only a handle to the <code>SLObjectItf</code> interface of the object.
</p>
<p>
An OpenSL ES object is first "created", which returns an
<code>SLObjectItf</code>, then "realized". This is similar to the
common programming pattern of first constructing an object (which
should never fail other than for lack of memory or invalid parameters),
and then completing initialization (which may fail due to lack of
resources).  The realize step gives the implementation a
logical place to allocate additional resources if needed.
</p>
<p>
As part of the API to create an object, an application specifies
an array of desired interfaces that it plans to acquire later. Note
that this array does <i>not</i> automatically acquire the interfaces;
it merely indicates a future intention to acquire them.  Interfaces
are distinguished as "implicit" or "explicit".  An explicit interface
<i>must</i> be listed in the array if it will be acquired later.
An implicit interface need not be listed in the object create array,
but there is no harm in listing it there.  OpenSL ES has one more
kind of interface called "dynamic", which does not need to be
specified in the object create array, and can be added later after
the object is created.  The Android implementation provides a
convenience feature to avoid this complexity; see section "Dynamic
interfaces at object creation" above.
</p>
<p>
After the object is created and realized, the application should
acquire interfaces for each feature it needs, using
<code>GetInterface</code> on the initial <code>SLObjectItf</code>.
</p>
<p>
Finally, the object is available for use via its interfaces, though
note that some objects require further setup. In particular, an
audio player with URI data source needs a bit more preparation in
order to detect connection errors. See the next section
"Audio player prefetch" for details.
</p>
<p>
After your application is done with the object, you should explicitly
destroy it; see section "Destroy" below.
</p>

<h3 id="audioPlayerPrefetch">Audio player prefetch</h3>

<p>
For an audio player with URI data source, <code>Object::Realize</code> allocates resources
but does not connect to the data source (i.e. "prepare") or begin
pre-fetching data. These occur once the player state is set to
either <code>SL_PLAYSTATE_PAUSED</code> or <code>SL_PLAYSTATE_PLAYING</code>.
</p>
<p>
Note that some information may still be unknown until relatively
late in this sequence. In particular, initially
<code>Player::GetDuration</code> will return <code>SL_TIME_UNKNOWN</code>
and <code>MuteSolo::GetChannelCount</code> will either return successfully
with channel count zero
or the error result <code>SL_RESULT_PRECONDITIONS_VIOLATED</code>.
These APIs will return the proper values once they are known.
</p>
<p>
Other properties that are initially unknown include the sample rate
and the actual media content type determined by examining the content's header
(as opposed to the application-specified MIME type and container type).
These too are determined later during prepare / prefetch, but there are
no APIs to retrieve them.
</p>
<p>
The prefetch status interface is useful for detecting when all
information is available, or your application can poll periodically.
Note that some information may <i>never</i> be known, for example,
the duration of a streaming MP3.
</p>
<p>
The prefetch status interface is also useful for detecting errors.
Register a callback and enable at least the
<code>SL_PREFETCHEVENT_FILLLEVELCHANGE</code> and
<code>SL_PREFETCHEVENT_STATUSCHANGE</code> events. If both of these
events are delivered simultaneously, and
<code>PrefetchStatus::GetFillLevel</code> reports a zero level, and
<code>PrefetchStatus::GetPrefetchStatus</code> reports
<code>SL_PREFETCHSTATUS_UNDERFLOW</code>, then this indicates a
non-recoverable error in the data source.
This includes the inability to connect to the data source because
the local filename does not exist or the network URI is invalid.
</p>
<p>
The next version of OpenSL ES is expected to add more explicit
support for handling errors in the data source. However, for future
binary compatibility, we intend to continue to support the current
method for reporting a non-recoverable error.
</p>
<p>
In summary, a recommended code sequence is:
</p>
<ol>
<li>Engine::CreateAudioPlayer
</li>
<li>Object::Realize
</li>
<li>Object::GetInterface for SL_IID_PREFETCHSTATUS
</li>
<li>PrefetchStatus::SetCallbackEventsMask
</li>
<li>PrefetchStatus::SetFillUpdatePeriod
</li>
<li>PrefetchStatus::RegisterCallback
</li>
<li>Object::GetInterface for SL_IID_PLAY
</li>
<li>Play::SetPlayState to SL_PLAYSTATE_PAUSED or SL_PLAYSTATE_PLAYING
</li>
<li>preparation and prefetching occur here; during this time your
callback will be called with periodic status updates
</li>
</ol>

<h3 id="destroy">Destroy</h3>

<p>
Be sure to destroy all objects on exit from your application.  Objects
should be destroyed in reverse order of their creation, as it is
not safe to destroy an object that has any dependent objects.
For example, destroy in this order: audio players and recorders,
output mix, then finally the engine.
</p>
<p>
OpenSL ES does not support automatic garbage collection or
<a href="http://en.wikipedia.org/wiki/Reference_counting">reference counting</a>
of interfaces. After you call <code>Object::Destroy</code>, all extant
interfaces derived from the associated object become <i>undefined</i>.
</p>
<p>
The Android OpenSL ES implementation does not detect the incorrect
use of such interfaces.
Continuing to use such interfaces after the object is destroyed will
cause your application to crash or behave in unpredictable ways.
</p>
<p>
We recommend that you explicitly set both the primary object interface
and all associated interfaces to NULL as part of your object
destruction sequence, to prevent the accidental misuse of a stale
interface handle.
</p>

<h3 id="stereoPanning">Stereo panning</h3>

<p>
When <code>Volume::EnableStereoPosition</code> is used to enable
stereo panning of a mono source, there is a 3 dB reduction in total
<a href="http://en.wikipedia.org/wiki/Sound_power_level">
sound power level</a>.  This is needed to permit the total sound
power level to remain constant as the source is panned from one
channel to the other. Therefore, don't enable stereo positioning
if you don't need it.  See the Wikipedia article on
<a href="http://en.wikipedia.org/wiki/Panning_(audio)">audio panning</a>
for more information.
</p>

<h3 id="callbacksThreads">Callbacks and threads</h3>

<p>
Callback handlers are generally called <i>synchronously</i> with
respect to the event, that is, at the moment and location where the
event is detected by the implementation. But this point is
<i>asynchronous</i> with respect to the application. Thus you should
use a non-blocking synchronization mechanism to control access
to any variables shared between the application and the callback
handler. In the example code, such as for buffer queues, we have
either omitted this synchronization or used blocking synchronization in
the interest of simplicity. However, proper non-blocking synchronization
would be critical for any production code.
</p>
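<p>
One suitable non-blocking mechanism is a single-producer single-consumer ring
buffer, which lets the callback hand samples to an application thread without
taking a lock. A minimal sketch using C11 atomics (the type, names, and
capacity are illustrative):
</p>

```c
#include <stdatomic.h>
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define RING_CAP 256   /* must be a power of two */

/* Lock-free SPSC ring: the callback (producer) only writes `head`,
 * the application thread (consumer) only writes `tail`. */
typedef struct {
    int16_t slots[RING_CAP];
    atomic_size_t head;   /* next write position */
    atomic_size_t tail;   /* next read position */
} SpscRing;

static bool ring_push(SpscRing *r, int16_t v) {
    size_t h = atomic_load_explicit(&r->head, memory_order_relaxed);
    size_t t = atomic_load_explicit(&r->tail, memory_order_acquire);
    if (h - t == RING_CAP)
        return false;                     /* full: drop rather than block */
    r->slots[h & (RING_CAP - 1)] = v;
    atomic_store_explicit(&r->head, h + 1, memory_order_release);
    return true;
}

static bool ring_pop(SpscRing *r, int16_t *out) {
    size_t t = atomic_load_explicit(&r->tail, memory_order_relaxed);
    size_t h = atomic_load_explicit(&r->head, memory_order_acquire);
    if (t == h)
        return false;                     /* empty */
    *out = r->slots[t & (RING_CAP - 1)];
    atomic_store_explicit(&r->tail, t + 1, memory_order_release);
    return true;
}
```

<p>
Neither side can block the other: when the ring is full the producer drops
data, and when it is empty the consumer simply returns.
</p>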
<p>
Callback handlers are called from internal
non-application thread(s) which are not attached to the Android runtime and thus
are ineligible to use JNI. Because these internal threads are
critical to the integrity of the OpenSL ES implementation, a callback
handler should also not block or perform excessive work.
</p>
<p>
If your callback handler needs to use JNI, or execute work that is
not proportional to the callback, the handler should instead post an
event for another thread to process.  Examples of acceptable callback
workload include rendering and enqueuing the next output buffer (for an
AudioPlayer), processing the just-filled input buffer and enqueueing the
next empty buffer (for an AudioRecorder), or simple APIs such as most
of the "Get" family.  See section "Performance" below regarding the workload.
</p>
<p>
Note that the converse is safe: an Android application thread which has
entered JNI is allowed to directly call OpenSL ES APIs, including
those which block. However, blocking calls are not recommended from
the main thread, as they may result in an "Application Not
Responding" (ANR) dialog.
</p>
<p>
The choice of which thread calls a callback handler is largely left up
to the implementation.  The reason for this flexibility is to permit
future optimizations, especially on multi-core devices.
</p>
<p>
The thread on which the callback handler runs is not guaranteed to have
the same identity across different calls.  Therefore do not rely on the
<code>pthread_t</code> returned by <code>pthread_self()</code>, or the
<code>pid_t</code> returned by <code>gettid()</code>, to be consistent
across calls.  For the same reason, do not use thread local storage (TLS) APIs such as
<code>pthread_setspecific()</code> and <code>pthread_getspecific()</code>
from a callback.
</p>
<p>
The implementation guarantees that concurrent callbacks of the same kind,
for the same object, will not occur.  However, concurrent callbacks of
<i>different</i> kinds for the same object are possible, on different threads.
</p>

<h3 id="performance">Performance</h3>

<p>
As OpenSL ES is a native C API, non-runtime application threads which
call OpenSL ES have no runtime-related overhead such as garbage
collection pauses. With the one exception described below, there is no
additional performance benefit to the use of OpenSL ES. In particular,
use of OpenSL ES does not guarantee lower audio latency, higher scheduling
priority, etc. than what the platform generally provides.
On the other hand, as the Android platform and specific device
implementations continue to evolve, an OpenSL ES application can
expect to benefit from any future system performance improvements.
</p>
<p>
One such evolution is support for reduced audio output latency.
The underpinnings for reduced output latency were first included in
the Android 4.1 platform release ("Jelly Bean"), and continued
progress occurred in the Android 4.2 platform.  These improvements
are available via OpenSL ES for device implementations that claim feature
"android.hardware.audio.low_latency". If the device doesn't claim this
feature but supports API level 9 (Android platform version 2.3) or later,
then you can still use the OpenSL ES APIs but the output latency may be higher.
The lower output latency path is used only if the application requests a
buffer size and sample rate that are
compatible with the device's native output configuration.
These parameters are device-specific and should be obtained as follows.
</p>
<p>
Beginning with API level 17 (Android platform version 4.2), an application
can query for the platform native or optimal output sample rate and buffer size
for the device's primary output stream.  When combined with the feature
test just mentioned, an app can now configure itself appropriately for
lower latency output on devices that claim support.
</p>
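<p>
As a sanity check on the values obtained, the duration of audio covered by one
buffer follows directly from these two properties; for example, 240 frames at
48000 Hz is 5 ms per buffer (a sketch; the helper name is illustrative):
</p>

```c
/* Milliseconds of audio covered by one output buffer. */
static double buffer_duration_ms(int frames_per_buffer, int sample_rate) {
    return 1000.0 * (double)frames_per_buffer / (double)sample_rate;
}
```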
<p>
For API level 17 (Android platform version 4.2.2) and earlier,
a buffer count of 2 or more is required for lower latency.
Beginning with API level 18 (Android platform version 4.3),
a buffer count of 1 is sufficient for lower latency.
</p>
<p>
All OpenSL ES interfaces for output effects preclude the lower latency path.
</p>
<p>
The recommended sequence is:
</p>
<ol>
<li>Check for API level 9 or higher, to confirm use of OpenSL ES.
</li>
<li>Check for feature "android.hardware.audio.low_latency" using code such as this:
<pre>
import android.content.pm.PackageManager;
...
PackageManager pm = getContext().getPackageManager();
boolean claimsFeature = pm.hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
</pre>
</li>
<li>Check for API level 17 or higher, to confirm use of
<code>android.media.AudioManager.getProperty()</code>.
</li>
<li>Get the native or optimal output sample rate and buffer size for this device's primary output
stream, using code such as this:
<pre>
import android.media.AudioManager;
...
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
</pre>
Note that <code>sampleRate</code> and <code>framesPerBuffer</code>
are <code>String</code>s.  First check for <code>null</code>
and then convert to <code>int</code> using <code>Integer.parseInt()</code>.
</li>
<li>Now use OpenSL ES to create an AudioPlayer with PCM buffer queue data locator.
</li>
</ol>
<p>
The number of lower latency audio players is limited. If your application
requires more than a few audio sources, consider mixing your audio at
application level.  Be sure to destroy your audio players when your
activity is paused, as they are a global resource shared with other apps.
</p>
<p>
To avoid audible glitches, the buffer queue callback handler must execute
within a small and predictable time window. This typically implies no
unbounded blocking on mutexes, conditions, or I/O operations. Instead consider
"try locks", locks and waits with timeouts, and non-blocking algorithms.
</p>
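<p>
For example, a callback that reads parameters updated by a control thread can
use <code>pthread_mutex_trylock</code> and simply keep the previous value when
the lock is contended, rather than blocking (a sketch; the parameter struct is
illustrative):
</p>

```c
#include <pthread.h>

/* Parameters a control thread updates under the lock and the audio
 * callback reads without ever blocking. */
typedef struct {
    pthread_mutex_t lock;
    float target_gain;    /* written by the control thread, under lock */
    float current_gain;   /* the callback's private copy */
} GainControl;

/* Called from the buffer queue callback: never blocks.  If the lock is
 * contended, reuse the previous gain and try again on the next callback. */
static float callback_get_gain(GainControl *g) {
    if (pthread_mutex_trylock(&g->lock) == 0) {
        g->current_gain = g->target_gain;
        pthread_mutex_unlock(&g->lock);
    }
    return g->current_gain;
}
```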
<p>
The computation required to render the next buffer (for AudioPlayer) or
consume the previous buffer (for AudioRecorder) should take approximately
the same amount of time for each callback.  Avoid algorithms that execute in
a non-deterministic amount of time, or are "bursty" in their computations.
A callback computation is bursty if the CPU time spent in any
given callback is significantly larger than the average.
In summary, the ideal is for the CPU execution time of the handler to have
variance near zero, and for the handler to not block for unbounded times.
</p>
<p>
Lower latency audio is possible for these outputs only: on-device speaker, wired
headphones, wired headset, line out, and USB digital audio.
On some devices, speaker latency is higher than other paths due to
digital signal processing for speaker correction and protection.
</p>
<p>
As of API level 21, lower latency audio input is supported on select
devices. To take advantage of this feature, first confirm that lower
latency output is available as described above. The capability for lower
latency output is a prerequisite for the lower latency input feature.
Then create an AudioRecorder with the same sample rate and buffer size
as would be used for output.
OpenSL ES interfaces for input effects preclude the lower latency path.
The record preset <code>SL_ANDROID_RECORDING_PRESET_VOICE_RECOGNITION</code>
must be used for lower latency; this preset disables device-specific
digital signal processing which may add latency to the input path.
For more information on record presets,
see section "Android configuration interface" above.
</p>
<p>
For simultaneous input and output, separate buffer queue
completion handlers are used for each side.  There is no guarantee of
the relative order of these callbacks, or the synchronization of the
audio clocks, even when both sides use the same sample rate.
Your application should buffer the data with proper buffer synchronization.
</p>
<p>
One consequence of potentially independent audio clocks is the need
for asynchronous sample rate conversion.  A simple (though not ideal
for audio quality) technique for asynchronous sample rate conversion
is to duplicate or drop samples as needed near a zero-crossing point.
More sophisticated conversions are possible.
</p>
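<p>
The duplicate-or-drop technique can be sketched as follows: when the consumer
falls one sample behind, repeat the sample whose magnitude is smallest, so the
repeat lands closest to a zero crossing and is least audible. This is an
illustrative sketch only, not production-quality resampling:
</p>

```c
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>

/* Copy n samples from in to out (capacity n + 1), duplicating the one
 * sample whose magnitude is smallest, i.e. nearest a zero crossing. */
static void duplicate_near_zero_crossing(const int16_t *in, size_t n,
                                         int16_t *out) {
    size_t best = 0;
    for (size_t i = 1; i < n; ++i)
        if (abs(in[i]) < abs(in[best]))
            best = i;
    size_t o = 0;
    for (size_t i = 0; i < n; ++i) {
        out[o++] = in[i];
        if (i == best)
            out[o++] = in[i];   /* the duplicated sample */
    }
}
```

<p>
Dropping a sample to slip in the other direction is symmetric: skip the
smallest-magnitude sample instead of repeating it.
</p>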

<h3 id="securityPermissions">Security and permissions</h3>

<p>
Security in Android is enforced at the process level. Code written in the
Java programming language can't do anything more than native code can, nor
can native code do anything more than Java programming language code can.
The only differences between them are the APIs that are available.
</p>
<p>
Applications using OpenSL ES must request whatever permissions they
would need for similar non-native APIs. For example, if your application
records audio, then it needs the <code>android.permission.RECORD_AUDIO</code>
permission. Applications that use audio effects need
<code>android.permission.MODIFY_AUDIO_SETTINGS</code>. Applications that play
network URI resources need <code>android.permission.INTERNET</code>.
See <a href="https://developer.android.com/training/permissions/index.html">Working with System Permissions</a>
for more information.
</p>
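<p>
For example, an application that records audio and uses effects would declare
both permissions in its <code>AndroidManifest.xml</code> (a fragment; request
only the permissions your application actually needs):
</p>

```
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```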
<p>
Depending on the platform version and implementation,
media content parsers and software codecs may run within the context
of the Android application that calls OpenSL ES (hardware codecs
are abstracted, but are device-dependent). Malformed content
designed to exploit parser and codec vulnerabilities is a known attack
vector. We recommend that you play media only from trustworthy
sources, or that you partition your application such that code that
handles media from untrustworthy sources runs in a relatively
sandboxed environment.  For example you could process media from
untrustworthy sources in a separate process. Though both processes
would still run under the same UID, this separation does make an
attack more difficult.
</p>

<h2 id="platformIssues">Platform issues</h2>

<p>
This section describes known issues in the initial platform
release which supports these APIs.
</p>

<h3 id="dynamicInterfaceManagementIssues">Dynamic interface management</h3>

<p>
<code>DynamicInterfaceManagement::AddInterface</code> does not work.
Instead, specify the interface in the array passed to Create, as
shown in the example code for environmental reverb.
</p>

<h2 id="references">References and resources</h2>

<p>
Android:
</p>
<ul>
<li><a href="http://developer.android.com/resources/index.html">
Android developer resources</a>
</li>
<li><a href="http://groups.google.com/group/android-developers">
Android developers discussion group</a>
</li>
<li><a href="http://developer.android.com/sdk/ndk/index.html">Android NDK</a>
</li>
<li><a href="http://groups.google.com/group/android-ndk">
Android NDK discussion group</a> (for developers of native code, including OpenSL ES)
</li>
<li><a href="http://code.google.com/p/android/issues/">
Android open source bug database</a>
</li>
<li><a href="https://github.com/googlesamples/android-ndk">Android Studio NDK Samples</a>
</li>
<li><a href="http://developer.android.com/samples/index.html">Android Studio Samples</a>
</li>
</ul>

<p>
Khronos Group:
</p>
<ul>
<li><a href="http://www.khronos.org/opensles/">
Khronos Group OpenSL ES Overview</a>
</li>
<li><a href="http://www.khronos.org/registry/sles/">
Khronos Group OpenSL ES 1.0.1 specification</a>
</li>
<li><a href="https://forums.khronos.org/forumdisplay.php/91-OpenSL-ES-embedded-audio-acceleration">
Khronos Group public message board for OpenSL ES</a>
(please limit to non-Android questions)
</li>
</ul>
<p>
For convenience, we have included a copy of the OpenSL ES 1.0.1
specification with the NDK in
<code>docs/opensles/OpenSL_ES_Specification_1.0.1.pdf</code>.
</p>

<p>
Miscellaneous:
</p>
<ul>
<li><a href="http://en.wikipedia.org/wiki/Java_Native_Interface">JNI</a>
</li>
<li><a href="http://stackoverflow.com/search?q=android+audio">
Stack Overflow</a>
</li>
<li>web search for "interactive audio", "game audio", "sound design",
"audio programming", "audio content", "audio formats", etc.
</li>
<li><a href="http://en.wikipedia.org/wiki/Advanced_Audio_Coding">AAC</a>
</li>
</ul>

</body>
</html>