# Full Guide To Oboe

Oboe is a C++ library which makes it easy to build high-performance audio apps on Android. Apps communicate with Oboe by reading and writing data to streams.

## Audio streams

Oboe moves audio data between your app and the audio inputs and outputs on your Android device. Your app passes data in and out using a callback function or by reading from and writing to *audio streams*, represented by the class `AudioStream`. The read/write calls can be blocking or non-blocking.

A stream is defined by the following:

* The *audio device* that is the source or sink for the data in the stream.
* The *sharing mode* that determines whether a stream has exclusive access to an audio device that might otherwise be shared among multiple streams.
* The *format* of the audio data in the stream.

### Audio device

Each stream is attached to a single audio device.

An audio device is a hardware interface or virtual endpoint that acts as a source or sink for a continuous stream of digital audio data. Don't confuse an *audio device* (a built-in mic or Bluetooth headset) with the *Android device* (the phone or watch) that is running your app.

On API 23 and above you can use the `AudioManager` method [getDevices()](https://developer.android.com/reference/android/media/AudioManager.html#getDevices(int)) to discover the audio devices that are available on your Android device. The method returns information about the [type](https://developer.android.com/reference/android/media/AudioDeviceInfo.html) of each device.

Each audio device has a unique ID on the Android device. You can use the ID to bind an audio stream to a specific audio device. However, in most cases you can let Oboe choose the default primary device rather than specifying one yourself.

The audio device attached to a stream determines whether the stream is for input or output. A stream can only move data in one direction. When you define a stream you also set its direction. When you open a stream, Android checks to ensure that the audio device and stream direction agree.

### Sharing mode

A stream has a sharing mode:

* `SharingMode::Exclusive` (available on API 26+) means the stream has exclusive access to an endpoint on its audio device; the endpoint cannot be used by any other audio stream. If the exclusive endpoint is already in use, it might not be possible for the stream to obtain access to it. Exclusive streams provide the lowest possible latency by bypassing the mixer stage, but they are also more likely to get disconnected. You should close exclusive streams as soon as you no longer need them, so that other apps can access that endpoint. Not all audio devices provide exclusive endpoints. System sounds and sounds from other apps can still be heard when an exclusive stream is in use because they use a different endpoint.

![Oboe exclusive sharing mode diagram](images/oboe-sharing-mode-exclusive.jpg)

* `SharingMode::Shared` allows Oboe streams to share an endpoint. The operating system will mix all the shared streams assigned to the same endpoint on the audio device.

![Oboe shared sharing mode diagram](images/oboe-sharing-mode-shared.jpg)

You can explicitly request the sharing mode when you create a stream, although you are not guaranteed to receive that mode. By default, the sharing mode is `Shared`.
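Because the request is only a hint, it helps to check the granted mode once the stream is open. Here is a minimal sketch using the `AudioStreamBuilder` and `openStream()` described later in this guide, assuming `#include <oboe/Oboe.h>` and `using namespace oboe;`:

```
AudioStreamBuilder builder;
builder.setSharingMode(SharingMode::Exclusive);

AudioStream *stream = nullptr;
Result result = builder.openStream(&stream);
if (result == Result::OK &&
    stream->getSharingMode() != SharingMode::Exclusive) {
    // The system granted a Shared endpoint instead; expect somewhat higher latency.
}
```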
### Audio format

The data passed through a stream has the usual digital audio attributes, which you must specify when you define a stream. These are as follows:

* Sample format
* Samples per frame
* Sample rate

Oboe permits these sample formats:

| AudioFormat | C data type | Notes |
| :---------- | :---------- | :---- |
| I16 | int16_t | common 16-bit samples, [Q0.15 format](https://source.android.com/devices/audio/data_formats#androidFormats) |
| Float | float | -1.0 to +1.0 |

Oboe might perform sample conversion on its own. For example, if an app is writing `AudioFormat::Float` data but the HAL uses `AudioFormat::I16`, Oboe might convert the samples automatically. Conversion can happen in either direction. If your app processes audio input, it is wise to verify the input format and be prepared to convert data if necessary, as in this example:

    AudioFormat dataFormat = stream->getDataFormat();
    //... later
    if (dataFormat == AudioFormat::I16) {
        convertFloatToPcm16(...)
    }
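The `convertFloatToPcm16(...)` call above stands in for whatever conversion routine your app uses. If you need to hand-roll one, a minimal sketch might look like this (the function signature and clamping behavior are illustrative, not part of the Oboe API):

```
#include <cstdint>

// Convert interleaved float samples (-1.0 to +1.0) into 16-bit PCM (Q0.15).
// numSamples is the total sample count: numFrames * channelCount.
void convertFloatToPcm16(const float *input, int16_t *output, int32_t numSamples) {
    for (int32_t i = 0; i < numSamples; i++) {
        float sample = input[i] * 32768.0f;
        // Clamp to the valid int16_t range so out-of-range samples don't wrap around.
        if (sample > 32767.0f) sample = 32767.0f;
        if (sample < -32768.0f) sample = -32768.0f;
        output[i] = static_cast<int16_t>(sample);
    }
}
```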
## Creating an audio stream

The Oboe library follows a [builder design pattern](https://en.wikipedia.org/wiki/Builder_pattern) and provides the class `AudioStreamBuilder`.

### Set the audio stream configuration using an AudioStreamBuilder

Use the builder functions that correspond to the stream parameters. These optional set functions are available:

    AudioStreamBuilder streamBuilder;

    streamBuilder.setDeviceId(deviceId);
    streamBuilder.setDirection(direction);
    streamBuilder.setSharingMode(shareMode);
    streamBuilder.setSampleRate(sampleRate);
    streamBuilder.setChannelCount(channelCount);
    streamBuilder.setFormat(format);
    streamBuilder.setPerformanceMode(perfMode);

Note that these methods do not report errors, such as an undefined constant or a value out of range. The values are checked when the stream is opened.

If you do not specify the deviceId, the default is the primary output device.
If you do not specify the stream direction, the default is an output stream.
For all parameters, you can explicitly set a value, or let the system
assign the optimal value by not specifying the parameter at all or setting
it to `kUnspecified`.

To be safe, check the state of the audio stream after you create it, as explained in "Verifying stream configuration and additional properties", below.

### Open the Stream

After you've configured the `AudioStreamBuilder`, call `openStream()` to open the stream:

    Result result = streamBuilder.openStream(&stream_);
    if (result != Result::OK){
        __android_log_print(ANDROID_LOG_ERROR,
                            "AudioEngine",
                            "Error opening stream %s",
                            convertToText(result));
    }

### Verifying stream configuration and additional properties

You should verify the stream's configuration after opening it.

The following properties are guaranteed to be set. However, if these properties
are unspecified, a default value will still be set, and should be queried by the
appropriate accessor.

* callback
* framesPerCallback
* sampleRate
* channelCount
* format
* direction

The following properties may be changed by the underlying stream construction
*even if explicitly set* and therefore should always be queried by the appropriate
accessor. The property settings will depend on device capabilities.

* bufferCapacityInFrames
* sharingMode (exclusive provides lowest latency)
* performanceMode

The following properties are only set by the underlying stream. They cannot be
set by the application, but should be queried by the appropriate accessor.

* framesPerBurst

The following properties have unusual behavior:

* deviceId is respected when the underlying API is AAudio (API level >= 28), but not when it
  is OpenSLES. It can be set regardless, and *will not* cause an error if an OpenSLES stream
  is used; the default device will simply be used instead of the one specified.

* mAudioApi is only a property of the builder. However,
  `AudioStream::getAudioApi()` can be used to query the underlying API which the
  stream uses. The property set in the builder is not guaranteed, and in
  general the API should be left for Oboe to choose for the best performance and
  stability. Since Oboe is designed to be as uniform across both
  APIs as possible, this property should not generally be needed.

* mBufferSizeInFrames can only be set on an already open stream (as opposed to a
  builder), since it depends on run-time behavior.
  The actual size used may not be what was requested.
  Oboe or the underlying API will limit the size between zero and the buffer capacity.
  It may also be limited further to reduce glitching on particular devices.
  This feature is not supported when using OpenSL ES callbacks.

Many of the stream's properties may vary (whether or not you set
them) depending on the capabilities of the audio device and the Android device on
which it's running. If you need to know these values then you must query them using
the accessor after the stream has been opened. Additionally, the underlying
parameters a stream was actually granted are useful to know when you left them
unspecified. As a matter of good defensive programming, you should check the
stream's configuration before using it, as in the sketch following the table below.

There are functions to retrieve the stream setting that corresponds to each
builder setting:

| AudioStreamBuilder set methods | AudioStream get methods |
| :----------------------------- | :---------------------- |
| `setCallback()` | `getCallback()` |
| `setDirection()` | `getDirection()` |
| `setSharingMode()` | `getSharingMode()` |
| `setPerformanceMode()` | `getPerformanceMode()` |
| `setSampleRate()` | `getSampleRate()` |
| `setChannelCount()` | `getChannelCount()` |
| `setFormat()` | `getFormat()` |
| `setBufferCapacityInFrames()` | `getBufferCapacityInFrames()` |
| `setFramesPerCallback()` | `getFramesPerCallback()` |
| -- | `getFramesPerBurst()` |
| `setDeviceId()` (not respected on OpenSLES) | `getDeviceId()` |
| `setAudioApi()` (mainly for debugging) | `getAudioApi()` |
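For example, here is a minimal sketch of that kind of post-open check. It assumes `stream_` is the `AudioStream*` opened above and that `<android/log.h>` is included for logging; the checks themselves are illustrative:

```
int32_t sampleRate     = stream_->getSampleRate();
int32_t channelCount   = stream_->getChannelCount();
int32_t framesPerBurst = stream_->getFramesPerBurst();

if (stream_->getSharingMode() != SharingMode::Exclusive) {
    // A Shared endpoint may have been granted even if Exclusive was requested.
}
if (stream_->getPerformanceMode() != PerformanceMode::LowLatency) {
    // The device could not provide a low-latency path for this configuration.
}

__android_log_print(ANDROID_LOG_INFO, "AudioEngine",
                    "Opened stream: %d Hz, %d channels, burst = %d frames",
                    sampleRate, channelCount, framesPerBurst);
```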
The following AudioStreamBuilder fields were added in API 28 to
specify additional information about the AudioStream to the device. Currently,
they have little effect on the stream, but setting them helps applications
interact better with other services.

For more information see [Usage/ContentTypes](https://source.android.com/devices/audio/attributes).
The InputPreset may be used by the device to process the input stream (such as gain control). By default
it is set to VoiceRecognition, which is optimized for low latency.

* `setUsage(oboe::Usage usage)` - The purpose for creating the stream.
* `setContentType(oboe::ContentType contentType)` - The type of content carried
  by the stream.
* `setInputPreset(oboe::InputPreset inputPreset)` - The recording configuration
  for an audio input.
* `setSessionId(SessionId sessionId)` - Allocate a SessionID to connect to the
  Java AudioEffects API.

## Using an audio stream

### State transitions

An Oboe stream is usually in one of five stable states (the error state, Disconnected, is described at the end of this section):

* Open
* Started
* Paused
* Flushed
* Stopped

Data only flows through a stream when the stream is in the Started state. To
move a stream between states, use one of the functions that request a state
transition:

    Result result;
    result = stream->requestStart();
    result = stream->requestStop();
    result = stream->requestPause();
    result = stream->requestFlush();

Note that you can only request pause or flush on an output stream.

These functions are asynchronous, and the state change doesn't happen
immediately. When you request a state change, the stream moves to one of the
corresponding transient states:

* Starting
* Pausing
* Flushing
* Stopping
* Closing

The state diagram below shows the stable states as rounded rectangles, and the transient states as dotted rectangles.
Though it's not shown, you can call `close()` from any state.

![Oboe Lifecycle](images/oboe-lifecycle.png)

Oboe doesn't provide callbacks to alert you to state changes. One special
function, `AudioStream::waitForStateChange()`, can be used to wait for a state change.
Note that most apps will not need to call `waitForStateChange()` and can just
request state changes whenever they are needed.

The function does not detect a state change on its own, and does not wait for a
specific state. It waits until the current state
is *different* from `inputState`, which you specify.

For example, after requesting to pause, a stream should immediately enter
the transient state Pausing, and arrive sometime later at the Paused state - though there's no guarantee it will.
Since you can't wait for the Paused state, use `waitForStateChange()` to wait for *any state
other than Pausing*. Here's how that's done:

```
StreamState inputState = StreamState::Pausing;
StreamState nextState = StreamState::Uninitialized;
int64_t timeoutNanos = 100 * kNanosPerMillisecond;
result = stream->requestPause();
result = stream->waitForStateChange(inputState, &nextState, timeoutNanos);
```

If the stream's state is not Pausing (the `inputState`, which we assumed was the
current state at call time), the function returns immediately. Otherwise, it
blocks until the state is no longer Pausing or the timeout expires. When the
function returns, the parameter `nextState` shows the current state of the
stream.

You can use this same technique after calling request start, stop, or flush,
using the corresponding transient state as the inputState. Do not call
`waitForStateChange()` after calling `AudioStream::close()`, since the underlying stream resources
will be deleted as soon as it closes. And do not call `close()`
while `waitForStateChange()` is running in another thread.
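For instance, a minimal sketch of the same pattern applied after `requestStart()`, reusing the names from the pausing example above:

```
StreamState inputState = StreamState::Starting;
StreamState nextState = StreamState::Uninitialized;
int64_t timeoutNanos = 100 * kNanosPerMillisecond;
Result result = stream->requestStart();
result = stream->waitForStateChange(inputState, &nextState, timeoutNanos);
// On success, nextState is typically StreamState::Started.
```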
### Reading and writing to an audio stream

There are two ways to move data in or out of a stream:

1) Read from or write directly to the stream.
2) Specify a callback object that will get called when the stream is ready.

The callback technique offers the lowest-latency performance because the callback code can run in a high-priority thread.
Also, attempting to open a low-latency output stream without an audio callback (with the intent to use writes)
may result in a stream that is not low latency.

The read/write technique may be easier when you do not need low latency. Or, when doing both input and output, it is common to use a callback for output and then just do a non-blocking read from the input stream. Then you have both the input and output data available in one high-priority thread.

After the stream is started you can read or write to it using the methods
`AudioStream::read(buffer, numFrames, timeoutNanos)`
and
`AudioStream::write(buffer, numFrames, timeoutNanos)`.

For a blocking read or write that transfers the specified number of frames, set timeoutNanos greater than zero. For a non-blocking call, set timeoutNanos to zero. In this case the result is the actual number of frames transferred.

When you read input, you should verify that the correct number of
frames was read. If not, the buffer might contain unknown data that could cause an
audio glitch. You can pad the buffer with zeros to create a
silent dropout:

    auto result = stream.read(audioData, numFrames, timeout);
    if (result == Result::OK) {
        if (result.value() != numFrames) {
            // pad the rest of the buffer with zeros
            memset(static_cast<sample_type*>(audioData) + result.value() * samplesPerFrame, 0,
                   (numFrames - result.value()) * stream.getBytesPerFrame());
        }
    } else {
        // Error!
    }

You can prime the stream's buffer before starting the stream by writing data or silence into it. This must be done in a non-blocking call with timeoutNanos set to zero.

The data in the buffer must match the data format returned by `stream.getDataFormat()`.
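For example, here is a minimal sketch of priming an output stream with silence before starting it. It assumes a Float stream and `#include <vector>`; filling the whole reported buffer capacity is just one reasonable choice:

```
int32_t capacityFrames = stream->getBufferCapacityInFrames();
std::vector<float> silence(capacityFrames * stream->getChannelCount(), 0.0f);

// timeoutNanos == 0 makes the write non-blocking; the result holds the number
// of frames actually written, which may be less than requested.
auto result = stream->write(silence.data(), capacityFrames, 0);
if (result == Result::OK) {
    // result.value() frames of silence are queued before playback begins.
}
stream->requestStart();
```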
### Closing an audio stream

When you are finished using a stream, close it:

    stream->close();

Do not close a stream while it is being written to or read from by another thread, as this will cause your app to crash. After you close a stream you should not call any of its methods except for querying its properties.

### Disconnected audio stream

An audio stream can become disconnected at any time if one of these events happens:

* The associated audio device is no longer connected (for example when headphones are unplugged).
* An error occurs internally.
* An audio device is no longer the primary audio device.

When a stream is disconnected, it has the state "Disconnected" and calls to `write()` or other functions will return `Result::ErrorDisconnected`. When a stream is disconnected, all you can do is close it.

If you need to be informed when an audio device is disconnected, write a class
which extends `AudioStreamCallback` and then register your class using `builder.setCallback(yourCallbackClass)`.
If you register a callback, then it will automatically close the stream in a separate thread if the stream is disconnected.
Note that registering this callback will enable callbacks for both data and errors, so `onAudioReady()` will be called. See the "high priority callback" section below.

Your callback can implement the following methods (called in a separate thread):

* `onErrorBeforeClose(stream, error)` - called when the stream has been disconnected but not yet closed,
  so you can still reference the underlying stream (e.g. `getXRunCount()`).
  You can also inform any other threads that may be calling the stream to stop doing so.
  Do not delete the stream or modify its stream state in this callback.
* `onErrorAfterClose(stream, error)` - called when the stream has been stopped and closed by Oboe, so the stream cannot be used and calling `getState()` will return Closed.
  During this callback, stream properties (those requested by the builder) can be queried, as well as frames written and read.
  The stream can be deleted at the end of this method (as long as it is not referenced in other threads).
  Methods that reference the underlying stream should not be called (e.g. `getTimestamp()`, `getXRunCount()`, `read()`, `write()`, etc.).
  Opening a separate stream is also a valid use of this callback, especially if the error received is `Result::ErrorDisconnected` - see the sketch below.
  However, it is important to note that the new audio device may have vastly different properties than the stream that was disconnected.
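Here is a minimal sketch of that restart pattern. The `restart()` method is an assumed application function, not part of Oboe, that builds, opens and starts a replacement stream:

```
class EngineCallback : public AudioStreamCallback {
public:
    DataCallbackResult onAudioReady(AudioStream *oboeStream,
                                    void *audioData, int32_t numFrames) override {
        // ... render audio into audioData as usual ...
        return DataCallbackResult::Continue;
    }

    void onErrorAfterClose(AudioStream *oboeStream, Result error) override {
        // The old stream has already been stopped and closed by Oboe.
        if (error == Result::ErrorDisconnected) {
            restart();  // assumed app method: build, open and start a new stream
        }
    }

private:
    void restart();  // defined elsewhere in the app
};
```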
## Optimizing performance

You can optimize the performance of an audio application by using special high-priority threads.

### Using a high priority callback

If your app reads or writes audio data from an ordinary thread, it may be preempted or experience timing jitter. This can cause audio glitches.
Using larger buffers might guard against such glitches, but a large buffer also introduces longer audio latency.
For applications that require low latency, an audio stream can use an asynchronous callback function to transfer data to and from your app.
The callback runs in a high-priority thread that has better performance.

Your code can access the callback mechanism by implementing the virtual class
`AudioStreamCallback`. The stream periodically executes `onAudioReady()` (the
callback function) to acquire the data for its next burst.

    class AudioEngine : public AudioStreamCallback {
    public:
        DataCallbackResult onAudioReady(
                AudioStream *oboeStream,
                void *audioData,
                int32_t numFrames) {
            oscillator_->render(static_cast<float *>(audioData), numFrames);
            return DataCallbackResult::Continue;
        }

        bool start() {
            ...
            // register the callback
            streamBuilder.setCallback(this);
        }
    private:
        // application data
        Oscillator* oscillator_;
    };

Note that the callback must be registered on the stream with `setCallback()`. Any
application-specific data (such as `oscillator_` in this case)
can be included within the class itself.

The callback function should not perform a read or write on the stream that invoked it. If the callback belongs to an input stream, your code should process the data that is supplied in the audioData buffer (specified as the second argument). If the callback belongs to an output stream, your code should place data into the buffer.

It is possible to process more than one stream in the callback. You can use one stream as the master, and pass pointers to other streams in the class's private data. Register a callback for the master stream. Then use non-blocking I/O on the other streams. Here is an example of a round-trip callback that passes an input stream to an output stream. The master calling stream is the output stream. The input stream is included in the class.

The callback does a non-blocking read from the input stream, placing the data into the buffer of the output stream:

    class AudioEngine : public AudioStreamCallback {
    public:
        DataCallbackResult onAudioReady(
                AudioStream *oboeStream,
                void *audioData,
                int32_t numFrames) {
            const int64_t timeoutNanos = 0; // for a non-blocking read
            auto result = recordingStream->read(audioData, numFrames, timeoutNanos);
            // result has type ResultWithValue<int32_t>, which for convenience is coerced
            // to a Result type when compared with another Result.
            if (result == Result::OK) {
                if (result.value() < numFrames) {
                    // replace the missing data with silence
                    memset(static_cast<sample_type*>(audioData) + result.value() * samplesPerFrame, 0,
                           (numFrames - result.value()) * oboeStream->getBytesPerFrame());
                }
                return DataCallbackResult::Continue;
            }
            return DataCallbackResult::Stop;
        }

        bool start() {
            ...
            streamBuilder.setCallback(this);
        }

        void setRecordingStream(AudioStream *stream) {
            recordingStream = stream;
        }

    private:
        AudioStream *recordingStream;
    };

Note that in this example it is assumed that the input and output streams have the same number of channels, format and sample rate. The formats of the streams can be mismatched - as long as the code handles the translation properly.

#### Callback do's and don'ts

You should never perform an operation which could block inside `onAudioReady`. Examples of blocking operations include:

- allocating memory using, for example, malloc() or new
- file operations such as opening, closing, reading or writing
- network operations such as streaming
- using mutexes or other synchronization primitives
- sleeping
- stopping or closing the stream
- calling read() or write() on the stream which invoked it

The following methods are OK to call:

- AudioStream::get*()
- oboe::convertResultToText()
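One common way to satisfy these rules is to allocate everything the callback needs ahead of time, on an ordinary thread, and only reuse it inside `onAudioReady()`. A minimal sketch, with illustrative class and member names and `#include <vector>` assumed:

```
class SynthCallback : public AudioStreamCallback {
public:
    // Called from an ordinary thread, before the stream is started.
    void prepare(int32_t maxFramesPerCallback, int32_t channelCount) {
        mScratch.resize(static_cast<size_t>(maxFramesPerCallback) * channelCount);
    }

    DataCallbackResult onAudioReady(AudioStream *oboeStream,
                                    void *audioData, int32_t numFrames) override {
        // No allocation, locking, file or network I/O here; only reuse the
        // scratch buffer that was sized in prepare().
        // ... fill mScratch, then mix/copy it into audioData ...
        return DataCallbackResult::Continue;
    }

private:
    std::vector<float> mScratch;  // allocated before real-time use begins
};
```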
### Setting performance mode

Every AudioStream has a *performance mode* which has a large effect on your app's behavior. There are three modes:

* `PerformanceMode::None` is the default mode. It uses a basic stream that balances latency and power savings.
* `PerformanceMode::LowLatency` uses smaller buffers and an optimized data path for reduced latency.
* `PerformanceMode::PowerSaving` uses larger internal buffers and a data path that trades off latency for lower power.

You can select the performance mode by calling `setPerformanceMode()`,
and discover the current mode by calling `getPerformanceMode()`.

If low latency is more important than power savings in your application, use `PerformanceMode::LowLatency`.
This is useful for apps that are very interactive, such as games or keyboard synthesizers.

If saving power is more important than low latency in your application, use `PerformanceMode::PowerSaving`.
This is typical for apps that play back previously generated music, such as streaming audio or MIDI file players.

In the current version of Oboe, in order to achieve the lowest possible latency you must use the `PerformanceMode::LowLatency` performance mode along with a high-priority callback. Follow this example:

```
// Create a callback object
MyOboeStreamCallback myCallback;

// Create a stream builder
AudioStreamBuilder builder;
builder.setCallback(&myCallback);
builder.setPerformanceMode(PerformanceMode::LowLatency);

// Use it to create the stream
AudioStream *stream;
builder.openStream(&stream);
```

## Thread safety

The Oboe API is not completely [thread safe](https://en.wikipedia.org/wiki/Thread_safety).
You cannot call some of the Oboe functions concurrently from more than one thread at a time.
This is because Oboe avoids using mutexes, which can cause thread preemption and glitches.

To be safe, don't call `waitForStateChange()` or read or write to the same stream from two different threads. Similarly, don't close a stream in one thread while reading from or writing to it in another thread.

Calls that return stream settings, like `AudioStream::getSampleRate()` and `AudioStream::getChannelCount()`, are thread safe.

These calls are also thread safe:

* `convertToText()`
* `AudioStream::get*()` except for `getTimestamp()` and `getState()`

<b>Note:</b> When a stream uses a callback function, it's safe to read/write from the callback thread while also closing the stream
from the thread in which it is running.

## Code samples

Code samples are available in the [samples folder](../samples).

## Known Issues

The following methods are defined, but will return `Result::ErrorUnimplemented` for OpenSLES streams:

* `getFramesRead()`
* `getFramesWritten()`
* `getTimestamp()`

Additionally, `setDeviceId()` will not be respected by OpenSLES streams.
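If your app relies on these calls, it is worth guarding them. Here is a minimal sketch that skips timestamp-based calculations on OpenSL ES streams; it assumes the `getTimestamp()` overload that takes a `clockid_t` and two `int64_t` out-parameters (check the `AudioStream` header for the overloads in your Oboe version) and `#include <ctime>` for `CLOCK_MONOTONIC`:

```
int64_t framePosition = 0;
int64_t presentationTimeNanos = 0;

if (stream->getAudioApi() == AudioApi::AAudio) {
    Result result = stream->getTimestamp(CLOCK_MONOTONIC,
                                         &framePosition,
                                         &presentationTimeNanos);
    if (result != Result::OK) {
        // Fall back to a latency estimate that does not rely on timestamps.
    }
} else {
    // OpenSL ES stream: getTimestamp() would return Result::ErrorUnimplemented.
}
```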