# Using AudioRenderer for Audio Playback

The AudioRenderer is used to play Pulse Code Modulation (PCM) audio data. Unlike the [AVPlayer](../media/using-avplayer-for-playback.md), the AudioRenderer can perform data preprocessing before audio input. Therefore, the AudioRenderer is more suitable if you have extensive audio development experience and want to implement more flexible playback features.

## Development Guidelines

The full rendering process involves creating an AudioRenderer instance, configuring audio rendering parameters, starting and stopping rendering, and releasing the instance. In this topic, you will learn how to use the AudioRenderer to render audio data. Before the development, you are advised to read [AudioRenderer](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md) for the API reference.

The figure below shows the state changes of the AudioRenderer. After an AudioRenderer instance is created, different APIs can be called to switch the AudioRenderer to different states and trigger the required behavior. If an API is called when the AudioRenderer is not in the given state, the system may throw an exception or generate other undefined behavior. Therefore, you are advised to check the AudioRenderer state before triggering a state transition.

To prevent the UI thread from being blocked, most AudioRenderer calls are asynchronous. Each API provides both callback and promise versions. The following examples use the callback versions.

**Figure 1** AudioRenderer state transition

During application development, you are advised to use [on('stateChange')](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md#onstatechange8) to subscribe to state changes of the AudioRenderer. This is because some operations can be performed only when the AudioRenderer is in a given state.
If the application performs an operation when the AudioRenderer is not in the given state, the system may throw an exception or generate other undefined behavior.

- **prepared**: The AudioRenderer enters this state by calling [createAudioRenderer()](../../reference/apis-audio-kit/arkts-apis-audio-f.md#audiocreateaudiorenderer8).

- **running**: The AudioRenderer enters this state by calling [start()](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md#start8) when it is in the **prepared**, **paused**, or **stopped** state.

- **paused**: The AudioRenderer enters this state by calling [pause()](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md#pause8) when it is in the **running** state. When the audio playback is paused, it can call [start()](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md#start8) to resume the playback.

- **stopped**: The AudioRenderer enters this state by calling [stop()](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md#stop8) when it is in the **paused** or **running** state.

- **released**: The AudioRenderer enters this state by calling [release()](../../reference/apis-audio-kit/arkts-apis-audio-AudioRenderer.md#release8) when it is in the **prepared**, **paused**, or **stopped** state. In this state, the AudioRenderer releases all occupied hardware and software resources and will not transition to any other state.

### How to Develop

1. Set audio rendering parameters and create an AudioRenderer instance. For details about the parameters, see [AudioRendererOptions](../../reference/apis-audio-kit/arkts-apis-audio-i.md#audiorendereroptions8).

   ```ts
   import { audio } from '@kit.AudioKit';

   let audioStreamInfo: audio.AudioStreamInfo = {
     samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
     channels: audio.AudioChannel.CHANNEL_2, // Channel.
     sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
     encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
   };

   let audioRendererInfo: audio.AudioRendererInfo = {
     usage: audio.StreamUsage.STREAM_USAGE_MUSIC, // Audio stream usage type: music. Set this parameter based on the service scenario.
     rendererFlags: 0 // AudioRenderer flag.
   };

   let audioRendererOptions: audio.AudioRendererOptions = {
     streamInfo: audioStreamInfo,
     rendererInfo: audioRendererInfo
   };

   audio.createAudioRenderer(audioRendererOptions, (err, data) => {
     if (err) {
       console.error(`Invoke createAudioRenderer failed, code is ${err.code}, message is ${err.message}`);
       return;
     } else {
       console.info('Invoke createAudioRenderer succeeded.');
       let audioRenderer = data;
     }
   });
   ```

2. Call **on('writeData')** to subscribe to the callback for audio data writing. You are advised to use this function as provided in API version 12, since that version returns a callback result.

   - From API version 12, this function returns a callback result, enabling the system to determine whether to play the data in the callback based on the value returned.

     > **NOTE**
     >
     > - When the amount of data is sufficient to meet the required buffer length of the callback, you should return **audio.AudioDataCallbackResult.VALID**, and the system uses the entire data buffer for playback. Do not return **audio.AudioDataCallbackResult.VALID** when the buffer is not completely filled, as this leads to audio artifacts such as noise and playback stuttering.
     >
     > - When the amount of data is insufficient to meet the required buffer length of the callback, you are advised to return **audio.AudioDataCallbackResult.INVALID**. In this case, the system does not process this portion of audio data but requests data from the application again. Once the buffer is adequately filled, you can return **audio.AudioDataCallbackResult.VALID**.
     >
     > - Once the callback function finishes its execution, the audio service queues the data in the buffer for playback. Therefore, do not change the buffered data outside the callback. Regarding the last frame, if there is insufficient data to completely fill the buffer, you must concatenate the available data with padding to ensure that the buffer is full. This prevents any residual dirty data in the buffer from adversely affecting the playback effect.

     ```ts
     import { audio } from '@kit.AudioKit';
     import { BusinessError } from '@kit.BasicServicesKit';
     import { fileIo as fs } from '@kit.CoreFileKit';
     import { common } from '@kit.AbilityKit';

     class Options {
       offset?: number;
       length?: number;
     }

     let bufferSize: number = 0;
     // Obtain the context from the component and ensure that the return value of this.getUIContext().getHostContext() is UIAbilityContext.
     let context = this.getUIContext().getHostContext() as common.UIAbilityContext;
     let path = context.cacheDir;
     // Ensure that the resource exists in the path.
     let filePath = path + '/StarWars10s-2C-48000-4SW.pcm';
     let file: fs.File = fs.openSync(filePath, fs.OpenMode.READ_ONLY);

     let writeDataCallback = (buffer: ArrayBuffer) => {
       let options: Options = {
         offset: bufferSize,
         length: buffer.byteLength
       };

       try {
         fs.readSync(file.fd, buffer, options);
         bufferSize += buffer.byteLength;
         // The system determines that the buffer is valid and plays the data normally.
         return audio.AudioDataCallbackResult.VALID;
       } catch (error) {
         console.error('Error reading file:', error);
         // The system determines that the buffer is invalid and does not play the data.
         return audio.AudioDataCallbackResult.INVALID;
       }
     };

     audioRenderer.on('writeData', writeDataCallback);
     ```

   - In API version 11, this function does not return a callback result, and the system treats all data in the callback as valid by default.

     > **NOTE**
     >
     > - Ensure that the callback's data buffer is completely filled to the necessary length to prevent issues such as audio noise and playback stuttering.
     >
     > - If the amount of data is insufficient to fill the data buffer, you are advised to temporarily halt data writing (without pausing the audio stream), block the callback function, and wait until enough data accumulates before resuming writing, thereby ensuring that the buffer is fully filled. If you need to call AudioRenderer APIs after the callback function is blocked, unblock the callback function first.
     >
     > - If you do not want to play the audio data in this callback function, you can nullify the data block in the callback function. (Once nullified, the system still regards this as part of the written data, leading to silent frames during playback.)
     >
     > - Once the callback function finishes its execution, the audio service queues the data in the buffer for playback. Therefore, do not change the buffered data outside the callback. Regarding the last frame, if there is insufficient data to completely fill the buffer, you must concatenate the available data with padding to ensure that the buffer is full. This prevents any residual dirty data in the buffer from adversely affecting the playback effect.

     ```ts
     import { BusinessError } from '@kit.BasicServicesKit';
     import { fileIo as fs } from '@kit.CoreFileKit';
     import { common } from '@kit.AbilityKit';

     class Options {
       offset?: number;
       length?: number;
     }

     let bufferSize: number = 0;
     // Obtain the context from the component and ensure that the return value of this.getUIContext().getHostContext() is UIAbilityContext.
     let context = this.getUIContext().getHostContext() as common.UIAbilityContext;
     let path = context.cacheDir;
     // Ensure that the resource exists in the path.
     let filePath = path + '/StarWars10s-2C-48000-4SW.pcm';
     let file: fs.File = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
     let writeDataCallback = (buffer: ArrayBuffer) => {
       // If you do not want to play a particular portion of the buffer, you can add a check and clear that specific section of the buffer.
       let options: Options = {
         offset: bufferSize,
         length: buffer.byteLength
       };
       fs.readSync(file.fd, buffer, options);
       bufferSize += buffer.byteLength;
     };

     audioRenderer.on('writeData', writeDataCallback);
     ```

3. Call **start()** to switch the AudioRenderer to the **running** state and start rendering.

   ```ts
   import { BusinessError } from '@kit.BasicServicesKit';

   audioRenderer.start((err: BusinessError) => {
     if (err) {
       console.error(`Renderer start failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer start success.');
     }
   });
   ```

4. Call **stop()** to stop rendering.

   ```ts
   import { BusinessError } from '@kit.BasicServicesKit';

   audioRenderer.stop((err: BusinessError) => {
     if (err) {
       console.error(`Renderer stop failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer stopped.');
     }
   });
   ```

5. Call **release()** to release the instance.

   ```ts
   import { BusinessError } from '@kit.BasicServicesKit';

   audioRenderer.release((err: BusinessError) => {
     if (err) {
       console.error(`Renderer release failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer released.');
     }
   });
   ```

### Selecting the Correct Stream Usage

When developing a media player, it is important to set the stream usage type correctly for the intended use case. This ensures that the player behaves as expected in different scenarios.

The recommended use cases are described in [StreamUsage](../../reference/apis-audio-kit/arkts-apis-audio-e.md#streamusage). For example, **STREAM_USAGE_MUSIC** is recommended for music scenarios, **STREAM_USAGE_MOVIE** for movie or video scenarios, and **STREAM_USAGE_GAME** for gaming scenarios.

An incorrect **StreamUsage** configuration may cause unexpected behavior. Example scenarios are as follows:

- When **STREAM_USAGE_MUSIC** is incorrectly used in a game scenario, the game cannot be played simultaneously with music applications, even though games usually can coexist with music playback.
- When **STREAM_USAGE_MUSIC** is incorrectly used in a navigation scenario, any playing music is interrupted when the navigation application provides audio guidance, even though it is generally expected that the music keeps playing at a lower volume while the navigation is active.

### Configuring the Appropriate Audio Sampling Rate

The sampling rate is the number of samples captured per second for a single audio channel, measured in Hz.

Resampling involves upsampling (adding samples through interpolation) or downsampling (removing samples through decimation) when there is a mismatch between the input and output audio sampling rates.

The AudioRenderer supports all sampling rates defined in the enum **AudioSamplingRate**.
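
For intuition, the raw data rate of a PCM stream follows directly from the sampling rate, channel count, and sample format. The sketch below is illustrative only; the helper name is ours, not an Audio Kit API.

```typescript
// Illustrative helper (not part of the Audio Kit API): computes the raw byte
// rate of a PCM stream from the parameters used throughout this topic.
function pcmByteRate(samplingRateHz: number, channelCount: number, bytesPerSample: number): number {
  return samplingRateHz * channelCount * bytesPerSample;
}

// 48 kHz stereo with SAMPLE_FORMAT_S16LE (2 bytes per sample):
console.log(pcmByteRate(48000, 2, 2)); // 192000 bytes per second
```

At 48 kHz stereo 16-bit, each second of audio occupies 192,000 bytes, which is also the rate at which the **on('writeData')** callback must be supplied with data.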

If the input audio sampling rate configured for the AudioRenderer is different from the output sampling rate of the device, the system resamples the input audio to match the output sampling rate.

To minimize power consumption from resampling, it is best to use input audio with a sampling rate that matches the output sampling rate of the device. A sampling rate of 48 kHz is highly recommended.

### Sample Code

Refer to the sample code below to render an audio file using AudioRenderer.

```ts
import { audio } from '@kit.AudioKit';
import { BusinessError } from '@kit.BasicServicesKit';
import { fileIo as fs } from '@kit.CoreFileKit';
import { common } from '@kit.AbilityKit';

const TAG = 'AudioRendererDemo';

class Options {
  offset?: number;
  length?: number;
}

let bufferSize: number = 0;
let renderModel: audio.AudioRenderer | undefined = undefined;
let audioStreamInfo: audio.AudioStreamInfo = {
  samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
  channels: audio.AudioChannel.CHANNEL_2, // Channel.
  sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
  encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
};
let audioRendererInfo: audio.AudioRendererInfo = {
  usage: audio.StreamUsage.STREAM_USAGE_MUSIC, // Audio stream usage type: music. Set this parameter based on the service scenario.
  rendererFlags: 0 // AudioRenderer flag.
};
let audioRendererOptions: audio.AudioRendererOptions = {
  streamInfo: audioStreamInfo,
  rendererInfo: audioRendererInfo
};
// Obtain the context from the component and ensure that the return value of this.getUIContext().getHostContext() is UIAbilityContext.
let context = this.getUIContext().getHostContext() as common.UIAbilityContext;
let path = context.cacheDir;
// Ensure that the resource exists in the path.
let filePath = path + '/StarWars10s-2C-48000-4SW.pcm';
let file: fs.File = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let writeDataCallback = (buffer: ArrayBuffer) => {
  let options: Options = {
    offset: bufferSize,
    length: buffer.byteLength
  };

  try {
    let bufferLength = fs.readSync(file.fd, buffer, options);
    bufferSize += buffer.byteLength;
    // If the data passed in the current callback is less than one frame, the blank areas must be filled with silent data to avoid playback noise.
    if (bufferLength < buffer.byteLength) {
      let view = new DataView(buffer);
      for (let i = bufferLength; i < buffer.byteLength; i++) {
        // For blank areas, silent data should be used. When using the SAMPLE_FORMAT_U8 audio sampling format, 0x7F represents silent data. For other sampling formats, 0 is used as silent data.
        view.setUint8(i, 0);
      }
    }
    // This function does not return a callback result in API version 11, but does so in API version 12 and later versions.
    // If you do not want to play a certain buffer, return audio.AudioDataCallbackResult.INVALID.
    return audio.AudioDataCallbackResult.VALID;
  } catch (error) {
    console.error('Error reading file:', error);
    // This function does not return a callback result in API version 11, but does so in API version 12 and later versions.
    return audio.AudioDataCallbackResult.INVALID;
  }
};

// Create an AudioRenderer instance, and set the events to listen for.
function init() {
  audio.createAudioRenderer(audioRendererOptions, (err, renderer) => { // Create an AudioRenderer instance.
    if (!err) {
      console.info(`${TAG}: creating AudioRenderer success`);
      renderModel = renderer;
      if (renderModel !== undefined) {
        (renderModel as audio.AudioRenderer).on('writeData', writeDataCallback);
      }
    } else {
      console.error(`${TAG}: creating AudioRenderer failed, error: ${err.message}`);
    }
  });
}

// Start audio rendering.
function start() {
  if (renderModel !== undefined) {
    let stateGroup = [audio.AudioState.STATE_PREPARED, audio.AudioState.STATE_PAUSED, audio.AudioState.STATE_STOPPED];
    if (stateGroup.indexOf((renderModel as audio.AudioRenderer).state.valueOf()) === -1) { // Rendering can be started only when the AudioRenderer is in the prepared, paused, or stopped state.
      console.error(TAG + ': start failed');
      return;
    }
    // Start rendering.
    (renderModel as audio.AudioRenderer).start((err: BusinessError) => {
      if (err) {
        console.error('Renderer start failed.');
      } else {
        console.info('Renderer start success.');
      }
    });
  }
}

// Pause the rendering.
function pause() {
  if (renderModel !== undefined) {
    // Rendering can be paused only when the AudioRenderer is in the running state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING) {
      console.info('Renderer is not running');
      return;
    }
    // Pause the rendering.
    (renderModel as audio.AudioRenderer).pause((err: BusinessError) => {
      if (err) {
        console.error('Renderer pause failed.');
      } else {
        console.info('Renderer pause success.');
      }
    });
  }
}

// Stop rendering.
async function stop() {
  if (renderModel !== undefined) {
    // Rendering can be stopped only when the AudioRenderer is in the running or paused state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING && (renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is not running or paused.');
      return;
    }
    // Stop rendering.
    (renderModel as audio.AudioRenderer).stop((err: BusinessError) => {
      if (err) {
        console.error('Renderer stop failed.');
      } else {
        fs.close(file);
        console.info('Renderer stop success.');
      }
    });
  }
}

// Release the instance.
async function release() {
  if (renderModel !== undefined) {
    // The AudioRenderer can be released only when it is not in the released state.
    if (renderModel.state.valueOf() === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer already released');
      return;
    }
    // Release the resources.
    (renderModel as audio.AudioRenderer).release((err: BusinessError) => {
      if (err) {
        console.error('Renderer release failed.');
      } else {
        console.info('Renderer release success.');
      }
    });
  }
}
```

When audio streams with the same or higher priority need to use the output device, the current audio playback will be interrupted. The application can respond to and handle the interruption event. For details, see [Processing Audio Interruption Events](audio-playback-concurrency.md).
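
As a minimal sketch of how such an interruption might be handled, the code below separates the decision logic (pure and testable) from the platform wiring. The helper name and its string values are ours, for illustration only; the commented wiring assumes the **renderModel** instance from the sample code above and the AudioRenderer's **on('audioInterrupt')** subscription.

```typescript
// Hypothetical helper (names and string values are ours): decide how the
// application should react to an interruption, based on whether the system
// has already enforced the action (forced) and on the interrupt hint.
type InterruptAction = 'pause' | 'resume' | 'stop' | 'none';

function decideInterruptAction(forced: boolean, hint: InterruptAction): InterruptAction {
  if (forced) {
    // The system has already paused/stopped the stream; the application only
    // needs to update its own state and UI, not call pause()/stop() again.
    return 'none';
  }
  // For a shared (non-forced) interruption, the application is expected to
  // act on the hint itself, for example by calling pause() or start().
  return hint;
}

// Platform wiring sketch (HarmonyOS only, shown as comments):
//
// renderModel.on('audioInterrupt', (event: audio.InterruptEvent) => {
//   const forced = event.forceType === audio.InterruptForceType.INTERRUPT_FORCE;
//   switch (decideInterruptAction(forced, mapHint(event.hintType))) { // mapHint: your own hint-to-action mapping
//     case 'pause': renderModel?.pause(() => {}); break;
//     case 'resume': renderModel?.start(() => {}); break;
//     case 'stop': renderModel?.stop(() => {}); break;
//     default: break; // update UI state only
//   }
// });
```

Keeping the decision in a pure function makes the interruption policy easy to unit-test without a device, while the subscription callback stays a thin adapter.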