# Using AudioRenderer for Audio Playback

The AudioRenderer is used to play Pulse Code Modulation (PCM) audio data. Unlike the [AVPlayer](../media/using-avplayer-for-playback.md), the AudioRenderer can perform data preprocessing before audio input. Therefore, the AudioRenderer is more suitable if you have extensive audio development experience and want to implement more flexible playback features.

## Development Guidelines

The full rendering process involves creating an **AudioRenderer** instance, configuring audio rendering parameters, starting and stopping rendering, and releasing the instance. In this topic, you will learn how to use the AudioRenderer to render audio data. Before the development, you are advised to read [AudioRenderer](../../reference/apis-audio-kit/js-apis-audio.md#audiorenderer8) for the API reference.

The figure below shows the state changes of the AudioRenderer. After an **AudioRenderer** instance is created, different APIs can be called to switch the AudioRenderer to different states and trigger the required behavior. If an API is called when the AudioRenderer is not in the given state, the system may throw an exception or generate other undefined behavior. Therefore, you are advised to check the AudioRenderer state before triggering a state transition.

To prevent the UI thread from being blocked, most **AudioRenderer** calls are asynchronous. Each API provides both a callback and a promise version. The following examples use the callback functions.

**Figure 1** AudioRenderer state transition

During application development, you are advised to use [on('stateChange')](../../reference/apis-audio-kit/js-apis-audio.md#onstatechange-8) to subscribe to state changes of the AudioRenderer. This is because some operations can be performed only when the AudioRenderer is in a given state.
If the application performs an operation when the AudioRenderer is not in the given state, the system may throw an exception or generate other undefined behavior.

- **prepared**: The AudioRenderer enters this state by calling [createAudioRenderer()](../../reference/apis-audio-kit/js-apis-audio.md#audiocreateaudiorenderer8).

- **running**: The AudioRenderer enters this state by calling [start()](../../reference/apis-audio-kit/js-apis-audio.md#start8) when it is in the **prepared**, **paused**, or **stopped** state.

- **paused**: The AudioRenderer enters this state by calling [pause()](../../reference/apis-audio-kit/js-apis-audio.md#pause8) when it is in the **running** state. When the audio playback is paused, the application can call [start()](../../reference/apis-audio-kit/js-apis-audio.md#start8) to resume the playback.

- **stopped**: The AudioRenderer enters this state by calling [stop()](../../reference/apis-audio-kit/js-apis-audio.md#stop8) when it is in the **paused** or **running** state.

- **released**: The AudioRenderer enters this state by calling [release()](../../reference/apis-audio-kit/js-apis-audio.md#release8) when it is in the **prepared**, **paused**, or **stopped** state. In this state, the AudioRenderer releases all occupied hardware and software resources and will not transit to any other state.

### How to Develop

1. Set audio rendering parameters and create an **AudioRenderer** instance. For details about the parameters, see [AudioRendererOptions](../../reference/apis-audio-kit/js-apis-audio.md#audiorendereroptions8).

   ```ts
   import { audio } from '@kit.AudioKit';

   let audioStreamInfo: audio.AudioStreamInfo = {
     samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
     channels: audio.AudioChannel.CHANNEL_2, // Channel.
     sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
     encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
   };

   let audioRendererInfo: audio.AudioRendererInfo = {
     usage: audio.StreamUsage.STREAM_USAGE_MUSIC,
     rendererFlags: 0
   };

   let audioRendererOptions: audio.AudioRendererOptions = {
     streamInfo: audioStreamInfo,
     rendererInfo: audioRendererInfo
   };

   audio.createAudioRenderer(audioRendererOptions, (err, data) => {
     if (err) {
       console.error(`Invoke createAudioRenderer failed, code is ${err.code}, message is ${err.message}`);
       return;
     }
     console.info('Invoke createAudioRenderer succeeded.');
     let audioRenderer = data;
   });
   ```

2. Call **on('writeData')** to subscribe to the audio data write callback.

   ```ts
   import { fileIo as fs } from '@kit.CoreFileKit';

   class Options {
     offset?: number;
     length?: number;
   }

   let bufferSize: number = 0;
   let path = getContext().cacheDir;
   // Ensure that the resource exists in the path.
   let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
   let file: fs.File = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
   let writeDataCallback = (buffer: ArrayBuffer) => {
     let options: Options = {
       offset: bufferSize,
       length: buffer.byteLength
     };
     fs.readSync(file.fd, buffer, options);
     bufferSize += buffer.byteLength;
   };

   audioRenderer.on('writeData', writeDataCallback);
   ```

3. Call **start()** to switch the AudioRenderer to the **running** state and start rendering.

   ```ts
   import { BusinessError } from '@kit.BasicServicesKit';

   audioRenderer.start((err: BusinessError) => {
     if (err) {
       console.error(`Renderer start failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer start success.');
     }
   });
   ```

4. Call **stop()** to stop rendering.
   ```ts
   import { BusinessError } from '@kit.BasicServicesKit';

   audioRenderer.stop((err: BusinessError) => {
     if (err) {
       console.error(`Renderer stop failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer stopped.');
     }
   });
   ```

5. Call **release()** to release the instance.

   ```ts
   import { BusinessError } from '@kit.BasicServicesKit';

   audioRenderer.release((err: BusinessError) => {
     if (err) {
       console.error(`Renderer release failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer released.');
     }
   });
   ```

### Selecting the Correct Stream Usage

When developing a media player, it is important to set the stream usage type correctly for the intended use case. This ensures that the player behaves as expected in different scenarios.

The recommended use cases are described in [StreamUsage](../../reference/apis-audio-kit/js-apis-audio.md#streamusage). For example, **STREAM_USAGE_MUSIC** is recommended for music scenarios, **STREAM_USAGE_MOVIE** for movie or video scenarios, and **STREAM_USAGE_GAME** for gaming scenarios.

An incorrect **StreamUsage** configuration may cause unexpected behavior. Example scenarios are as follows:

- When **STREAM_USAGE_MUSIC** is incorrectly used in a game scenario, the game cannot be played simultaneously with music applications, even though games usually can coexist with music playback.
- When **STREAM_USAGE_MUSIC** is incorrectly used in a navigation scenario, any playing music is interrupted when the navigation application provides audio guidance, even though it is generally expected that the music keeps playing at a lower volume while the navigation is active.

### Sample Code

Refer to the sample code below to render an audio file using the AudioRenderer.
```ts
import { audio } from '@kit.AudioKit';
import { BusinessError } from '@kit.BasicServicesKit';
import { fileIo as fs } from '@kit.CoreFileKit';

const TAG = 'AudioRendererDemo';

class Options {
  offset?: number;
  length?: number;
}

let bufferSize: number = 0;
let renderModel: audio.AudioRenderer | undefined = undefined;
let audioStreamInfo: audio.AudioStreamInfo = {
  samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
  channels: audio.AudioChannel.CHANNEL_2, // Channel.
  sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
  encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
};
let audioRendererInfo: audio.AudioRendererInfo = {
  usage: audio.StreamUsage.STREAM_USAGE_MUSIC, // Audio stream usage type.
  rendererFlags: 0 // AudioRenderer flag.
};
let audioRendererOptions: audio.AudioRendererOptions = {
  streamInfo: audioStreamInfo,
  rendererInfo: audioRendererInfo
};
let path = getContext().cacheDir;
// Ensure that the resource exists in the path.
let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
let file: fs.File = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let writeDataCallback = (buffer: ArrayBuffer) => {
  let options: Options = {
    offset: bufferSize,
    length: buffer.byteLength
  };
  fs.readSync(file.fd, buffer, options);
  bufferSize += buffer.byteLength;
};

// Create an AudioRenderer instance, and set the events to listen for.
function init() {
  audio.createAudioRenderer(audioRendererOptions, (err, renderer) => { // Create an AudioRenderer instance.
    if (!err) {
      console.info(`${TAG}: creating AudioRenderer success`);
      renderModel = renderer;
      if (renderModel !== undefined) {
        (renderModel as audio.AudioRenderer).on('writeData', writeDataCallback);
      }
    } else {
      console.error(`${TAG}: creating AudioRenderer failed, error: ${err.message}`);
    }
  });
}

// Start audio rendering.
function start() {
  if (renderModel !== undefined) {
    let stateGroup = [audio.AudioState.STATE_PREPARED, audio.AudioState.STATE_PAUSED, audio.AudioState.STATE_STOPPED];
    if (stateGroup.indexOf((renderModel as audio.AudioRenderer).state.valueOf()) === -1) { // Rendering can be started only when the AudioRenderer is in the prepared, paused, or stopped state.
      console.error(`${TAG}: start failed`);
      return;
    }
    // Start rendering.
    (renderModel as audio.AudioRenderer).start((err: BusinessError) => {
      if (err) {
        console.error('Renderer start failed.');
      } else {
        console.info('Renderer start success.');
      }
    });
  }
}

// Pause the rendering.
function pause() {
  if (renderModel !== undefined) {
    // Rendering can be paused only when the AudioRenderer is in the running state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING) {
      console.info('Renderer is not running');
      return;
    }
    // Pause the rendering.
    (renderModel as audio.AudioRenderer).pause((err: BusinessError) => {
      if (err) {
        console.error('Renderer pause failed.');
      } else {
        console.info('Renderer pause success.');
      }
    });
  }
}

// Stop rendering.
async function stop() {
  if (renderModel !== undefined) {
    // Rendering can be stopped only when the AudioRenderer is in the running or paused state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING && (renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is not running or paused.');
      return;
    }
    // Stop rendering.
    (renderModel as audio.AudioRenderer).stop((err: BusinessError) => {
      if (err) {
        console.error('Renderer stop failed.');
      } else {
        fs.close(file);
        console.info('Renderer stop success.');
      }
    });
  }
}

// Release the instance.
async function release() {
  if (renderModel !== undefined) {
    // The AudioRenderer can be released only when it is not in the released state.
    if (renderModel.state.valueOf() === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer already released');
      return;
    }
    // Release the resources.
    (renderModel as audio.AudioRenderer).release((err: BusinessError) => {
      if (err) {
        console.error('Renderer release failed.');
      } else {
        console.info('Renderer release success.');
      }
    });
  }
}
```

When audio streams with the same or a higher priority need to use the output device, the current audio playback will be interrupted. The application can respond to and handle the interruption event. For details, see [Processing Audio Interruption Events](audio-playback-concurrency.md).
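The state checks performed by the sample's `start()`, `pause()`, `stop()`, and `release()` functions can be condensed into a small guard table. The following standalone TypeScript sketch (hypothetical: `allowedFrom` and `canInvoke` are illustration helpers, not part of the Audio Kit API) shows one way to validate an operation against the current state before calling the corresponding API, mirroring the state list above:

```typescript
// Renderer states from the state-transition figure above.
type RendererState = 'prepared' | 'running' | 'paused' | 'stopped' | 'released';

// For each operation, the states it may legally be called from,
// per the state list in this topic.
const allowedFrom = {
  start: ['prepared', 'paused', 'stopped'],
  pause: ['running'],
  stop: ['running', 'paused'],
  release: ['prepared', 'paused', 'stopped'],
} as const;

type RendererOp = keyof typeof allowedFrom;

// Returns true if calling `op` in `state` is a legal transition.
function canInvoke(op: RendererOp, state: RendererState): boolean {
  return (allowedFrom[op] as readonly RendererState[]).includes(state);
}
```

In a real application, the current state would come from `audioRenderer.state`, so a guard like this would translate the `audio.AudioState` value into the table key before checking, just as the sample's `stateGroup` check does for `start()`.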