# Using AudioRenderer for Audio Playback

The AudioRenderer is used to play Pulse Code Modulation (PCM) audio data. Unlike the AVPlayer, the AudioRenderer can perform data preprocessing before audio input. Therefore, the AudioRenderer is more suitable if you have extensive audio development experience and want to implement more flexible playback features.

## Development Guidelines

The full rendering process involves creating an **AudioRenderer** instance, configuring audio rendering parameters, starting and stopping rendering, and releasing the instance. In this topic, you will learn how to use the AudioRenderer to render audio data. Before the development, you are advised to read [AudioRenderer](../reference/apis/js-apis-audio.md#audiorenderer8) for the API reference.

The figure below shows the state changes of the AudioRenderer. After an **AudioRenderer** instance is created, different APIs can be called to switch the AudioRenderer to different states and trigger the required behavior. If an API is called when the AudioRenderer is not in the required state, the system may throw an exception or produce other undefined behavior. Therefore, you are advised to check the AudioRenderer state before triggering a state transition.

To prevent the UI thread from being blocked, most **AudioRenderer** calls are asynchronous. Each API provides both a callback and a promise form. The following examples use the callback form.

**Figure 1** AudioRenderer state transition

During application development, you are advised to use **on('stateChange')** to subscribe to state changes of the AudioRenderer, because some operations can be performed only when the AudioRenderer is in a given state. If the application performs an operation when the AudioRenderer is not in the required state, the system may throw an exception or produce other undefined behavior.

- **prepared**: The AudioRenderer enters this state after **createAudioRenderer()** is called.
- **running**: The AudioRenderer enters this state by calling **start()** when it is in the **prepared**, **paused**, or **stopped** state.

- **paused**: The AudioRenderer enters this state by calling **pause()** when it is in the **running** state. When the playback is paused, calling **start()** resumes it.

- **stopped**: The AudioRenderer enters this state by calling **stop()** when it is in the **paused** or **running** state.

- **released**: The AudioRenderer enters this state by calling **release()** when it is in the **prepared**, **paused**, or **stopped** state. In this state, the AudioRenderer releases all occupied hardware and software resources and will not transit to any other state.

### How to Develop

1. Set audio rendering parameters and create an **AudioRenderer** instance. For details about the parameters, see [AudioRendererOptions](../reference/apis/js-apis-audio.md#audiorendereroptions8).

   ```ts
   import audio from '@ohos.multimedia.audio';

   let audioRenderer; // Declare the instance outside the callback so that it can be used in later steps.
   let audioStreamInfo = {
     samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
     channels: audio.AudioChannel.CHANNEL_1,
     sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
     encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
   };

   let audioRendererInfo = {
     content: audio.ContentType.CONTENT_TYPE_SPEECH,
     usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
     rendererFlags: 0
   };

   let audioRendererOptions = {
     streamInfo: audioStreamInfo,
     rendererInfo: audioRendererInfo
   };

   audio.createAudioRenderer(audioRendererOptions, (err, data) => {
     if (err) {
       console.error(`Invoke createAudioRenderer failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Invoke createAudioRenderer succeeded.');
       audioRenderer = data;
     }
   });
   ```

2. Call **start()** to switch the AudioRenderer to the **running** state and start rendering.
   ```ts
   audioRenderer.start((err) => {
     if (err) {
       console.error(`Renderer start failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer start success.');
     }
   });
   ```

3. Specify the address of the file to render. Open the file and call **write()** to continuously write audio data to the buffer for rendering and playing. To implement personalized playback, process the audio data before writing it.

   ```ts
   import fs from '@ohos.file.fs';

   const bufferSize = await audioRenderer.getBufferSize();
   let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY); // filePath is the application sandbox path of the file to render.
   let buf = new ArrayBuffer(bufferSize);
   let readsize = await fs.read(file.fd, buf);
   let writeSize = await new Promise((resolve, reject) => {
     audioRenderer.write(buf, (err, writeSize) => {
       if (err) {
         reject(err);
       } else {
         resolve(writeSize);
       }
     });
   });
   ```

4. Call **stop()** to stop rendering.

   ```ts
   audioRenderer.stop((err) => {
     if (err) {
       console.error(`Renderer stop failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer stopped.');
     }
   });
   ```

5. Call **release()** to release the instance.

   ```ts
   audioRenderer.release((err) => {
     if (err) {
       console.error(`Renderer release failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer released.');
     }
   });
   ```

### Sample Code

Refer to the sample code below to render an audio file using the AudioRenderer.

```ts
import audio from '@ohos.multimedia.audio';
import fs from '@ohos.file.fs';

const TAG = 'AudioRendererDemo';

export default class AudioRendererDemo {
  private renderModel = undefined;
  private audioStreamInfo = {
    samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
    channels: audio.AudioChannel.CHANNEL_2, // Channel count.
    sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
    encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
  }
  private audioRendererInfo = {
    content: audio.ContentType.CONTENT_TYPE_MUSIC, // Media type.
    usage: audio.StreamUsage.STREAM_USAGE_MEDIA, // Audio stream usage type.
    rendererFlags: 0 // AudioRenderer flag.
  }
  private audioRendererOptions = {
    streamInfo: this.audioStreamInfo,
    rendererInfo: this.audioRendererInfo
  }

  // Create an AudioRenderer instance, and set the events to listen for.
  init() {
    audio.createAudioRenderer(this.audioRendererOptions, (err, renderer) => { // Create an AudioRenderer instance.
      if (!err) {
        console.info(`${TAG}: creating AudioRenderer success`);
        this.renderModel = renderer;
        this.renderModel.on('stateChange', (state) => { // Set the events to listen for. A callback is invoked when the AudioRenderer is switched to the specified state.
          if (state == 2) { // 2 corresponds to STATE_RUNNING.
            console.info('audio renderer state is: STATE_RUNNING');
          }
        });
        this.renderModel.on('markReach', 1000, (position) => { // Subscribe to the markReach event. A callback is triggered when the number of rendered frames reaches 1000.
          if (position == 1000) {
            console.info('ON Triggered successfully');
          }
        });
      } else {
        console.error(`${TAG}: creating AudioRenderer failed, error: ${err.message}`);
      }
    });
  }

  // Start audio rendering.
  async start() {
    let stateGroup = [audio.AudioState.STATE_PREPARED, audio.AudioState.STATE_PAUSED, audio.AudioState.STATE_STOPPED];
    if (stateGroup.indexOf(this.renderModel.state) === -1) { // Rendering can be started only when the AudioRenderer is in the prepared, paused, or stopped state.
      console.error(TAG + ': start failed');
      return;
    }
    await this.renderModel.start(); // Start rendering.
    const bufferSize = await this.renderModel.getBufferSize();
    let context = getContext(this);
    let path = context.filesDir;
    const filePath = path + '/test.wav'; // Use the sandbox path to obtain the file. The actual file path is /data/storage/el2/base/haps/entry/files/test.wav.

    let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
    let stat = await fs.stat(filePath);
    let buf = new ArrayBuffer(bufferSize);
    let len = stat.size % bufferSize === 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize) + 1;
    for (let i = 0; i < len; i++) {
      let options = {
        offset: i * bufferSize,
        length: bufferSize
      };
      let readsize = await fs.read(file.fd, buf, options);

      // buf indicates the audio data to be written to the buffer. Before calling AudioRenderer.write(), you can preprocess the audio data for personalized playback. The AudioRenderer reads the audio data written to the buffer for rendering.

      let writeSize = await new Promise((resolve, reject) => {
        this.renderModel.write(buf, (err, writeSize) => {
          if (err) {
            reject(err);
          } else {
            resolve(writeSize);
          }
        });
      });
      if (this.renderModel.state === audio.AudioState.STATE_RELEASED) { // The rendering stops if the AudioRenderer is in the released state.
        fs.close(file);
        await this.renderModel.stop();
      }
      if (this.renderModel.state === audio.AudioState.STATE_RUNNING) {
        if (i === len - 1) { // The rendering stops when the file finishes reading.
          fs.close(file);
          await this.renderModel.stop();
        }
      }
    }
  }

  // Pause the rendering.
  async pause() {
    // Rendering can be paused only when the AudioRenderer is in the running state.
    if (this.renderModel.state !== audio.AudioState.STATE_RUNNING) {
      console.info('Renderer is not running');
      return;
    }
    await this.renderModel.pause(); // Pause rendering.
    if (this.renderModel.state === audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is paused.');
    } else {
      console.error('Pausing renderer failed.');
    }
  }

  // Stop rendering.
  async stop() {
    // Rendering can be stopped only when the AudioRenderer is in the running or paused state.
    if (this.renderModel.state !== audio.AudioState.STATE_RUNNING && this.renderModel.state !== audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is not running or paused.');
      return;
    }
    await this.renderModel.stop(); // Stop rendering.
    if (this.renderModel.state === audio.AudioState.STATE_STOPPED) {
      console.info('Renderer stopped.');
    } else {
      console.error('Stopping renderer failed.');
    }
  }

  // Release the instance.
  async release() {
    // The AudioRenderer can be released only when it is not in the released state.
    if (this.renderModel.state === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer already released');
      return;
    }
    await this.renderModel.release(); // Release the instance.
    if (this.renderModel.state === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer released');
    } else {
      console.error('Renderer release failed.');
    }
  }
}
```

When audio streams with the same or a higher priority need to use the output device, the current audio playback is interrupted. The application can respond to and handle the interruption event. For details about how to handle concurrent audio playback, see [Audio Playback Concurrency Policies](audio-playback-concurrency.md).
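The `len` computation in the sample code above determines how many buffer-sized **write()** calls are needed to consume the whole file. As a standalone sketch (the function name `chunkCount` is illustrative and not part of the AudioRenderer API), the same arithmetic can be expressed with `Math.ceil`:

```typescript
// Number of bufferSize-byte reads needed to consume fileSize bytes.
// Equivalent to the ternary in the sample:
//   fileSize % bufferSize === 0 ? fileSize / bufferSize
//                               : Math.floor(fileSize / bufferSize) + 1
function chunkCount(fileSize: number, bufferSize: number): number {
  return Math.ceil(fileSize / bufferSize);
}

// For example, a 100000-byte file read through a 3840-byte renderer buffer
// needs 27 write() calls; the last buffer is only partially filled.
console.log(chunkCount(100000, 3840)); // 27
```

Because the final read may return fewer than `bufferSize` bytes, production code should pass the actual `readsize` (rather than the full buffer length) when deciding how much data to hand to **write()**.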