# Audio Rendering Development

## Introduction

**AudioRenderer** provides APIs for rendering audio files and controlling playback. It also supports audio interruption. You can use the APIs provided by **AudioRenderer** to play audio files on output devices and manage playback tasks.
Before calling the APIs, be familiar with the following terms:

- **Audio interruption**: When an audio stream with a higher priority needs to be played, the audio renderer interrupts the stream with a lower priority. For example, if a call comes in while the user is listening to music, the music playback, which uses the lower priority stream, is paused.
- **Status check**: During application development, you are advised to use **on('stateChange')** to subscribe to state changes of the **AudioRenderer** instance, because some operations can be performed only when the audio renderer is in a given state. If the application performs an operation when the audio renderer is not in the required state, the system may throw an exception or produce other undefined behavior.
- **Asynchronous operation**: To prevent the UI thread from being blocked, most **AudioRenderer** calls are asynchronous. Each API is available in both callback and promise forms. The following examples use the promise forms. For more information, see [AudioRenderer in Audio Management](../reference/apis/js-apis-audio.md#audiorenderer8).
- **Audio interruption mode**: OpenHarmony provides two audio interruption modes: **shared mode** and **independent mode**. In shared mode, all **AudioRenderer** instances created by the same application share one focus object, and there is no focus transfer inside the application; therefore, no callback is triggered. In independent mode, each **AudioRenderer** instance has an independent focus object, and focus transfer is triggered by focus preemption. When focus transfer occurs, the **AudioRenderer** instance that holds the focus receives a notification through the callback. The shared mode is used by default. You can call **setInterruptMode()** to switch to the independent mode.

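The difference between the two interruption modes can be modeled with a small sketch. This is plain TypeScript for illustration only, not the OpenHarmony API: the `FocusModel` class and its method names are hypothetical, and only the enum values mirror **audio.InterruptMode**.

```typescript
// Toy model of focus-object allocation (illustrative only, not the real API).
// SHARE_MODE: all renderers of one application reuse a single focus object,
// so no focus transfer (and no callback) happens inside the application.
// INDEPENDENT_MODE: each renderer gets its own focus object, so starting one
// renderer can preempt another renderer of the same application.
enum InterruptMode { SHARE_MODE = 0, INDEPENDENT_MODE = 1 }

class FocusModel {
  private nextId = 0;
  private sharedId: number | null = null;

  // Returns the focus-object id assigned to a newly created renderer.
  focusIdFor(mode: InterruptMode): number {
    if (mode === InterruptMode.SHARE_MODE) {
      if (this.sharedId === null) this.sharedId = this.nextId++;
      return this.sharedId; // every shared-mode renderer reuses this id
    }
    return this.nextId++;   // independent-mode renderer gets its own id
  }
}
```

Two shared-mode renderers map to the same focus object (no transfer between them), while two independent-mode renderers map to distinct objects and can preempt each other.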
## Working Principles

The following figure shows the audio renderer state transitions.

**Figure 1** Audio renderer state transitions

![audio-renderer-state](figures/audio-renderer-state.png)

- **PREPARED**: The audio renderer enters this state after **create()** is called.
- **RUNNING**: The audio renderer enters this state by calling **start()** when it is in the **PREPARED** or **STOPPED** state.
- **PAUSED**: The audio renderer enters this state by calling **pause()** when it is in the **RUNNING** state. When playback is paused, calling **start()** resumes it.
- **STOPPED**: The audio renderer enters this state by calling **stop()** when it is in the **PAUSED** or **RUNNING** state.
- **RELEASED**: The audio renderer enters this state by calling **release()** when it is in the **PREPARED**, **PAUSED**, or **STOPPED** state. In this state, the audio renderer releases all occupied hardware and software resources and will not transition to any other state.

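The transition rules above can be condensed into a lookup table. The following is a minimal sketch in plain TypeScript, assuming only the transitions listed above (it is not the real **AudioRenderer** API, and state names are shortened for readability):

```typescript
// Minimal sketch of the state-transition table described above.
type State = 'PREPARED' | 'RUNNING' | 'PAUSED' | 'STOPPED' | 'RELEASED';
type Op = 'start' | 'pause' | 'stop' | 'release';

const transitions: Record<Op, Partial<Record<State, State>>> = {
  start:   { PREPARED: 'RUNNING', PAUSED: 'RUNNING', STOPPED: 'RUNNING' },
  pause:   { RUNNING: 'PAUSED' },
  stop:    { RUNNING: 'STOPPED', PAUSED: 'STOPPED' },
  release: { PREPARED: 'RELEASED', PAUSED: 'RELEASED', STOPPED: 'RELEASED' },
};

// Returns the next state, or null if the operation is not listed for this state.
function nextState(state: State, op: Op): State | null {
  return transitions[op][state] ?? null;
}
```

Checking the table before each operation mirrors the status checks performed by the sample functions in the development steps below.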
## How to Develop

For details about the APIs, see [AudioRenderer in Audio Management](../reference/apis/js-apis-audio.md#audiorenderer8).

1. Use **createAudioRenderer()** to create a global **AudioRenderer** instance.

   Set the parameters of the **AudioRenderer** instance in **audioRendererOptions**. This instance is used to render audio, control and obtain the rendering status, and register a callback for notification.

   ```js
   import audio from '@ohos.multimedia.audio';
   import fs from '@ohos.file.fs';

   // Perform a self-test on APIs related to audio rendering.
   @Entry
   @Component
   struct AudioRenderer1129 {
     private audioRenderer: audio.AudioRenderer;
     private bufferSize: number; // Used by the write() calls in step 3.
     private audioRenderer1: audio.AudioRenderer; // Used by the complete example in step 14.
     private audioRenderer2: audio.AudioRenderer; // Used by the complete example in step 14.

     async initAudioRender(){
       let audioStreamInfo = {
         samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
         channels: audio.AudioChannel.CHANNEL_1,
         sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
         encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
       }
       let audioRendererInfo = {
         content: audio.ContentType.CONTENT_TYPE_SPEECH,
         usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
         rendererFlags: 0 // 0 is the extended flag bit of the audio renderer. The default value is 0.
       }
       let audioRendererOptions = {
         streamInfo: audioStreamInfo,
         rendererInfo: audioRendererInfo
       }
       this.audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
       console.log("Create audio renderer success.");
     }
   }
   ```

2. Use **start()** to start audio rendering.

   ```js
   async startRenderer() {
     let state = this.audioRenderer.state;
     // The audio renderer must be in the STATE_PREPARED, STATE_PAUSED, or STATE_STOPPED state when start() is called.
     if (state != audio.AudioState.STATE_PREPARED && state != audio.AudioState.STATE_PAUSED &&
       state != audio.AudioState.STATE_STOPPED) {
       console.info('Renderer is not in a correct state to start');
       return;
     }

     await this.audioRenderer.start();

     state = this.audioRenderer.state;
     if (state == audio.AudioState.STATE_RUNNING) {
       console.info('Renderer started');
     } else {
       console.error('Renderer start failed');
     }
   }
   ```

   The renderer state will be **STATE_RUNNING** once the audio renderer is started. The application can then begin writing buffers.

3. Call **write()** to write data to the buffer.

   Read the audio data to be played into the buffer, and call **write()** repeatedly to write the data to the buffer. Import **fs** from **'@ohos.file.fs'** as shown in step 1.

   ```js
   async writeData(){
     // Set a proper buffer size for the audio renderer. You can also select a buffer of another size.
     this.bufferSize = await this.audioRenderer.getBufferSize();
     let dir = globalThis.fileDir; // You must use the sandbox path.
     const filePath = dir + '/file_example_WAV_2MG.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/file_example_WAV_2MG.wav
     console.info(`file filePath: ${filePath}`);

     let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
     let stat = await fs.stat(filePath); // Music file information.
     let buf = new ArrayBuffer(this.bufferSize);
     let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
     for (let i = 0; i < len; i++) {
       let options = {
         offset: i * this.bufferSize,
         length: this.bufferSize
       }
       let readsize = await fs.read(file.fd, buf, options)
       let writeSize = await new Promise((resolve, reject) => {
         this.audioRenderer.write(buf, (err, writeSize) => {
           if (err) {
             reject(err)
           } else {
             resolve(writeSize)
           }
         })
       })
     }

     fs.close(file)
     await this.audioRenderer.stop(); // Stop rendering.
     await this.audioRenderer.release(); // Release the resources.
   }
   ```

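   The chunking arithmetic in the loop above is worth isolating: the ternary expression computing `len` is equivalent to `Math.ceil(fileSize / bufferSize)`, and the last chunk may be shorter than the buffer size. The following sketch (plain TypeScript, with hypothetical helper names) shows the equivalent computation:

   ```typescript
   // Number of write() iterations needed for a file: ceil(fileSize / bufferSize).
   function chunkCount(fileSize: number, bufferSize: number): number {
     return Math.ceil(fileSize / bufferSize);
   }

   // Offset/length pairs corresponding to each fs.read() call in the loop.
   // The final chunk is clamped so offset + length never exceeds the file size.
   function chunkOptions(fileSize: number, bufferSize: number): { offset: number; length: number }[] {
     const out: { offset: number; length: number }[] = [];
     for (let i = 0; i < chunkCount(fileSize, bufferSize); i++) {
       out.push({
         offset: i * bufferSize,
         length: Math.min(bufferSize, fileSize - i * bufferSize),
       });
     }
     return out;
   }
   ```

   For example, a 10-byte file with a 4-byte buffer needs three chunks of lengths 4, 4, and 2. The sample above passes the full buffer length on every read and relies on the returned read size, which gives the same result.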
4. (Optional) Call **pause()** or **stop()** to pause or stop rendering.

   ```js
   async pauseRenderer() {
     let state = this.audioRenderer.state;
     // The audio renderer can be paused only when it is in the STATE_RUNNING state.
     if (state != audio.AudioState.STATE_RUNNING) {
       console.info('Renderer is not running');
       return;
     }

     await this.audioRenderer.pause();

     state = this.audioRenderer.state;
     if (state == audio.AudioState.STATE_PAUSED) {
       console.info('Renderer paused');
     } else {
       console.error('Renderer pause failed');
     }
   }

   async stopRenderer() {
     let state = this.audioRenderer.state;
     // The audio renderer can be stopped only when it is in the STATE_RUNNING or STATE_PAUSED state.
     if (state != audio.AudioState.STATE_RUNNING && state != audio.AudioState.STATE_PAUSED) {
       console.info('Renderer is not running or paused');
       return;
     }

     await this.audioRenderer.stop();

     state = this.audioRenderer.state;
     if (state == audio.AudioState.STATE_STOPPED) {
       console.info('Renderer stopped');
     } else {
       console.error('Renderer stop failed');
     }
   }
   ```

5. (Optional) Call **drain()** to clear the buffer.

   ```js
   async drainRenderer() {
     let state = this.audioRenderer.state;
     // drain() can be used only when the audio renderer is in the STATE_RUNNING state.
     if (state != audio.AudioState.STATE_RUNNING) {
       console.info('Renderer is not running');
       return;
     }

     await this.audioRenderer.drain();
     state = this.audioRenderer.state;
   }
   ```

6. After the task is complete, call **release()** to release related resources.

   **AudioRenderer** uses a large number of system resources. Therefore, ensure that the resources are released after the task is complete.

   ```js
   async releaseRenderer() {
     let state = this.audioRenderer.state;
     // The audio renderer can be released only when it is not in the STATE_RELEASED or STATE_NEW state.
     if (state == audio.AudioState.STATE_RELEASED || state == audio.AudioState.STATE_NEW) {
       console.info('Renderer already released');
       return;
     }
     await this.audioRenderer.release();

     state = this.audioRenderer.state;
     if (state == audio.AudioState.STATE_RELEASED) {
       console.info('Renderer released');
     } else {
       console.error('Renderer release failed');
     }
   }
   ```

7. (Optional) Obtain the audio renderer information.

   You can use the following code to obtain the audio renderer information:

   ```js
   async getRenderInfo(){
     // Obtain the audio renderer state.
     let state = this.audioRenderer.state;
     // Obtain the audio renderer information.
     let audioRendererInfo : audio.AudioRendererInfo = await this.audioRenderer.getRendererInfo();
     // Obtain the audio stream information.
     let audioStreamInfo : audio.AudioStreamInfo = await this.audioRenderer.getStreamInfo();
     // Obtain the audio stream ID.
     let audioStreamId : number = await this.audioRenderer.getAudioStreamId();
     // Obtain the Unix timestamp, in nanoseconds.
     let audioTime : number = await this.audioRenderer.getAudioTime();
     // Obtain a proper minimum buffer size.
     let bufferSize : number = await this.audioRenderer.getBufferSize();
     // Obtain the audio renderer rate.
     let renderRate : audio.AudioRendererRate = await this.audioRenderer.getRenderRate();
   }
   ```

8. (Optional) Set the audio renderer information.

   You can use the following code to set the audio renderer information:

   ```js
   async setAudioRenderInfo(){
     // Set the audio renderer rate to RENDER_RATE_NORMAL.
     let renderRate : audio.AudioRendererRate = audio.AudioRendererRate.RENDER_RATE_NORMAL;
     await this.audioRenderer.setRenderRate(renderRate);
     // Set the interruption mode of the audio renderer to SHARE_MODE.
     let interruptMode : audio.InterruptMode = audio.InterruptMode.SHARE_MODE;
     await this.audioRenderer.setInterruptMode(interruptMode);
     // Set the volume of the stream to 0.5.
     let volume : number = 0.5;
     await this.audioRenderer.setVolume(volume);
   }
   ```

9. (Optional) Use **on('audioInterrupt')** to subscribe to the audio interruption event, and use **off('audioInterrupt')** to unsubscribe from the event.

   Audio interruption means that stream A will be interrupted when stream B with a higher or equal priority requests to become active and use the output device.

   In some cases, the audio renderer performs forcible operations such as pausing and ducking, and notifies the application through **InterruptEvent**. In other cases, the application can choose to act on the **InterruptEvent** or ignore it.

   In the case of audio interruption, the application may encounter write failures. To avoid such failures, interruption-unaware applications can check **audioRenderer.state** before writing audio data. The applications can obtain more details by subscribing to the audio interruption events. For details, see [InterruptEvent](../reference/apis/js-apis-audio.md#interruptevent9).

   Note that the audio interruption event subscription of the **AudioRenderer** module is slightly different from **on('interrupt')** in [AudioManager](../reference/apis/js-apis-audio.md#audiomanager). The **on('interrupt')** and **off('interrupt')** APIs are deprecated since API version 9. In the **AudioRenderer** module, you only need to call **on('audioInterrupt')** to listen for focus change events. When the **AudioRenderer** instance created by the application performs actions such as start, stop, and pause, it requests the focus, which triggers focus transfer and in turn enables the related **AudioRenderer** instance to receive a notification through the callback. For scenarios that do not involve an **AudioRenderer** instance, such as frequency modulation (FM) and voice wakeup, the application can call **on('interrupt')** in **AudioManager** to receive a focus change notification.

   ```js
   async subscribeAudioRender(){
     this.audioRenderer.on('audioInterrupt', (interruptEvent) => {
       console.info('InterruptEvent Received');
       console.info(`InterruptType: ${interruptEvent.eventType}`);
       console.info(`InterruptForceType: ${interruptEvent.forceType}`);
       console.info(`InterruptHint: ${interruptEvent.hintType}`);

       if (interruptEvent.forceType == audio.InterruptForceType.INTERRUPT_FORCE) {
         switch (interruptEvent.hintType) {
           // Forcible pausing initiated by the audio framework. To prevent data loss, stop the write operation.
           case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
             console.info('isPlay is false');
             break;
           // Forcible stopping initiated by the audio framework. To prevent data loss, stop the write operation.
           case audio.InterruptHint.INTERRUPT_HINT_STOP:
             console.info('isPlay is false');
             break;
           // Forcible ducking initiated by the audio framework.
           case audio.InterruptHint.INTERRUPT_HINT_DUCK:
             break;
           // Unducking initiated by the audio framework.
           case audio.InterruptHint.INTERRUPT_HINT_UNDUCK:
             break;
         }
       } else if (interruptEvent.forceType == audio.InterruptForceType.INTERRUPT_SHARE) {
         switch (interruptEvent.hintType) {
           // Notify the application that the rendering can resume.
           case audio.InterruptHint.INTERRUPT_HINT_RESUME:
             this.startRenderer();
             break;
           // Notify the application that the audio stream is interrupted. The application then determines whether to continue. (In this example, the application pauses the rendering.)
           case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
             console.info('isPlay is false');
             this.pauseRenderer();
             break;
         }
       }
     });
   }
   ```

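   The branching in the callback above amounts to a small decision table mapping the force type and hint to an application action. The following sketch uses plain strings in place of the **audio.InterruptForceType** and **audio.InterruptHint** enums, and the action names are hypothetical labels for illustration:

   ```typescript
   // Decision table for the interrupt callback (illustrative sketch; strings
   // stand in for the audio.InterruptForceType / audio.InterruptHint enums).
   type Force = 'FORCE' | 'SHARE';
   type Hint = 'PAUSE' | 'STOP' | 'DUCK' | 'UNDUCK' | 'RESUME';

   function actionFor(force: Force, hint: Hint): string {
     if (force === 'FORCE') {
       // The framework has already acted; the application only updates its own
       // state and stops writing data to avoid write failures.
       if (hint === 'PAUSE' || hint === 'STOP') return 'stop-writing';
       return 'none'; // DUCK/UNDUCK: volume already adjusted by the framework
     }
     // SHARE: the application decides whether to follow the hint itself.
     if (hint === 'RESUME') return 'call-start';
     if (hint === 'PAUSE') return 'call-pause';
     return 'none';
   }
   ```

   Keeping this mapping in one place makes it easy to verify that every hint the framework can deliver has a deliberate response.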
10. (Optional) Use **on('markReach')** to subscribe to the mark reached event, and use **off('markReach')** to unsubscribe from the event.

    After the mark reached event is subscribed to, when the number of frames rendered by the audio renderer reaches the specified value, a callback is triggered and the specified value is returned.

    ```js
    async markReach(){
      this.audioRenderer.on('markReach', 50, (position) => {
        if (position == 50) {
          console.info('ON Triggered successfully');
        }
      });
      this.audioRenderer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
    }
    ```

11. (Optional) Use **on('periodReach')** to subscribe to the period reached event, and use **off('periodReach')** to unsubscribe from the event.

    After the period reached event is subscribed to, each time the number of frames rendered by the audio renderer reaches the specified value, a callback is triggered and the specified value is returned.

    ```js
    async periodReach(){
      this.audioRenderer.on('periodReach', 10, (reachNumber) => {
        console.info(`In this period, the renderer reached frame: ${reachNumber}`);
      });

      this.audioRenderer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
    }
    ```

12. (Optional) Use **on('stateChange')** to subscribe to audio renderer state changes.

    After the **stateChange** event is subscribed to, when the audio renderer state changes, a callback is triggered and the audio renderer state is returned.

    ```js
    async stateChange(){
      this.audioRenderer.on('stateChange', (audioState) => {
        console.info('State change event Received');
        console.info(`Current renderer state is: ${audioState}`);
      });
    }
    ```

13. (Optional) Handle exceptions of **on()**.

    If the string or the parameter type passed in **on()** is incorrect, **on()** throws an exception. In this case, you can use **try catch** to capture the exception.

    ```js
    async errorCall(){
      try {
        this.audioRenderer.on('invalidInput', () => { // The string is invalid.
        })
      } catch (err) {
        console.info(`Call on function error, ${err}`); // The application throws exception 401.
      }
      try {
        this.audioRenderer.on(1, () => { // The type of the input parameter is incorrect.
        })
      } catch (err) {
        console.info(`Call on function error, ${err}`); // The application throws exception 6800101.
      }
    }
    ```

14. (Optional) Refer to the complete example of **on('audioInterrupt')**.

    Declare **audioRenderer1** and **audioRenderer2** first. For details, see step 1.

    Create **audioRenderer1** and **audioRenderer2** in an application, configure the independent interruption mode, and call **on('audioInterrupt')** to subscribe to audio interruption events. At the beginning, **audioRenderer1** has the focus. When **audioRenderer2** attempts to obtain the focus, **audioRenderer1** receives a focus transfer notification and the related log information is printed. If the shared mode is used, the log information will not be printed during application running.

    ```js
    async runningAudioRender1(){
      let audioStreamInfo = {
        samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000,
        channels: audio.AudioChannel.CHANNEL_1,
        sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S32LE,
        encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
      }
      let audioRendererInfo = {
        content: audio.ContentType.CONTENT_TYPE_MUSIC,
        usage: audio.StreamUsage.STREAM_USAGE_MEDIA,
        rendererFlags: 0 // 0 is the extended flag bit of the audio renderer. The default value is 0.
      }
      let audioRendererOptions = {
        streamInfo: audioStreamInfo,
        rendererInfo: audioRendererInfo
      }

      // 1.1 Create an instance.
      this.audioRenderer1 = await audio.createAudioRenderer(audioRendererOptions);
      console.info("Create audio renderer 1 success.");

      // 1.2 Set the independent mode.
      this.audioRenderer1.setInterruptMode(1).then( data => {
        console.info('audioRenderer1 setInterruptMode Success!');
      }).catch((err) => {
        console.error(`audioRenderer1 setInterruptMode Fail: ${err}`);
      });

      // 1.3 Set the listener.
      this.audioRenderer1.on('audioInterrupt', async(interruptEvent) => {
        console.info(`audioRenderer1 on audioInterrupt : ${JSON.stringify(interruptEvent)}`)
      });

      // 1.4 Start rendering.
      await this.audioRenderer1.start();
      console.info('startAudioRender1 success');

      // 1.5 Obtain the buffer size, which is the proper minimum buffer size of the audio renderer. You can also select a buffer of another size.
      const bufferSize = await this.audioRenderer1.getBufferSize();
      console.info(`audio bufferSize: ${bufferSize}`);

      // 1.6 Obtain the original audio data file.
      let dir = globalThis.fileDir; // You must use the sandbox path.
      const path1 = dir + '/music001_48000_32_1.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/music001_48000_32_1.wav
      console.info(`audioRender1 file path: ${path1}`);
      let file1 = fs.openSync(path1, fs.OpenMode.READ_ONLY);
      let stat = await fs.stat(path1); // Music file information.
      let buf = new ArrayBuffer(bufferSize);
      let len = stat.size % bufferSize == 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);

      // 1.7 Render the original audio data in the buffer by using audioRenderer1.
      for (let i = 0; i < len; i++) {
        let options = {
          offset: i * bufferSize,
          length: bufferSize
        }
        let readsize = await fs.read(file1.fd, buf, options)
        let writeSize = await new Promise((resolve, reject) => {
          this.audioRenderer1.write(buf, (err, writeSize) => {
            if (err) {
              reject(err)
            } else {
              resolve(writeSize)
            }
          })
        })
      }
      fs.close(file1)
      await this.audioRenderer1.stop(); // Stop rendering.
      await this.audioRenderer1.release(); // Release the resources.
    }

    async runningAudioRender2(){
      let audioStreamInfo = {
        samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000,
        channels: audio.AudioChannel.CHANNEL_1,
        sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S32LE,
        encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
      }
      let audioRendererInfo = {
        content: audio.ContentType.CONTENT_TYPE_MUSIC,
        usage: audio.StreamUsage.STREAM_USAGE_MEDIA,
        rendererFlags: 0 // 0 is the extended flag bit of the audio renderer. The default value is 0.
      }
      let audioRendererOptions = {
        streamInfo: audioStreamInfo,
        rendererInfo: audioRendererInfo
      }

      // 2.1 Create another instance.
      this.audioRenderer2 = await audio.createAudioRenderer(audioRendererOptions);
      console.info("Create audio renderer 2 success.");

      // 2.2 Set the independent mode.
      this.audioRenderer2.setInterruptMode(1).then( data => {
        console.info('audioRenderer2 setInterruptMode Success!');
      }).catch((err) => {
        console.error(`audioRenderer2 setInterruptMode Fail: ${err}`);
      });

      // 2.3 Set the listener.
      this.audioRenderer2.on('audioInterrupt', async(interruptEvent) => {
        console.info(`audioRenderer2 on audioInterrupt : ${JSON.stringify(interruptEvent)}`)
      });

      // 2.4 Start rendering.
      await this.audioRenderer2.start();
      console.info('startAudioRender2 success');

      // 2.5 Obtain the buffer size.
      const bufferSize = await this.audioRenderer2.getBufferSize();
      console.info(`audio bufferSize: ${bufferSize}`);

      // 2.6 Read the original audio data file.
      let dir = globalThis.fileDir; // You must use the sandbox path.
      const path2 = dir + '/music002_48000_32_1.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/music002_48000_32_1.wav
      console.info(`audioRender2 file path: ${path2}`);
      let file2 = fs.openSync(path2, fs.OpenMode.READ_ONLY);
      let stat = await fs.stat(path2); // Music file information.
      let buf = new ArrayBuffer(bufferSize);
      let len = stat.size % bufferSize == 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);

      // 2.7 Render the original audio data in the buffer by using audioRenderer2.
      for (let i = 0; i < len; i++) {
        let options = {
          offset: i * bufferSize,
          length: bufferSize
        }
        let readsize = await fs.read(file2.fd, buf, options)
        let writeSize = await new Promise((resolve, reject) => {
          this.audioRenderer2.write(buf, (err, writeSize) => {
            if (err) {
              reject(err)
            } else {
              resolve(writeSize)
            }
          })
        })
      }
      fs.close(file2)
      await this.audioRenderer2.stop(); // Stop rendering.
      await this.audioRenderer2.release(); // Release the resources.
    }

    // Integrated invoking entry.
    async test(){
      await this.runningAudioRender1();
      await this.runningAudioRender2();
    }
    ```