# Using AudioRenderer for Audio Playback

The AudioRenderer is used to play Pulse Code Modulation (PCM) audio data. Unlike the AVPlayer, the AudioRenderer lets the application preprocess the audio data before it is written for playback. Therefore, the AudioRenderer is more suitable if you have extensive audio development experience and want to implement more flexible playback features.

## Development Guidelines

The full rendering process involves creating an **AudioRenderer** instance, configuring audio rendering parameters, starting and stopping rendering, and releasing the instance. In this topic, you will learn how to use the AudioRenderer to render audio data. Before the development, you are advised to read [AudioRenderer](../reference/apis/js-apis-audio.md#audiorenderer8) for the API reference.

The figure below shows the state changes of the AudioRenderer. After an **AudioRenderer** instance is created, different APIs can be called to switch the AudioRenderer to different states and trigger the required behavior. If an API is called when the AudioRenderer is not in the given state, the system may throw an exception or generate other undefined behavior. Therefore, you are advised to check the AudioRenderer state before triggering a state transition.

To prevent the UI thread from being blocked, most **AudioRenderer** calls are asynchronous. Each API provides both a callback form and a promise form. The following examples use the callback form.

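
If you prefer the promise form, the same APIs can be awaited. The snippet below is a minimal sketch (not part of the steps that follow) of the promise-based **createAudioRenderer()** and **start()**, assuming an **audioRendererOptions** object configured as in step 1 of the development procedure.

```ts
import audio from '@ohos.multimedia.audio';

// Minimal promise-based sketch. audioRendererOptions is assumed to be configured
// the same way as in step 1 of the development procedure below.
async function createAndStart(audioRendererOptions: audio.AudioRendererOptions): Promise<audio.AudioRenderer> {
  let renderer: audio.AudioRenderer = await audio.createAudioRenderer(audioRendererOptions);
  await renderer.start(); // Switch the renderer to the running state.
  return renderer;
}
```
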
**Figure 1** AudioRenderer state transition

![AudioRenderer state transition](figures/audiorenderer-status-change.png)

During application development, you are advised to use **on('stateChange')** to subscribe to state changes of the AudioRenderer, because some operations can be performed only when the AudioRenderer is in a given state. If the application performs an operation when the AudioRenderer is not in the given state, the system may throw an exception or generate other undefined behavior. (A subscription sketch follows the state list below.)

- **prepared**: The AudioRenderer enters this state by calling **createAudioRenderer()**.

- **running**: The AudioRenderer enters this state by calling **start()** when it is in the **prepared**, **paused**, or **stopped** state.

- **paused**: The AudioRenderer enters this state by calling **pause()** when it is in the **running** state. When audio playback is paused, the application can call **start()** to resume it.

- **stopped**: The AudioRenderer enters this state by calling **stop()** when it is in the **paused** or **running** state.

- **released**: The AudioRenderer enters this state by calling **release()** when it is in the **prepared**, **paused**, or **stopped** state. In this state, the AudioRenderer releases all occupied hardware and software resources and will not transit to any other state.

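
As a reference, the snippet below is a minimal sketch of such a state subscription, assuming an existing **audioRenderer** instance.

```ts
// Minimal sketch: log every state transition of an existing audioRenderer instance.
audioRenderer.on('stateChange', (state: audio.AudioState) => {
  console.info(`AudioRenderer state changed to: ${state}`);
});
```
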
### How to Develop

1. Set audio rendering parameters and create an **AudioRenderer** instance. For details about the parameters, see [AudioRendererOptions](../reference/apis/js-apis-audio.md#audiorendereroptions8).

   ```ts
   import audio from '@ohos.multimedia.audio';

   let audioStreamInfo: audio.AudioStreamInfo = {
     samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
     channels: audio.AudioChannel.CHANNEL_1,
     sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
     encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
   };

   let audioRendererInfo: audio.AudioRendererInfo = {
     content: audio.ContentType.CONTENT_TYPE_SPEECH,
     usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
     rendererFlags: 0
   };

   let audioRendererOptions: audio.AudioRendererOptions = {
     streamInfo: audioStreamInfo,
     rendererInfo: audioRendererInfo
   };

   let audioRenderer: audio.AudioRenderer; // Declared outside the callback so that the subsequent steps can use it.

   audio.createAudioRenderer(audioRendererOptions, (err, data) => {
     if (err) {
       console.error(`Invoke createAudioRenderer failed, code is ${err.code}, message is ${err.message}`);
       return;
     }
     console.info('Invoke createAudioRenderer succeeded.');
     audioRenderer = data;
   });
   ```

2. Call **start()** to switch the AudioRenderer to the **running** state and start rendering.

   ```ts
   import { BusinessError } from '@ohos.base';

   audioRenderer.start((err: BusinessError) => {
     if (err) {
       console.error(`Renderer start failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer start success.');
     }
   });
   ```

3. Specify the path of the file to render. Open the file and call **write()** to continuously write audio data to the buffer for rendering and playback. To implement personalized playback, process the audio data before writing it (see the preprocessing sketch after this step).

   ```ts
   import fs from '@ohos.file.fs';

   let context = getContext(this);
   async function read() {
     const bufferSize: number = await audioRenderer.getBufferSize();
     let path = context.filesDir;

     const filePath = path + '/voice_call_data.wav';
     let file: fs.File = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
     let buf = new ArrayBuffer(bufferSize);
     let readSize: number = await fs.read(file.fd, buf);
     let writeSize: number = await audioRenderer.write(buf);
   }
   ```

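
   As an illustration of such preprocessing, the sketch below scales 16-bit PCM samples to half volume before they are written. This is only a hypothetical example under the **SAMPLE_FORMAT_S16LE** assumption; it is not part of the AudioRenderer API.

   ```ts
   // Hypothetical preprocessing example: scale 16-bit PCM samples to 50% volume.
   // Int16Array uses the platform byte order, which matches S16LE on typical devices.
   function halveVolume(buf: ArrayBuffer): ArrayBuffer {
     let samples = new Int16Array(buf);
     for (let i = 0; i < samples.length; i++) {
       samples[i] = Math.round(samples[i] / 2);
     }
     return buf;
   }
   ```

   You would then pass the processed buffer to **write()**, for example `audioRenderer.write(halveVolume(buf))`.
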
4. Call **stop()** to stop rendering.

   ```ts
   audioRenderer.stop((err: BusinessError) => {
     if (err) {
       console.error(`Renderer stop failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer stopped.');
     }
   });
   ```

5. Call **release()** to release the instance.

   ```ts
   audioRenderer.release((err: BusinessError) => {
     if (err) {
       console.error(`Renderer release failed, code is ${err.code}, message is ${err.message}`);
     } else {
       console.info('Renderer released.');
     }
   });
   ```

### Sample Code

Refer to the sample code below to render an audio file using the AudioRenderer.

```ts
import audio from '@ohos.multimedia.audio';
import fs from '@ohos.file.fs';

const TAG = 'AudioRendererDemo';

let context = getContext(this);
let renderModel: audio.AudioRenderer | undefined = undefined;
let audioStreamInfo: audio.AudioStreamInfo = {
  samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
  channels: audio.AudioChannel.CHANNEL_2, // Channels.
  sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
  encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
};
let audioRendererInfo: audio.AudioRendererInfo = {
  content: audio.ContentType.CONTENT_TYPE_MUSIC, // Media type.
  usage: audio.StreamUsage.STREAM_USAGE_MEDIA, // Audio stream usage type.
  rendererFlags: 0 // AudioRenderer flag.
};
let audioRendererOptions: audio.AudioRendererOptions = {
  streamInfo: audioStreamInfo,
  rendererInfo: audioRendererInfo
};

// Create an AudioRenderer instance, and set the events to listen for.
async function init() {
  audio.createAudioRenderer(audioRendererOptions, (err, renderer) => { // Create an AudioRenderer instance.
    if (!err) {
      console.info(`${TAG}: creating AudioRenderer success`);
      renderModel = renderer;
      if (renderModel !== undefined) {
        (renderModel as audio.AudioRenderer).on('stateChange', (state: audio.AudioState) => { // Set the events to listen for. A callback is invoked when the AudioRenderer is switched to the specified state.
          if (state === audio.AudioState.STATE_RUNNING) {
            console.info('audio renderer state is: STATE_RUNNING');
          }
        });
        (renderModel as audio.AudioRenderer).on('markReach', 1000, (position: number) => { // Subscribe to the markReach event. A callback is triggered when the number of rendered frames reaches 1000.
          if (position === 1000) {
            console.info('ON Triggered successfully');
          }
        });
      }
    } else {
      console.error(`${TAG}: creating AudioRenderer failed, error: ${err.message}`);
    }
  });
}

// Start audio rendering.
async function start() {
  if (renderModel !== undefined) {
    let stateGroup = [audio.AudioState.STATE_PREPARED, audio.AudioState.STATE_PAUSED, audio.AudioState.STATE_STOPPED];
    if (stateGroup.indexOf((renderModel as audio.AudioRenderer).state.valueOf()) === -1) { // Rendering can be started only when the AudioRenderer is in the prepared, paused, or stopped state.
      console.error(`${TAG}: start failed`);
      return;
    }
    await (renderModel as audio.AudioRenderer).start(); // Start rendering.

    const bufferSize = await (renderModel as audio.AudioRenderer).getBufferSize();

    let path = context.filesDir;
    const filePath = path + '/test.wav'; // Use the sandbox path to obtain the file. The actual file path is /data/storage/el2/base/haps/entry/files/test.wav.

    let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
    let stat = await fs.stat(filePath);
    let buf = new ArrayBuffer(bufferSize);
    let len = stat.size % bufferSize === 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);
    class Options {
      offset: number = 0;
      length: number = 0;
    }
    for (let i = 0; i < len; i++) {
      let options: Options = {
        offset: i * bufferSize,
        length: bufferSize
      };
      let readSize = await fs.read(file.fd, buf, options);

      // buf indicates the audio data to be written to the buffer. Before calling AudioRenderer.write(), you can preprocess the audio data for personalized playback. The AudioRenderer reads the audio data written to the buffer for rendering.

      let writeSize: number = await (renderModel as audio.AudioRenderer).write(buf);
      if ((renderModel as audio.AudioRenderer).state.valueOf() === audio.AudioState.STATE_RELEASED) { // The rendering stops if the AudioRenderer is in the released state.
        fs.close(file);
        await (renderModel as audio.AudioRenderer).stop();
      }
      if ((renderModel as audio.AudioRenderer).state.valueOf() === audio.AudioState.STATE_RUNNING) {
        if (i === len - 1) { // The rendering stops when the file has been fully read.
          fs.close(file);
          await (renderModel as audio.AudioRenderer).stop();
        }
      }
    }
  }
}

// Pause the rendering.
async function pause() {
  if (renderModel !== undefined) {
    // Rendering can be paused only when the AudioRenderer is in the running state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING) {
      console.info('Renderer is not running');
      return;
    }
    await (renderModel as audio.AudioRenderer).pause(); // Pause rendering.
    if ((renderModel as audio.AudioRenderer).state.valueOf() === audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is paused.');
    } else {
      console.error('Pausing renderer failed.');
    }
  }
}

// Stop rendering.
async function stop() {
  if (renderModel !== undefined) {
    // Rendering can be stopped only when the AudioRenderer is in the running or paused state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING && (renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is not running or paused.');
      return;
    }
    await (renderModel as audio.AudioRenderer).stop(); // Stop rendering.
    if ((renderModel as audio.AudioRenderer).state.valueOf() === audio.AudioState.STATE_STOPPED) {
      console.info('Renderer stopped.');
    } else {
      console.error('Stopping renderer failed.');
    }
  }
}

// Release the instance.
async function release() {
  if (renderModel !== undefined) {
    // The AudioRenderer can be released only when it is not in the released state.
    if (renderModel.state.valueOf() === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer already released');
      return;
    }
    await renderModel.release(); // Release the instance.
    if (renderModel.state.valueOf() === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer released');
    } else {
      console.error('Renderer release failed.');
    }
  }
}
```

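
If the **markReach** notification set up in **init()** is only needed once, the subscription can be cancelled afterwards. A minimal sketch, assuming the renderer has been created:

```ts
// Cancel the markReach subscription once it is no longer needed.
(renderModel as audio.AudioRenderer).off('markReach');
```
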
When audio streams with the same or higher priority need to use the output device, the current audio playback will be interrupted. The application can respond to and handle the interruption event. For details about how to process concurrent audio playback, see [Audio Playback Concurrency Policies](audio-playback-concurrency.md).

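
As a starting point for handling such interruptions, the sketch below subscribes to the **audioInterrupt** event on an existing **audioRenderer** instance and reacts to the hint carried by the event. It is a minimal sketch only; the full handling policy is described in the linked topic.

```ts
// Minimal sketch: react to audio interruption events on an existing audioRenderer instance.
audioRenderer.on('audioInterrupt', (interruptEvent: audio.InterruptEvent) => {
  if (interruptEvent.forceType === audio.InterruptForceType.INTERRUPT_FORCE) {
    // The system has already acted on the stream (for example, paused it); update the application state.
    if (interruptEvent.hintType === audio.InterruptHint.INTERRUPT_HINT_PAUSE) {
      console.info('Rendering was paused by the system.');
    }
  } else if (interruptEvent.hintType === audio.InterruptHint.INTERRUPT_HINT_RESUME) {
    // INTERRUPT_SHARE: the application may resume rendering itself.
    audioRenderer.start().then(() => {
      console.info('Rendering resumed after interruption.');
    }).catch(() => {
      console.error('Resuming rendering after interruption failed.');
    });
  }
});
```
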