# Video Playback (ArkTS)

The system provides two solutions for video playback development:

- **AVPlayer** class: provides ArkTS and JS APIs to implement audio and video playback. It also supports parsing streaming media and local assets, decapsulating media assets, decoding video, and rendering video. It is applicable to end-to-end playback of media assets and can be used to play video files in MP4 and MKV formats.

- **\<Video>** component: encapsulates basic video playback capabilities. It can be used to play video files after the data source and basic information are set. However, its scalability is poor. This component is provided by ArkUI. For details about how to use this component for video playback development, see [Video Component](../ui/arkts-common-components-video-player.md).

In this topic, you will learn how to use the AVPlayer to develop a video playback service that plays a complete video file. If you want the application to continue playing the video in the background or when the screen is off, you must use the [AVSession](avsession-overview.md) and [continuous task](../task-management/continuous-task.md) to prevent the playback from being forcibly interrupted by the system.

The full playback process includes creating an **AVPlayer** instance, setting the media asset to play and the window to display the video, setting playback parameters (volume, speed, and scale type), controlling playback (play, pause, seek, and stop), resetting the playback configuration, and releasing the instance. During application development, you can use the **state** attribute of the AVPlayer to obtain the AVPlayer state or call **on('stateChange')** to listen for state changes. If the application performs an operation when the AVPlayer is not in the given state, the system may throw an exception or exhibit other undefined behavior.

**Figure 1** Playback state transition

![Playback state change](figures/video-playback-status-change.png)

For details about the state, see [AVPlayerState](../reference/apis-media-kit/js-apis-media.md#avplayerstate9). When the AVPlayer is in the **prepared**, **playing**, **paused**, or **completed** state, the playback engine is working and a large amount of RAM is occupied. If your application does not need to use the AVPlayer, call **reset()** or **release()** to release the instance.

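The state machine also determines which control APIs are valid at a given time. The snippet below is a minimal sketch, not part of the official sample, that checks the **state** attribute before calling **seek()** so the request is issued only in states where it is supported (the helper name **seekIfAllowed** is illustrative):

```ts
import media from '@ohos.multimedia.media';

// Minimal sketch: issue seek() only in states that support it.
// The helper name seekIfAllowed is illustrative and not part of the Media Kit API.
function seekIfAllowed(avPlayer: media.AVPlayer, positionMs: number): void {
  const seekableStates: string[] = ['prepared', 'playing', 'paused', 'completed'];
  if (seekableStates.includes(avPlayer.state)) {
    avPlayer.seek(positionMs);
  } else {
    console.info(`AVPlayer state ${avPlayer.state} does not support seek.`);
  }
}
```
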
## How to Develop

Read [AVPlayer](../reference/apis-media-kit/js-apis-media.md#avplayer9) for the API reference.

1. Call **createAVPlayer()** to create an **AVPlayer** instance. The AVPlayer enters the **idle** state.

2. Set the events to listen for, which will be used in the full-process scenario. The table below lists the supported events.
   | Event Type | Description |
   | -------- | -------- |
   | stateChange | Mandatory; used to listen for changes of the **state** attribute of the AVPlayer.|
   | error | Mandatory; used to listen for AVPlayer errors.|
   | durationUpdate | Used to listen for progress bar updates to refresh the media asset duration.|
   | timeUpdate | Used to listen for the current position of the progress bar to refresh the current time.|
   | seekDone | Used to listen for the completion status of the **seek()** request.<br>This event is reported when the AVPlayer seeks to the playback position specified in **seek()**.|
   | speedDone | Used to listen for the completion status of the **setSpeed()** request.<br>This event is reported when the AVPlayer plays video at the speed specified in **setSpeed()**.|
   | volumeChange | Used to listen for the completion status of the **setVolume()** request.<br>This event is reported when the AVPlayer plays video at the volume specified in **setVolume()**.|
   | bitrateDone | Used to listen for the completion status of the **setBitrate()** request, which is used for HTTP Live Streaming (HLS) streams.<br>This event is reported when the AVPlayer plays video at the bit rate specified in **setBitrate()**.|
   | availableBitrates | Used to listen for available bit rates of HLS resources. The available bit rates are provided for **setBitrate()**.|
   | bufferingUpdate | Used to listen for network playback buffer information.|
   | startRenderFrame | Used to listen for the rendering time of the first frame during video playback.<br>This event is reported when the AVPlayer enters the playing state and the first frame of the video image is rendered to the display. Generally, the application can use this event to remove the video cover so that the cover transitions smoothly to the video image.|
   | videoSizeChange | Used to listen for the width and height of video playback and adjust the window size and ratio.|
   | audioInterrupt | Used to listen for audio interruption. This event is used together with the **audioInterruptMode** attribute.<br>This event is reported when the current audio playback is interrupted by another audio stream (for example, when a call comes in), so the application can process the event in time.|

3. Set the media asset URL. The AVPlayer enters the **initialized** state.
   > **NOTE**
   >
   > The URL in the code snippet below is for reference only. You need to check the media asset validity and set the URL based on service requirements.
   >
   > - If local files are used for playback, ensure that the files are available and the application sandbox path is used for access. For details about how to obtain the application sandbox path, see [Obtaining Application File Paths](../application-models/application-context-stage.md#obtaining-application-file-paths). For details about the application sandbox and how to push files to the application sandbox, see [File Management](../file-management/app-sandbox-directory.md).
   >
   > - If a network playback path is used, you must [declare the ohos.permission.INTERNET permission](../security/AccessToken/declare-permissions.md).
   >
   > - You can also use **ResourceManager.getRawFd** to obtain the FD of a file packed in the HAP file. For details, see [ResourceManager API Reference](../reference/apis-localization-kit/js-apis-resource-manager.md#getrawfd9).
   >
   > - The [playback formats and protocols](media-kit-intro.md#supported-formats-and-protocols) in use must be those supported by the system.

4. Obtain and set the surface ID of the window to display the video.
   The application obtains the surface ID from the **\<XComponent>**. For details about the process, see [XComponent](../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md). A minimal sketch of this step is provided after this procedure.

5. Call **prepare()** to switch the AVPlayer to the **prepared** state. In this state, you can obtain the duration of the media asset to play and set the scale type and volume.

6. Call **play()**, **pause()**, **seek()**, and **stop()** to perform video playback control as required.

7. (Optional) Call **reset()** to reset the AVPlayer. The AVPlayer enters the **idle** state again and you can change the media asset URL.

8. Call **release()** to switch the AVPlayer to the **released** state. Now your application exits the playback.

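Step 4 above requires a surface ID obtained from an **\<XComponent>**. The sample code in the next section assumes that this ID is already available in the **surfaceID** variable; the following is a minimal sketch of how a page might obtain it (the struct name **VideoPage** and the component ID **videoSurface** are illustrative):

```ts
// Minimal sketch: obtain the surface ID from an XComponent on an ArkUI page.
// The struct name VideoPage and the XComponent id videoSurface are illustrative.
@Entry
@Component
struct VideoPage {
  private xComponentController: XComponentController = new XComponentController();
  private surfaceID: string = '';

  build() {
    Column() {
      XComponent({ id: 'videoSurface', type: 'surface', controller: this.xComponentController })
        .width('100%')
        .height('30%')
        .onLoad(() => {
          // The surface ID becomes available once the XComponent has loaded.
          this.surfaceID = this.xComponentController.getXComponentSurfaceId();
          // Pass this.surfaceID to the playback code so that avPlayer.surfaceId
          // can be set when the AVPlayer reaches the initialized state.
        })
    }
  }
}
```
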
## Sample Code

```ts
import media from '@ohos.multimedia.media';
import fs from '@ohos.file.fs';
import common from '@ohos.app.ability.common';
import { BusinessError } from '@ohos.base';

export class AVPlayerDemo {
  private count: number = 0;
  private surfaceID: string = ''; // The surfaceID parameter specifies the window used to display the video. Its value is obtained through the XComponent.
  private isSeek: boolean = true; // Specify whether the seek operation is supported.
  private fileSize: number = -1;
  private fd: number = 0;
  // Set AVPlayer callback functions.
  setAVPlayerCallback(avPlayer: media.AVPlayer) {
    // startRenderFrame: callback function invoked when the first frame starts rendering.
    avPlayer.on('startRenderFrame', () => {
      console.info(`AVPlayer start render frame`);
    })
    // Callback function for the seek operation.
    avPlayer.on('seekDone', (seekDoneTime: number) => {
      console.info(`AVPlayer seek succeeded, seek time is ${seekDoneTime}`);
    })
    // Callback function for errors. If an error occurs during the operation on the AVPlayer, reset() is called to reset the AVPlayer.
    avPlayer.on('error', (err: BusinessError) => {
      console.error(`Invoke avPlayer failed, code is ${err.code}, message is ${err.message}`);
      avPlayer.reset(); // Call reset() to reset the AVPlayer, which enters the idle state.
    })
    // Callback function for state changes.
    avPlayer.on('stateChange', async (state: string, reason: media.StateChangeReason) => {
      switch (state) {
        case 'idle': // This state is reported upon a successful callback of reset().
          console.info('AVPlayer state idle called.');
          avPlayer.release(); // Call release() to release the instance.
          break;
        case 'initialized': // This state is reported when the AVPlayer sets the playback source.
          console.info('AVPlayer state initialized called.');
          avPlayer.surfaceId = this.surfaceID; // Set the window to display the video. This setting is not required when a pure audio asset is to be played.
          avPlayer.prepare();
          break;
        case 'prepared': // This state is reported upon a successful callback of prepare().
          console.info('AVPlayer state prepared called.');
          avPlayer.play(); // Call play() to start playback.
          break;
        case 'playing': // This state is reported upon a successful callback of play().
          console.info('AVPlayer state playing called.');
          if (this.count !== 0) {
            if (this.isSeek) {
              console.info('AVPlayer start to seek.');
              avPlayer.seek(avPlayer.duration); // Call seek() to seek to the end of the video clip.
            } else {
              // When the seek operation is not supported, the playback continues until it reaches the end.
              console.info('AVPlayer wait to play end.');
            }
          } else {
            avPlayer.pause(); // Call pause() to pause the playback.
          }
          this.count++;
          break;
        case 'paused': // This state is reported upon a successful callback of pause().
          console.info('AVPlayer state paused called.');
          avPlayer.play(); // Call play() again to start playback.
          break;
        case 'completed': // This state is reported upon the completion of the playback.
          console.info('AVPlayer state completed called.');
          avPlayer.stop(); // Call stop() to stop the playback.
          break;
        case 'stopped': // This state is reported upon a successful callback of stop().
          console.info('AVPlayer state stopped called.');
          avPlayer.reset(); // Call reset() to reset the AVPlayer state.
          break;
        case 'released':
          console.info('AVPlayer state released called.');
          break;
        default:
          console.info('AVPlayer state unknown called.');
          break;
      }
    })
  }

  // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file using the URL attribute.
  async avPlayerUrlDemo() {
    // Create an AVPlayer instance.
    let avPlayer: media.AVPlayer = await media.createAVPlayer();
    // Set a callback function for state changes.
    this.setAVPlayerCallback(avPlayer);
    let fdPath = 'fd://';
    let context = getContext(this) as common.UIAbilityContext;
    // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
    let pathDir = context.filesDir;
    let path = pathDir + '/H264_AAC.mp4';
    // Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
    let file = await fs.open(path);
    fdPath = fdPath + '' + file.fd;
    this.isSeek = true; // The seek operation is supported.
    avPlayer.url = fdPath;
  }

  // The following demo shows how to use resourceManager to obtain the media file packed in the HAP file and play the media file by using the fdSrc attribute.
  async avPlayerFdSrcDemo() {
    // Create an AVPlayer instance.
    let avPlayer: media.AVPlayer = await media.createAVPlayer();
    // Set a callback function for state changes.
    this.setAVPlayerCallback(avPlayer);
    // Call getRawFd of the resourceManager member of UIAbilityContext to obtain the media asset URL.
    // The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the length of the media asset to play.
    let context = getContext(this) as common.UIAbilityContext;
    let fileDescriptor = await context.resourceManager.getRawFd('H264_AAC.mp4');
    let avFileDescriptor: media.AVFileDescriptor =
      { fd: fileDescriptor.fd, offset: fileDescriptor.offset, length: fileDescriptor.length };
    this.isSeek = true; // The seek operation is supported.
    // Assign a value to fdSrc to trigger the reporting of the initialized state.
    avPlayer.fdSrc = avFileDescriptor;
  }

  // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
  async avPlayerDataSrcSeekDemo() {
    // Create an AVPlayer instance.
    let avPlayer: media.AVPlayer = await media.createAVPlayer();
    // Set a callback function for state changes.
    this.setAVPlayerCallback(avPlayer);
    // dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
    let src: media.AVDataSrcDescriptor = {
      fileSize: -1,
      callback: (buf: ArrayBuffer, length: number, pos: number | undefined) => {
        let num = 0;
        if (buf == undefined || length == undefined || pos == undefined) {
          return -1;
        }
        num = fs.readSync(this.fd, buf, { offset: pos, length: length });
        if (num > 0 && (this.fileSize >= pos)) {
          return num;
        }
        return -1;
      }
    }
    let context = getContext(this) as common.UIAbilityContext;
    // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
    let pathDir = context.filesDir;
    let path = pathDir + '/H264_AAC.mp4';
    await fs.open(path).then((file: fs.File) => {
      this.fd = file.fd;
    })
    // Obtain the size of the file to be played.
    this.fileSize = fs.statSync(path).size;
    src.fileSize = this.fileSize;
    this.isSeek = true; // The seek operation is supported.
    avPlayer.dataSrc = src;
  }

  // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
  async avPlayerDataSrcNoSeekDemo() {
    // Create an AVPlayer instance.
    let avPlayer: media.AVPlayer = await media.createAVPlayer();
    // Set a callback function for state changes.
    this.setAVPlayerCallback(avPlayer);
    let context = getContext(this) as common.UIAbilityContext;
    let src: media.AVDataSrcDescriptor = {
      fileSize: -1,
      callback: (buf: ArrayBuffer, length: number) => {
        let num = 0;
        if (buf == undefined || length == undefined) {
          return -1;
        }
        num = fs.readSync(this.fd, buf);
        if (num > 0) {
          return num;
        }
        return -1;
      }
    }
    // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
    let pathDir = context.filesDir;
    let path = pathDir + '/H264_AAC.mp4';
    await fs.open(path).then((file: fs.File) => {
      this.fd = file.fd;
    })
    this.isSeek = false; // The seek operation is not supported.
    avPlayer.dataSrc = src;
  }

  // The following demo shows how to play live streams by setting the network address through the URL.
  async avPlayerLiveDemo() {
    // Create an AVPlayer instance.
    let avPlayer: media.AVPlayer = await media.createAVPlayer();
    // Set a callback function for state changes.
    this.setAVPlayerCallback(avPlayer);
    this.isSeek = false; // The seek operation is not supported.
    avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8'; // Play live webcasting streams using HLS.
  }
}
```
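
The sample registers only a subset of the events listed in step 2. The **audioInterrupt** event can be handled in the same way; the following is a minimal sketch that pauses playback when the audio focus is taken by another stream (the pause-on-interrupt policy and the helper name **registerInterruptCallback** are illustrative):

```ts
import media from '@ohos.multimedia.media';
import audio from '@ohos.multimedia.audio';

// Minimal sketch: pause playback when another audio stream (for example, an incoming call)
// interrupts the current playback. The handling policy chosen here is illustrative.
function registerInterruptCallback(avPlayer: media.AVPlayer) {
  avPlayer.audioInterruptMode = audio.InterruptMode.INDEPENDENT_MODE; // Handle interruption independently of other streams.
  avPlayer.on('audioInterrupt', (info: audio.InterruptEvent) => {
    console.info(`AVPlayer audioInterrupt, hintType: ${info.hintType}`);
    if (info.hintType === audio.InterruptHint.INTERRUPT_HINT_PAUSE ||
      info.hintType === audio.InterruptHint.INTERRUPT_HINT_STOP) {
      avPlayer.pause();
    }
  });
}
```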