# Video Encoding

You can call the native APIs provided by the VideoEncoder module to encode a video, that is, to compress video data into a video stream.

<!--RP3--><!--RP3End-->

For details about the supported encoding capabilities, see [AVCodec Supported Formats](avcodec-support-formats.md#video-encoding).

<!--RP1--><!--RP1End-->

The following table lists the supported video encoding capabilities:

<!--RP4-->
|          Capability                      |                              How to Use                                           |
| --------------------------------------- | ---------------------------------------------------------------------------------- |
| Layered encoding<br> Setting the LTR frame and reference frame                     | For details, see [Temporally Scalable Video Coding](video-encoding-temporal-scalability.md).       |
| Repeat encoding of historical frames                   | For details, see [OH_MD_KEY_VIDEO_ENCODER_REPEAT_PREVIOUS_FRAME_AFTER](../../reference/apis-avcodec-kit/_codec_base.md#oh_md_key_video_encoder_repeat_previous_frame_after) and<br> [OH_MD_KEY_VIDEO_ENCODER_REPEAT_PREVIOUS_MAX_COUNT](../../reference/apis-avcodec-kit/_codec_base.md#oh_md_key_video_encoder_repeat_previous_max_count).   |
<!--RP4End-->

## Constraints

- The buffer mode does not support 10-bit image data.
- Due to limited hardware encoder resources, you must call **OH_VideoEncoder_Destroy** to destroy every encoder instance when it is no longer needed.
- If **flush()**, **reset()**, **stop()**, or **destroy()** is executed in a non-callback thread, the execution result is returned after all callbacks are executed.
- Once **Flush**, **Reset**, or **Stop** is called, the system reclaims the OH_AVBuffer. Therefore, do not continue to operate the OH_AVBuffer obtained through the previous callback function.
- The buffer mode and surface mode use the same APIs. Therefore, this topic describes the surface mode as an example.
- In buffer mode, after obtaining the pointer to an OH_AVBuffer instance through the callback function **OH_AVCodecOnNeedInputBuffer**, call **OH_VideoEncoder_PushInputBuffer** to notify the system that the buffer has been fully utilized. In this way, the system will proceed with encoding the data contained in the buffer. If the OH_NativeBuffer instance is obtained through **OH_AVBuffer_GetNativeBuffer** and its lifecycle extends beyond that of the OH_AVBuffer pointer instance, you must perform data duplication. In this case, you should manage the lifecycle of the newly generated OH_NativeBuffer object to ensure that the object can be correctly used and released.
<!--RP14--><!--RP14End-->

## Surface Input and Buffer Input

- Surface input and buffer input differ in data sources.

- They are applicable to different scenarios.
  - Surface input means that an OHNativeWindow is used to transfer input data. It supports connection with other modules, such as the camera module.
  - Buffer input means that a pre-allocated memory area is used. The caller needs to copy original data to this memory area. It is more applicable to scenarios such as reading video data from files.

- The two also differ slightly in the API calling modes:
  - In buffer mode, the caller calls **OH_VideoEncoder_PushInputBuffer** to input data. In surface mode, the caller calls **OH_VideoEncoder_GetSurface** before the encoder is ready, to obtain the OHNativeWindow for video data transmission.
  - In buffer mode, you can use **attr** in **OH_AVBuffer** to pass in the End of Stream (EOS) flag, and the encoder stops when it reads the last frame. In surface mode, the caller calls **OH_VideoEncoder_NotifyEndOfStream** to notify the encoder of EOS.

For details about the development procedure, see [Surface Input](#surface-input) and [Buffer Input](#buffer-input).

## State Machine Interaction

The following figure shows the interaction between states.

![Invoking relationship of state](figures/state-invocation.png)

1. An encoder enters the Initialized state in either of the following ways:
   - When an encoder instance is initially created, the encoder enters the Initialized state.
   - When **OH_VideoEncoder_Reset** is called in any state, the encoder returns to the Initialized state.

2. When the encoder is in the Initialized state, you can call **OH_VideoEncoder_Configure** to configure the encoder. After the configuration, the encoder enters the Configured state.
3. When the encoder is in the Configured state, you can call **OH_VideoEncoder_Prepare()** to switch it to the Prepared state.
4. When the encoder is in the Prepared state, you can call **OH_VideoEncoder_Start** to switch it to the Executing state.
   - When the encoder is in the Executing state, you can call **OH_VideoEncoder_Stop** to switch it back to the Prepared state.

5. In rare cases, the encoder may encounter an error and enter the Error state. In this case, a queue operation returns an invalid value or throws an exception.
   - When the encoder is in the Error state, you can either call **OH_VideoEncoder_Reset** to switch it to the Initialized state or call **OH_VideoEncoder_Destroy** to switch it to the Released state.

6. The Executing state has three substates: Flushed, Running, and End-of-Stream.
   - After **OH_VideoEncoder_Start** is called, the encoder enters the Running substate immediately.
   - When the encoder is in the Executing state, you can call **OH_VideoEncoder_Flush** to switch it to the Flushed substate.
   - After all the data to be processed is transferred to the encoder, the [AVCODEC_BUFFER_FLAGS_EOS](../../reference/apis-avcodec-kit/_core.md#oh_avcodecbufferflags-1) flag is added to the last input buffer in the input buffer queue. Once this flag is detected, the encoder transitions to the End-of-Stream substate. In this state, the encoder does not accept new inputs, but continues to generate outputs until it reaches the tail frame.

7. When the encoder is no longer needed, you must call **OH_VideoEncoder_Destroy** to destroy the encoder instance, which then transitions to the Released state.

## How to Develop

Read [VideoEncoder](../../reference/apis-avcodec-kit/_video_encoder.md) for the API reference.

The figure below shows the call relationship of video encoding.

- The dotted line indicates an optional operation.

- The solid line indicates a mandatory operation.

![Call relationship of video encoding](figures/video-encode.png)

### Linking the Dynamic Libraries in the CMake Script

```cmake
target_link_libraries(sample PUBLIC libnative_media_codecbase.so)
target_link_libraries(sample PUBLIC libnative_media_core.so)
target_link_libraries(sample PUBLIC libnative_media_venc.so)
```

> **NOTE**
>
> The word **sample** in the preceding code snippet is only an example. Use the actual project directory name.
>

### Defining the Basic Structure

The sample code provided in this section adheres to the C++17 standard and is for reference only. You can define your own buffer objects by referring to it.

1. Add the header files.

    ```c++
    #include <chrono>
    #include <condition_variable>
    #include <memory>
    #include <mutex>
    #include <queue>
    #include <shared_mutex>
    ```

2. Define the information about the encoder callback buffer.

    ```c++
    struct CodecBufferInfo {
        CodecBufferInfo(uint32_t index, OH_AVBuffer *buffer): index(index), buffer(buffer), isValid(true) {}
        CodecBufferInfo(uint32_t index, OH_AVFormat *parameter): index(index), parameter(parameter), isValid(true) {}
        // Callback buffer.
        OH_AVBuffer *buffer = nullptr;
        // In surface mode, the frame-specific parameter of the callback, which can be used only after the frame-specific parameter callback function is registered.
        OH_AVFormat *parameter = nullptr;
        // Index of the callback buffer.
        uint32_t index = 0;
        // Whether the current buffer information is valid.
        bool isValid = true;
    };
    ```

3. Define the input and output queue for encoding.

    ```c++
    class CodecBufferQueue {
    public:
        // Pass the callback buffer information to the queue.
        void Enqueue(const std::shared_ptr<CodecBufferInfo> bufferInfo)
        {
            std::unique_lock<std::mutex> lock(mutex_);
            bufferQueue_.push(bufferInfo);
            cond_.notify_all();
        }

        // Obtain the information about the callback buffer.
        std::shared_ptr<CodecBufferInfo> Dequeue(int32_t timeoutMs = 1000)
        {
            std::unique_lock<std::mutex> lock(mutex_);
            (void)cond_.wait_for(lock, std::chrono::milliseconds(timeoutMs), [this]() { return !bufferQueue_.empty(); });
            if (bufferQueue_.empty()) {
                return nullptr;
            }
            std::shared_ptr<CodecBufferInfo> bufferInfo = bufferQueue_.front();
            bufferQueue_.pop();
            return bufferInfo;
        }

        // Clear the queue. The previous callback buffers become unavailable.
        void Flush()
        {
            std::unique_lock<std::mutex> lock(mutex_);
            while (!bufferQueue_.empty()) {
                std::shared_ptr<CodecBufferInfo> bufferInfo = bufferQueue_.front();
                // After the flush, stop, reset, or destroy operation is performed, the previous callback buffer information becomes invalid.
                bufferInfo->isValid = false;
                bufferQueue_.pop();
            }
        }

    private:
        std::mutex mutex_;
        std::condition_variable cond_;
        std::queue<std::shared_ptr<CodecBufferInfo>> bufferQueue_;
    };
    ```
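    The hand-off in **CodecBufferQueue** (a callback thread enqueues, a worker thread dequeues with a timeout) can be tried out independently of the codec. The following is a minimal, platform-independent sketch of the same pattern; it uses plain integer indices instead of OH_AVBuffer pointers, and **IndexQueue** is an illustrative name, not part of the API.

    ```cpp
    #include <chrono>
    #include <condition_variable>
    #include <iostream>
    #include <mutex>
    #include <optional>
    #include <queue>
    #include <thread>

    // Simplified stand-in for CodecBufferQueue: the payload is just a buffer index.
    class IndexQueue {
    public:
        void Enqueue(int index)
        {
            std::unique_lock<std::mutex> lock(mutex_);
            queue_.push(index);
            cond_.notify_all();
        }

        // Returns std::nullopt if no element arrives within timeoutMs.
        std::optional<int> Dequeue(int32_t timeoutMs = 1000)
        {
            std::unique_lock<std::mutex> lock(mutex_);
            (void)cond_.wait_for(lock, std::chrono::milliseconds(timeoutMs), [this]() { return !queue_.empty(); });
            if (queue_.empty()) {
                return std::nullopt;
            }
            int index = queue_.front();
            queue_.pop();
            return index;
        }

    private:
        std::mutex mutex_;
        std::condition_variable cond_;
        std::queue<int> queue_;
    };

    int main()
    {
        IndexQueue inQueue;
        // The producer stands in for the OH_AVCodecOnNeedInputBuffer callback thread.
        std::thread producer([&inQueue]() {
            for (int i = 0; i < 3; i++) {
                inQueue.Enqueue(i);
            }
        });
        // The consumer stands in for the encoding worker thread.
        int received = 0;
        for (int i = 0; i < 3; i++) {
            if (inQueue.Dequeue(200).has_value()) {
                received++;
            }
        }
        producer.join();
        std::cout << received << std::endl; // All three indices are handed off.
        return 0;
    }
    ```

    The timed wait is what keeps the worker thread from blocking forever if the encoder stops delivering callbacks, which is also why **Dequeue** in the real queue returns a null pointer that the caller must check.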

4. Configure global variables.

    These global variables are for reference only. They can be encapsulated into an object based on service requirements.

    ```c++
    // Video frame width.
    int32_t width = 320;
    // Video frame height.
    int32_t height = 240;
    // Video pixel format.
    OH_AVPixelFormat pixelFormat = AV_PIXEL_FORMAT_NV12;
    // Video width stride.
    int32_t widthStride = 0;
    // Video height stride.
    int32_t heightStride = 0;
    // Pointer to the encoder instance.
    OH_AVCodec *videoEnc = nullptr;
    // Encoder synchronization lock.
    std::shared_mutex codecMutex;
    // Encoder input queue.
    CodecBufferQueue inQueue;
    // Encoder output queue.
    CodecBufferQueue outQueue;
    ```
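    The width stride and height stride above matter mainly in buffer mode: the encoder may require each row of image data to be padded to **widthStride** bytes, so a tightly packed NV12 frame cannot always be copied with a single memcpy. The following self-contained sketch (illustrative values and function name, no codec APIs) shows the row-by-row copy for the Y plane and the interleaved UV plane:

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <cstring>
    #include <iostream>
    #include <vector>

    // Copy a tightly packed NV12 frame into a destination whose rows are padded to widthStride bytes.
    // The Y plane has `height` rows; the interleaved UV plane has `height / 2` rows of `width` bytes.
    void CopyNv12WithStride(const uint8_t *src, uint8_t *dst, int32_t width, int32_t height,
                            int32_t widthStride, int32_t heightStride)
    {
        // Y plane: copy `width` bytes per row, advancing the destination by widthStride.
        for (int32_t row = 0; row < height; row++) {
            std::memcpy(dst + row * widthStride, src + row * width, width);
        }
        // The UV plane starts heightStride padded rows into the destination.
        const uint8_t *srcUv = src + width * height;
        uint8_t *dstUv = dst + widthStride * heightStride;
        for (int32_t row = 0; row < height / 2; row++) {
            std::memcpy(dstUv + row * widthStride, srcUv + row * width, width);
        }
    }

    int main()
    {
        const int32_t width = 4, height = 2, widthStride = 8, heightStride = 2;
        std::vector<uint8_t> src(width * height * 3 / 2);
        for (size_t i = 0; i < src.size(); i++) {
            src[i] = static_cast<uint8_t>(i + 1);
        }
        std::vector<uint8_t> dst(widthStride * heightStride * 3 / 2, 0);
        CopyNv12WithStride(src.data(), dst.data(), width, height, widthStride, heightStride);
        // The first Y row lands at offset 0, the second at widthStride; padding bytes stay 0.
        assert(dst[0] == 1 && dst[widthStride] == 5 && dst[4] == 0);
        // The UV plane lands right after heightStride padded rows.
        assert(dst[widthStride * heightStride] == 9);
        std::cout << "ok" << std::endl;
        return 0;
    }
    ```

    In real code, the stride values come from **OH_VideoEncoder_GetInputDescription** (see the buffer-input callback later in this topic), not from assumptions about the frame width.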

### Surface Input

The following walks you through how to implement the entire video encoding process in surface mode. In this example, surface data is input and encoded into an H.264 stream.

Currently, the VideoEncoder module supports only data rotation in asynchronous mode.

1. Add the header files.

    ```c++
    #include <multimedia/player_framework/native_avcodec_videoencoder.h>
    #include <multimedia/player_framework/native_avcapability.h>
    #include <multimedia/player_framework/native_avcodec_base.h>
    #include <multimedia/player_framework/native_avformat.h>
    #include <multimedia/player_framework/native_avbuffer.h>
    #include <fstream>
    ```

2. Create an encoder instance.

    You can create an encoder by name or MIME type. In the code snippet below, the following variables are used:

    - **videoEnc**: pointer to the video encoder instance.
    - **capability**: pointer to the encoder's capability.
    - **OH_AVCODEC_MIMETYPE_VIDEO_AVC**: AVC video codec.

    The following is an example:

    ```c++
    // Create an encoder by name. If your application has special requirements, for example, expecting an encoder that supports a certain resolution, you can call OH_AVCodec_GetCapability to query the capability first.
    OH_AVCapability *capability = OH_AVCodec_GetCapability(OH_AVCODEC_MIMETYPE_VIDEO_AVC, true);
    // Alternatively, query for a hardware encoder explicitly:
    // OH_AVCapability *capability = OH_AVCodec_GetCapabilityByCategory(OH_AVCODEC_MIMETYPE_VIDEO_AVC, true, HARDWARE);
    const char *codecName = OH_AVCapability_GetName(capability);
    OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByName(codecName);
    ```

    ```c++
    // Create an encoder by MIME type. Only specific codecs recommended by the system can be created in this way.
    // Only hardware encoders can be created.
    OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByMime(OH_AVCODEC_MIMETYPE_VIDEO_AVC);
    ```

3. Call **OH_VideoEncoder_RegisterCallback()** to register the callback functions.

    Register the **OH_AVCodecCallback** struct that defines the following callback function pointers:

    - **OH_AVCodecOnError**, a callback used to report a codec operation error. For details about the error codes, see [OH_AVCodecOnError](../../reference/apis-avcodec-kit/_codec_base.md#oh_avcodeconerror).
    - **OH_AVCodecOnStreamChanged**, a callback used to report a codec stream change, for example, a format change.
    - **OH_AVCodecOnNeedInputBuffer**, a callback used to report that input data is required. In surface mode, this callback does not take effect, because data is input through the obtained surface.
    - **OH_AVCodecOnNewOutputBuffer**, a callback used to report that output data has been generated, which means that encoding is complete.

    <!--RP2--><!--RP2End-->

    The following is an example:

    ```c++
    // Set the OH_AVCodecOnError callback function, which is used to report a codec operation error.
    static void OnError(OH_AVCodec *codec, int32_t errorCode, void *userData)
    {
        // Process the error code in the callback.
        (void)codec;
        (void)errorCode;
        (void)userData;
    }
    ```

    <!--RP12-->
    ```c++
    // Set the OH_AVCodecOnStreamChanged callback function, which is used to report an encoding stream change.
    static void OnStreamChanged(OH_AVCodec *codec, OH_AVFormat *format, void *userData)
    {
        // In surface mode, this callback function is triggered when the surface resolution changes.
        (void)codec;
        (void)userData;
        OH_AVFormat_GetIntValue(format, OH_MD_KEY_VIDEO_WIDTH, &width);
        OH_AVFormat_GetIntValue(format, OH_MD_KEY_VIDEO_HEIGHT, &height);
    }
    ```
    <!--RP12End-->

    ```c++
    // Set the OH_AVCodecOnNeedInputBuffer callback function, which is used to send an input frame to the data queue.
    static void OnNeedInputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
    {
        // In surface mode, this callback function does not take effect. Data is input through the obtained surface.
        (void)codec;
        (void)userData;
        (void)index;
        (void)buffer;
    }
    ```

    ```c++
    // Set the OH_AVCodecOnNewOutputBuffer callback function, which is used to send an encoded frame to the output queue.
    static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
    {
        // The data buffer of the finished frame and its index are sent to outQueue.
        (void)codec;
        (void)userData;
        outQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer));
    }
    ```

    ```c++
    // Call OH_VideoEncoder_RegisterCallback() to register the callback functions.
    OH_AVCodecCallback cb = {&OnError, &OnStreamChanged, &OnNeedInputBuffer, &OnNewOutputBuffer};
    int32_t ret = OH_VideoEncoder_RegisterCallback(videoEnc, cb, nullptr); // nullptr: userData is null.
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

    > **NOTE**
    >
    > In the callback functions, pay attention to multi-thread synchronization for operations on the data queue.

4. (Optional) Call **OH_VideoEncoder_RegisterParameterCallback()** to register the frame-specific parameter callback function.

    For details, see [Temporally Scalable Video Coding](video-encoding-temporal-scalability.md).

    <!--RP7-->
    ```c++
    // 4.1 Implement the OH_VideoEncoder_OnNeedInputParameter callback function.
    static void OnNeedInputParameter(OH_AVCodec *codec, uint32_t index, OH_AVFormat *parameter, void *userData)
    {
        // The frame-specific parameter of the input frame and its index are sent to inQueue.
        (void)codec;
        (void)userData;
        inQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, parameter));
    }

    // 4.2 Register the frame-specific parameter callback function.
    OH_VideoEncoder_OnNeedInputParameter inParaCb = OnNeedInputParameter;
    OH_VideoEncoder_RegisterParameterCallback(videoEnc, inParaCb, nullptr); // nullptr: userData is null.
    ```
    <!--RP7End-->

5. Call **OH_VideoEncoder_Configure()** to configure the encoder.

    For details about the configurable options, see [Video Dedicated Key-Value Pairs](../../reference/apis-avcodec-kit/_codec_base.md#media-data-key-value-pairs).

    For details about the parameter verification rules, see [OH_VideoEncoder_Configure()](../../reference/apis-avcodec-kit/_video_encoder.md#oh_videoencoder_configure).

    The parameter value ranges can be obtained through the capability query interface. For details, see [Obtaining Supported Codecs](obtain-supported-codecs.md).

    Currently, the following options must be configured for all supported formats: video frame width, video frame height, and video pixel format. The code snippet below uses the global variables defined in [Defining the Basic Structure](#defining-the-basic-structure):

    - **width**: 320 pixels
    - **height**: 240 pixels
    - **pixelFormat**: **AV_PIXEL_FORMAT_NV12** (the pixel format of the YUV file is NV12)

    ```c++
    // Configure the video frame rate.
    double frameRate = 30.0;
    // Configure the video YUV range flag.
    bool rangeFlag = false;
    // Configure the video primary color.
    int32_t primary = static_cast<int32_t>(OH_ColorPrimary::COLOR_PRIMARY_BT709);
    // Configure the transfer characteristics.
    int32_t transfer = static_cast<int32_t>(OH_TransferCharacteristic::TRANSFER_CHARACTERISTIC_BT709);
    // Configure the maximum matrix coefficient.
    int32_t matrix = static_cast<int32_t>(OH_MatrixCoefficient::MATRIX_COEFFICIENT_IDENTITY);
    // Configure the encoding profile.
    int32_t profile = static_cast<int32_t>(OH_AVCProfile::AVC_PROFILE_HIGH);
    // Configure the encoding bit rate mode.
    int32_t rateMode = static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_VBR);
    // Configure the key frame interval, in milliseconds.
    int32_t iFrameInterval = 1000;
    // Configure the bit rate, in bit/s.
    int64_t bitRate = 5000000;
    // Set the encoding quality.
    int32_t quality = 90;

    OH_AVFormat *format = OH_AVFormat_Create();
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_WIDTH, width); // Mandatory.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_HEIGHT, height); // Mandatory.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_PIXEL_FORMAT, pixelFormat); // Mandatory.

    OH_AVFormat_SetDoubleValue(format, OH_MD_KEY_FRAME_RATE, frameRate);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_RANGE_FLAG, rangeFlag);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_COLOR_PRIMARIES, primary);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_TRANSFER_CHARACTERISTICS, transfer);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_MATRIX_COEFFICIENTS, matrix);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_I_FRAME_INTERVAL, iFrameInterval);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_PROFILE, profile);
    // Configure OH_MD_KEY_QUALITY only when the bit rate mode is BITRATE_MODE_CQ.
    if (rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_CQ)) {
        OH_AVFormat_SetIntValue(format, OH_MD_KEY_QUALITY, quality);
    } else if (rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_CBR) ||
               rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_VBR)) {
        OH_AVFormat_SetLongValue(format, OH_MD_KEY_BITRATE, bitRate);
    }
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_VIDEO_ENCODE_BITRATE_MODE, rateMode);
    int32_t ret = OH_VideoEncoder_Configure(videoEnc, format);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVFormat_Destroy(format);
    ```

    > **NOTE**
    >
    > If an optional parameter is incorrectly configured, the error code **AV_ERR_INVALID_VAL** is returned. However, **OH_VideoEncoder_Configure()** does not fail. Instead, its execution continues with the default value.

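    Note that **OH_MD_KEY_I_FRAME_INTERVAL** is expressed in milliseconds, not frames, so the resulting GOP length in frames depends on the configured frame rate. A quick sanity check of that conversion (plain arithmetic, no codec APIs):

    ```cpp
    #include <cstdint>
    #include <iostream>

    // Convert a key frame interval in milliseconds to a GOP length in frames.
    int32_t GopLengthInFrames(int32_t iFrameIntervalMs, double frameRate)
    {
        return static_cast<int32_t>(frameRate * iFrameIntervalMs / 1000.0);
    }

    int main()
    {
        // With the values used above: 1000 ms at 30 fps gives one key frame every 30 frames.
        std::cout << GopLengthInFrames(1000, 30.0) << std::endl;
        return 0;
    }
    ```
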
6. Obtain a surface.

    Obtain the OHNativeWindow in surface mode. The surface must be obtained before **OH_VideoEncoder_Prepare** is called.

    ```c++
    // Obtain the surface used for data input.
    OHNativeWindow *nativeWindow;
    int32_t ret = OH_VideoEncoder_GetSurface(videoEnc, &nativeWindow);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Use the OHNativeWindow* variable to obtain the address of the data to be filled through the producer interface.
    ```

    For details about how to use the OHNativeWindow* variable, see [OHNativeWindow](../../reference/apis-arkgraphics2d/_native_window.md#ohnativewindow).

7. Call **OH_VideoEncoder_Prepare()** to prepare internal resources for the encoder.

    ```c++
    int32_t ret = OH_VideoEncoder_Prepare(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

8. Call **OH_VideoEncoder_Start()** to start the encoder.

    ```c++
    // Configure the path of the output file.
    std::string_view outputFilePath = "/*yourpath*.h264";
    std::unique_ptr<std::ofstream> outputFile = std::make_unique<std::ofstream>();
    outputFile->open(outputFilePath.data(), std::ios::out | std::ios::binary | std::ios::ate);
    // Start the encoder.
    int32_t ret = OH_VideoEncoder_Start(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

9. (Optional) Call **OH_VideoEncoder_SetParameter()** to dynamically configure encoder parameters during running.

    <!--RP8-->
    ```c++
    OH_AVFormat *format = OH_AVFormat_Create();
    // Dynamically request IDR frames.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_REQUEST_I_FRAME, true);
    int32_t ret = OH_VideoEncoder_SetParameter(videoEnc, format);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVFormat_Destroy(format);
    ```
    <!--RP8End-->

10. Write the image to encode.

    In step 6, you have obtained the OHNativeWindow returned by **OH_VideoEncoder_GetSurface**. The data required for encoding is continuously input through the surface. Therefore, you do not need to process the **OnNeedInputBuffer** callback function or use **OH_VideoEncoder_PushInputBuffer** to input data.
    <!--RP13--><!--RP13End-->

11. (Optional) Call **OH_VideoEncoder_PushInputParameter()** to notify the encoder that the frame-specific parameter configuration is complete.

    In step 4, you have registered the frame-specific parameter callback function.

    In the code snippet below, the following variables are used:

    - **index**: parameter passed by the callback function **OnNeedInputParameter**, which uniquely corresponds to the buffer.

    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = inQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Determine the value as required. A non-zero value requests an IDR frame.
    int32_t isIFrame = 0;
    OH_AVFormat_SetIntValue(bufferInfo->parameter, OH_MD_KEY_REQUEST_I_FRAME, isIFrame);
    int32_t ret = OH_VideoEncoder_PushInputParameter(videoEnc, bufferInfo->index);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

12. Call **OH_VideoEncoder_NotifyEndOfStream()** to notify the encoder of EOS.

    ```c++
    // In surface mode, you only need to call this API to notify the encoder of EOS.
    // In buffer mode, you need to set the AVCODEC_BUFFER_FLAGS_EOS flag and then call OH_VideoEncoder_PushInputBuffer to notify the encoder of EOS.
    int32_t ret = OH_VideoEncoder_NotifyEndOfStream(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

13. Call **OH_VideoEncoder_FreeOutputBuffer()** to release encoded frames.

    In the code snippet below, the following variables are used:

    - **index**: parameter passed by the callback function **OnNewOutputBuffer**, which uniquely corresponds to the buffer.
    - **buffer**: parameter passed by the callback function **OnNewOutputBuffer**. You can obtain the pointer to the shared memory address by calling [OH_AVBuffer_GetAddr](../../reference/apis-avcodec-kit/_core.md#oh_avbuffer_getaddr).

    <!--RP6-->
    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = outQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Obtain the encoded information.
    OH_AVCodecBufferAttr info;
    int32_t ret = OH_AVBuffer_GetBufferAttr(bufferInfo->buffer, &info);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Write the encoded frame data (specified by buffer) to the output file.
    outputFile->write(reinterpret_cast<char *>(OH_AVBuffer_GetAddr(bufferInfo->buffer)), info.size);
    // Free the output buffer. index is the index of the buffer.
    ret = OH_VideoEncoder_FreeOutputBuffer(videoEnc, bufferInfo->index);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```
    <!--RP6End-->

14. (Optional) Call **OH_VideoEncoder_Flush()** to refresh the encoder.

    After **OH_VideoEncoder_Flush** is called, the encoder remains in the Running state, but the input and output data and parameter set (such as the H.264 PPS/SPS) buffered in the encoder are cleared.

    To continue encoding, you must call **OH_VideoEncoder_Start** again.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Refresh the encoder.
    int32_t ret = OH_VideoEncoder_Flush(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    // Start encoding again.
    ret = OH_VideoEncoder_Start(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

15. (Optional) Call **OH_VideoEncoder_Reset()** to reset the encoder.

    After **OH_VideoEncoder_Reset** is called, the encoder returns to the Initialized state. To continue, you must call **OH_VideoEncoder_Configure** and **OH_VideoEncoder_Prepare** again.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Reset the encoder.
    int32_t ret = OH_VideoEncoder_Reset(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    // Reconfigure the encoder.
    OH_AVFormat *format = OH_AVFormat_Create();
    ret = OH_VideoEncoder_Configure(videoEnc, format);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVFormat_Destroy(format);
    // Prepare the encoder again.
    ret = OH_VideoEncoder_Prepare(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

16. (Optional) Call **OH_VideoEncoder_Stop()** to stop the encoder.

    After **OH_VideoEncoder_Stop** is called, the encoder retains the encoding instance and releases the input and output buffers. You can directly call **OH_VideoEncoder_Start** to continue encoding. The first buffer passed afterward must carry the parameter set and start from an IDR frame.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Stop the encoder.
    int32_t ret = OH_VideoEncoder_Stop(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    ```

17. Call **OH_VideoEncoder_Destroy()** to destroy the encoder instance and release resources.

    > **NOTE**
    >
    > This API cannot be called in the callback functions.
    >
    > After the call, you must set the encoder pointer to null to prevent program errors caused by wild pointers.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Release the nativeWindow instance.
    if (nativeWindow != nullptr) {
        OH_NativeWindow_DestroyNativeWindow(nativeWindow);
        nativeWindow = nullptr;
    }
    // Call OH_VideoEncoder_Destroy to destroy the encoder.
    int32_t ret = AV_ERR_OK;
    if (videoEnc != nullptr) {
        ret = OH_VideoEncoder_Destroy(videoEnc);
        videoEnc = nullptr;
    }
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    ```

### Buffer Input

The following walks you through how to implement the entire video encoding process in buffer mode. It uses a YUV file as input and H.264 as the encoding format.

Currently, the VideoEncoder module supports only data rotation in asynchronous mode.

1. Add the header files.

    ```c++
    #include <multimedia/player_framework/native_avcodec_videoencoder.h>
    #include <multimedia/player_framework/native_avcapability.h>
    #include <multimedia/player_framework/native_avcodec_base.h>
    #include <multimedia/player_framework/native_avformat.h>
    #include <multimedia/player_framework/native_avbuffer.h>
    #include <fstream>
    ```

2. Create an encoder instance.

    The procedure is the same as that in surface mode and is not described here.

    ```c++
    // Create an encoder by name. If your application has special requirements, for example, expecting an encoder that supports a certain resolution, you can call OH_AVCodec_GetCapability to query the capability first.
    OH_AVCapability *capability = OH_AVCodec_GetCapability(OH_AVCODEC_MIMETYPE_VIDEO_AVC, true);
    const char *codecName = OH_AVCapability_GetName(capability);
    OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByName(codecName);
    ```

    ```c++
    // Create an encoder by MIME type. Only specific codecs recommended by the system can be created in this way.
    // If multiple codecs need to be created, create hardware encoder instances first. If the hardware resources are insufficient, create software encoder instances.
    OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByMime(OH_AVCODEC_MIMETYPE_VIDEO_AVC);
    ```
643
3. Call **OH_VideoEncoder_RegisterCallback()** to register the callback functions.

    Register the **OH_AVCodecCallback** struct that defines the following callback function pointers:
    - **OH_AVCodecOnError**, a callback used to report a codec operation error. For details about the error codes, see [OH_AVCodecOnError](../../reference/apis-avcodec-kit/_codec_base.md#oh_avcodeconerror).
    - **OH_AVCodecOnStreamChanged**, a callback used to report a codec stream change, for example, a format change.
    - **OH_AVCodecOnNeedInputBuffer**, a callback used to report input data required, which means that the encoder is ready for receiving YUV/RGB data.
    - **OH_AVCodecOnNewOutputBuffer**, a callback used to report output data generated, which means that encoding is complete.

    You need to process the callback functions to ensure that the encoder runs properly.

    <!--RP2--><!--RP2End-->

    <!--RP9-->
    ```c++
    bool isFirstFrame = true;
    int32_t widthStride = 0;  // Width stride of the input buffer, obtained in the input callback.
    int32_t heightStride = 0; // Height stride of the input buffer, obtained in the input callback.
    ```
    <!--RP9End-->

    ```c++
    // Implement the OH_AVCodecOnError callback function.
    static void OnError(OH_AVCodec *codec, int32_t errorCode, void *userData)
    {
        // Process the error code in the callback.
        (void)codec;
        (void)errorCode;
        (void)userData;
    }
    ```

    ```c++
    // Implement the OH_AVCodecOnStreamChanged callback function.
    static void OnStreamChanged(OH_AVCodec *codec, OH_AVFormat *format, void *userData)
    {
        // In buffer mode, this callback function does not take effect.
        (void)codec;
        (void)format;
        (void)userData;
    }
    ```

    ```c++
    // Implement the OH_AVCodecOnNeedInputBuffer callback function.
    static void OnNeedInputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
    {
        // Obtain the video width stride and height stride.
        if (isFirstFrame) {
            OH_AVFormat *format = OH_VideoEncoder_GetInputDescription(codec);
            OH_AVFormat_GetIntValue(format, OH_MD_KEY_VIDEO_STRIDE, &widthStride);
            OH_AVFormat_GetIntValue(format, OH_MD_KEY_VIDEO_SLICE_HEIGHT, &heightStride);
            OH_AVFormat_Destroy(format);
            isFirstFrame = false;
        }
        // The data buffer of the input frame and its index are sent to inQueue.
        (void)userData;
        inQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer));
    }
    ```

    <!--RP10-->
    ```c++
    // Implement the OH_AVCodecOnNewOutputBuffer callback function.
    static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
    {
        // The data buffer of the encoded frame and its index are sent to outQueue.
        (void)codec;
        (void)userData;
        outQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer));
    }
    ```
    <!--RP10End-->

    ```c++
    // Call OH_VideoEncoder_RegisterCallback() to register the callback functions.
    OH_AVCodecCallback cb = {&OnError, &OnStreamChanged, &OnNeedInputBuffer, &OnNewOutputBuffer};
    int32_t ret = OH_VideoEncoder_RegisterCallback(videoEnc, cb, nullptr);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

    > **NOTE**
    >
    > In the callback functions, pay attention to multi-thread synchronization for operations on the data queue.
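    The snippets above reference **CodecBufferInfo**, **inQueue**, **outQueue**, and (later) **codecMutex** without defining them. They are application-side helpers rather than SDK types; the sketch below shows one possible shape for them. The names, fields, and locking scheme are assumptions for illustration, not part of the API.

    ```c++
    #include <condition_variable>
    #include <cstdint>
    #include <memory>
    #include <mutex>
    #include <queue>

    // Hypothetical helper pairing a buffer index with its buffer pointer. The
    // pointer is typed as void* only to keep this sketch self-contained; in real
    // code it would be OH_AVBuffer*.
    struct CodecBufferInfo {
        CodecBufferInfo(uint32_t idx, void *buf) : index(idx), buffer(buf) {}
        uint32_t index = 0;
        void *buffer = nullptr;
        bool isValid = true;
    };

    // Minimal blocking queue. Enqueue() runs on the codec callback thread and
    // Dequeue() on the application thread, so every access is mutex-protected.
    class BufferQueue {
    public:
        void Enqueue(std::shared_ptr<CodecBufferInfo> info)
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(info));
            cond_.notify_one();
        }

        std::shared_ptr<CodecBufferInfo> Dequeue()
        {
            std::unique_lock<std::mutex> lock(mutex_);
            cond_.wait(lock, [this] { return !queue_.empty(); });
            std::shared_ptr<CodecBufferInfo> info = queue_.front();
            queue_.pop();
            return info;
        }

        // Invalidate and drop all pending entries, for example after Flush/Stop/
        // Reset, when the system reclaims the buffers and they must no longer be
        // operated on.
        void Flush()
        {
            std::lock_guard<std::mutex> lock(mutex_);
            while (!queue_.empty()) {
                queue_.front()->isValid = false;
                queue_.pop();
            }
        }

    private:
        std::mutex mutex_;
        std::condition_variable cond_;
        std::queue<std::shared_ptr<CodecBufferInfo>> queue_;
    };
    ```

    With such helpers, `inQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer))` in the callbacks matches the snippets above, and `codecMutex` would be a separate `std::shared_mutex` guarding the encoder instance itself.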
4. Call **OH_VideoEncoder_Configure()** to configure the encoder.

    The procedure is the same as that in surface mode and is not described here.

    ```c++
    OH_AVFormat *format = OH_AVFormat_Create();
    // Set the format.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_WIDTH, width); // Mandatory.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_HEIGHT, height); // Mandatory.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_PIXEL_FORMAT, pixelFormat); // Mandatory.
    // Configure the encoder.
    int32_t ret = OH_VideoEncoder_Configure(videoEnc, format);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVFormat_Destroy(format);
    ```

5. Call **OH_VideoEncoder_Prepare()** to prepare internal resources for the encoder.

    ```c++
    ret = OH_VideoEncoder_Prepare(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

6. Call **OH_VideoEncoder_Start()** to start the encoder.

    As soon as the encoder starts, the callback functions will be triggered to respond to events. Therefore, you must configure the input file and output file first.

    ```c++
    // Configure the paths of the input and output files.
    std::string_view inputFilePath = "/*yourpath*.yuv";
    std::string_view outputFilePath = "/*yourpath*.h264";
    std::unique_ptr<std::ifstream> inputFile = std::make_unique<std::ifstream>();
    std::unique_ptr<std::ofstream> outputFile = std::make_unique<std::ofstream>();
    inputFile->open(inputFilePath.data(), std::ios::in | std::ios::binary);
    outputFile->open(outputFilePath.data(), std::ios::out | std::ios::binary | std::ios::ate);
    // Start the encoder.
    int32_t ret = OH_VideoEncoder_Start(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

7. (Optional) Dynamically configure encoder parameters during running.

   <!--RP11-->
    ```c++
    OH_AVFormat *format = OH_AVFormat_Create();
    // Dynamically request IDR frames.
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_REQUEST_I_FRAME, true);
    int32_t ret = OH_VideoEncoder_SetParameter(videoEnc, format);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVFormat_Destroy(format);
    ```
    <!--RP11End-->

8. Call **OH_VideoEncoder_PushInputBuffer()** to push the image to the input queue for encoding.

    In the code snippet below, the following variables are used:

    - **buffer**: parameter passed by the callback function **OnNeedInputBuffer**. You can obtain the pointer to the shared memory address by calling [OH_AVBuffer_GetAddr](../../reference/apis-avcodec-kit/_core.md#oh_avbuffer_getaddr).
    - **index**: parameter passed by the callback function **OnNeedInputBuffer**, which uniquely corresponds to the buffer.
    - **widthStride**: stride of the obtained buffer data.

    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = inQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Write image data.
    int32_t frameSize = 0;
    if (widthStride == width) {
        // Process the file stream and obtain the frame length, and then write the data to encode to the buffer of the specified index.
        frameSize = width * height * 3 / 2; // Formula for calculating the data size of each frame in NV12 pixel format.
        inputFile->read(reinterpret_cast<char *>(OH_AVBuffer_GetAddr(bufferInfo->buffer)), frameSize);
    } else {
        // If the stride is not equal to the width, perform an offset based on the stride. For details, see the example below.
    }
    // Configure the buffer information.
    OH_AVCodecBufferAttr info;
    info.size = frameSize;
    info.offset = 0;
    info.pts = 0;
    info.flags = AVCODEC_BUFFER_FLAGS_NONE;
    int32_t ret = OH_AVBuffer_SetBufferAttr(bufferInfo->buffer, &info);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Configure the buffer frame-specific information.
    int32_t isIFrame = 0; // Set to 1 to request that this frame be encoded as an I-frame. You can determine the value.
    OH_AVFormat *parameter = OH_AVBuffer_GetParameter(bufferInfo->buffer);
    OH_AVFormat_SetIntValue(parameter, OH_MD_KEY_REQUEST_I_FRAME, isIFrame);
    ret = OH_AVBuffer_SetParameter(bufferInfo->buffer, parameter);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVFormat_Destroy(parameter);
    // Send the data to the input buffer for encoding. index is the index of the buffer.
    ret = OH_VideoEncoder_PushInputBuffer(videoEnc, bufferInfo->index);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

    When the stride is not equal to the width, copy the image line by line with an offset based on the stride. The following uses an NV12 image as an example, presenting the image layout of **width**, **height**, **wStride**, and **hStride**.

    - **OH_MD_KEY_WIDTH** corresponds to **width**.
    - **OH_MD_KEY_HEIGHT** corresponds to **height**.
    - **OH_MD_KEY_VIDEO_STRIDE** corresponds to **wStride**.
    - **OH_MD_KEY_VIDEO_SLICE_HEIGHT** corresponds to **hStride**.

    ![copy by line](figures/copy-by-line-encoder.png)

    Add the header file.

    ```c++
    #include <string.h>
    ```

    The following is the sample code:

    ```c++
    struct Rect // Width and height of the source buffer. You can set them as required.
    {
        int32_t width;
        int32_t height;
    };

    struct DstRect // Width stride and height stride of the destination buffer. They are obtained by calling OH_VideoEncoder_GetInputDescription.
    {
        int32_t wStride;
        int32_t hStride;
    };

    struct SrcRect // Width stride and height stride of the source buffer. You can set them as required.
    {
        int32_t wStride;
        int32_t hStride;
    };

    Rect rect = {320, 240};
    DstRect dstRect = {320, 256};
    SrcRect srcRect = {320, 250};
    uint8_t* dst = new uint8_t[dstRect.hStride * dstRect.wStride * 3 / 2]; // Pointer to the destination memory area.
    uint8_t* src = new uint8_t[srcRect.hStride * srcRect.wStride * 3 / 2]; // Pointer to the source memory area.
    uint8_t* dstTemp = dst;
    uint8_t* srcTemp = src;

    // Y plane: copy the source Y data to the destination, row by row.
    for (int32_t i = 0; i < rect.height; ++i) {
        // Copy one row of source data to one row of the destination.
        memcpy(dstTemp, srcTemp, rect.width);
        // Advance both pointers by one wStride each to move to the next row.
        dstTemp += dstRect.wStride;
        srcTemp += srcRect.wStride;
    }
    // Skip the padding rows between the Y plane and the UV plane.
    dstTemp += (dstRect.hStride - rect.height) * dstRect.wStride;
    srcTemp += (srcRect.hStride - rect.height) * srcRect.wStride;
    rect.height >>= 1; // The UV plane is half the height of the Y plane.
    // UV plane: copy the interleaved UV data to the destination, row by row.
    for (int32_t i = 0; i < rect.height; ++i) {
        memcpy(dstTemp, srcTemp, rect.width);
        dstTemp += dstRect.wStride;
        srcTemp += srcRect.wStride;
    }

    delete[] dst;
    dst = nullptr;
    delete[] src;
    src = nullptr;
    ```

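    As a sanity check on the size formula used above, the hypothetical dimensions from the sample (a 320 x 240 image, destination buffer with a 320-byte **wStride** and 256-row **hStride**) give the following contiguous and padded NV12 buffer sizes:

    ```c++
    #include <cstdint>

    // NV12 stores a full-resolution Y plane followed by an interleaved UV plane
    // at half the vertical resolution, which is where the 3/2 factor in this
    // guide comes from.
    int32_t Nv12Size(int32_t width, int32_t height)
    {
        return width * height * 3 / 2;
    }

    // Tightly packed frame as read from the .yuv file: Nv12Size(320, 240) == 115200.
    // Padded destination buffer (wStride x hStride):   Nv12Size(320, 256) == 122880.
    ```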
    When processing buffer data during hardware encoding (that is, before pushing the data), you must copy the image data, aligned to the required width and height, into the AVBuffer provided by the input callback. Take the image width, height, stride, and pixel format into account to ensure that the data to encode is laid out correctly. For details on obtaining the stride, see step 3 in [Buffer Input](#buffer-input).

9. Notify the encoder of EOS.

    In the code snippet below, the following variables are used:

    - **index**: parameter passed by the callback function **OnNeedInputBuffer**, which uniquely corresponds to the buffer.
    - **buffer**: parameter passed by the callback function **OnNeedInputBuffer**. You can obtain the pointer to the shared memory address by calling [OH_AVBuffer_GetAddr](../../reference/apis-avcodec-kit/_core.md#oh_avbuffer_getaddr).

    The API **OH_VideoEncoder_PushInputBuffer** is used to notify the encoder of EOS. This API is also used in step 8 to push the stream to the input queue for encoding. Therefore, in the current step, you must pass in the **AVCODEC_BUFFER_FLAGS_EOS** flag.

    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = inQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    OH_AVCodecBufferAttr info;
    info.size = 0;
    info.offset = 0;
    info.pts = 0;
    info.flags = AVCODEC_BUFFER_FLAGS_EOS;
    int32_t ret = OH_AVBuffer_SetBufferAttr(bufferInfo->buffer, &info);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ret = OH_VideoEncoder_PushInputBuffer(videoEnc, bufferInfo->index);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

10. Call **OH_VideoEncoder_FreeOutputBuffer()** to release encoded frames.

    The procedure is the same as that in surface mode and is not described here.

    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = outQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Obtain the encoded information.
    OH_AVCodecBufferAttr info;
    int32_t ret = OH_AVBuffer_GetBufferAttr(bufferInfo->buffer, &info);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Write the encoded frame data (specified by buffer) to the output file.
    outputFile->write(reinterpret_cast<char *>(OH_AVBuffer_GetAddr(bufferInfo->buffer)), info.size);
    // Free the output buffer. index is the index of the buffer.
    ret = OH_VideoEncoder_FreeOutputBuffer(videoEnc, bufferInfo->index);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

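    For H.264, the frames written above typically form an Annex-B byte stream in which each NAL unit is preceded by a 00 00 01 or 00 00 00 01 start code. A small helper like the following (an illustrative assumption, not something the API guarantees for every codec) can serve as a quick sanity check on the output data:

    ```c++
    #include <cstddef>
    #include <cstdint>

    // Returns true if the buffer begins with an Annex-B start code (00 00 01 or
    // 00 00 00 01). Useful only as a rough check on the bytes written to the
    // output .h264 file.
    bool StartsWithStartCode(const uint8_t *data, size_t size)
    {
        if (data == nullptr) {
            return false;
        }
        if (size >= 3 && data[0] == 0x00 && data[1] == 0x00 && data[2] == 0x01) {
            return true;
        }
        if (size >= 4 && data[0] == 0x00 && data[1] == 0x00 && data[2] == 0x00 && data[3] == 0x01) {
            return true;
        }
        return false;
    }
    ```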
The subsequent processes (including refreshing, resetting, stopping, and destroying the encoder) are the same as those in surface mode. For details, see steps 14–17 in [Surface Input](#surface-input).