# Video Encoding

<!--Kit: AVCodec Kit-->
<!--Subsystem: Multimedia-->
<!--Owner: @zhanghongran-->
<!--Designer: @dpy2650--->
<!--Tester: @cyakee-->
<!--Adviser: @zengyawen-->

You can call native APIs to perform video encoding, which compresses video data into a video stream.

<!--RP3--><!--RP3End-->

For details about the supported encoding capabilities, see [AVCodec Supported Formats](avcodec-support-formats.md#video-encoding).

<!--RP1--><!--RP1End-->

The following table lists the supported video encoding capabilities:

<!--RP4-->
| Capability | How to Use |
| --------------------------------------- | ---------------------------------------------------------------------------------- |
| Layered encoding<br> Setting the LTR frame and reference frame | For details, see [Temporally Scalable Video Coding](video-encoding-temporal-scalability.md). |
| Repeat encoding of historical frames | For details, see [OH_MD_KEY_VIDEO_ENCODER_REPEAT_PREVIOUS_FRAME_AFTER](../../reference/apis-avcodec-kit/_codec_base.md#oh_md_key_video_encoder_repeat_previous_frame_after) and<br> [OH_MD_KEY_VIDEO_ENCODER_REPEAT_PREVIOUS_MAX_COUNT](../../reference/apis-avcodec-kit/_codec_base.md#oh_md_key_video_encoder_repeat_previous_max_count). |
<!--RP4End-->

## Constraints

- The buffer mode does not support 10-bit image data.
- Due to limited hardware encoder resources, you must call **OH_VideoEncoder_Destroy** to destroy every encoder instance when it is no longer needed.
- If **flush()**, **reset()**, **stop()**, or **destroy()** is executed in a non-callback thread, the execution result is returned after all callbacks are executed.
- Once **Flush**, **Reset**, or **Stop** is called, the system reclaims the OH_AVBuffer. Therefore, do not continue to operate the OH_AVBuffer obtained through the previous callback function.
- The buffer mode and surface mode use the same APIs. Therefore, the surface mode is described as an example.
- In buffer mode, after obtaining the pointer to an OH_AVBuffer instance through the callback function **OH_AVCodecOnNeedInputBuffer**, call **OH_VideoEncoder_PushInputBuffer** to notify the system that the buffer has been fully utilized. In this way, the system will proceed with encoding the data contained in the buffer. If the OH_NativeBuffer instance is obtained through **OH_AVBuffer_GetNativeBuffer** and its lifecycle extends beyond that of the OH_AVBuffer pointer instance, you must duplicate the data. In this case, you should manage the lifecycle of the newly generated OH_NativeBuffer object to ensure that the object can be correctly used and released.
<!--RP14--><!--RP14End-->

## Surface Input and Buffer Input

- Surface input and buffer input differ in data sources.

- They are applicable to different scenarios.
  - Surface input indicates that the OHNativeWindow is used to transfer passed-in data. It supports connection with other modules, such as the camera module.
  - Buffer input refers to a pre-allocated memory area. You need to copy original data to this memory area. It is more applicable to scenarios such as reading video data from files.

- The two also differ slightly in the API calling modes:
  - In buffer mode, call **OH_VideoEncoder_PushInputBuffer** to input data. In surface mode, before the encoder is ready, call **OH_VideoEncoder_GetSurface** to obtain the OHNativeWindow for video data transmission.
  - In buffer mode, you can use **attr** in **OH_AVBuffer** to pass in the End of Stream (EOS) flag, and the encoder stops when it reads the last frame. In surface mode, call **OH_VideoEncoder_NotifyEndOfStream** to notify the encoder of EOS.

- Data transfer performance in surface mode is better than that in buffer mode.

For details about the development procedure, see [Surface Mode](#surface-mode) and [Buffer Mode](#buffer-mode).
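
The EOS difference described above can be sketched as follows. This is a minimal illustration, not part of the full procedures below: it assumes a running encoder instance (**videoEnc**) and, for buffer mode, a **buffer** pointer and **index** obtained from the **OH_AVCodecOnNeedInputBuffer** callback; the zero-filled **OH_AVCodecBufferAttr** initializer and the use of **OH_AVBuffer_SetBufferAttr** to write the flag are illustrative.

```c++
// Surface mode: after the last frame has been produced to the surface,
// notify the encoder of EOS directly.
OH_AVErrCode notifyRet = OH_VideoEncoder_NotifyEndOfStream(videoEnc);
if (notifyRet != AV_ERR_OK) {
    // Handle exceptions.
}

// Buffer mode: mark the last input buffer with the EOS flag and push it.
OH_AVCodecBufferAttr attr = {0, 0, 0, AVCODEC_BUFFER_FLAGS_EOS}; // pts, size, offset, flags.
OH_AVErrCode setAttrRet = OH_AVBuffer_SetBufferAttr(buffer, &attr); // buffer: from OH_AVCodecOnNeedInputBuffer.
if (setAttrRet != AV_ERR_OK) {
    // Handle exceptions.
}
OH_AVErrCode pushRet = OH_VideoEncoder_PushInputBuffer(videoEnc, index); // index: from the same callback.
if (pushRet != AV_ERR_OK) {
    // Handle exceptions.
}
```

In both modes, the encoder continues to output the frames already queued inside it after EOS is signaled; only new input is rejected.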

## State Machine Interaction

The following figure shows the interaction between states.

1. An encoder enters the Initialized state in either of the following ways:
   - When an encoder instance is initially created, the encoder enters the Initialized state.
   - When **OH_VideoEncoder_Reset** is called in any state, the encoder returns to the Initialized state.

2. When the encoder is in the Initialized state, you can call **OH_VideoEncoder_Configure** to configure the encoder. After the configuration, the encoder enters the Configured state.
3. When the encoder is in the Configured state, you can call **OH_VideoEncoder_Prepare()** to switch it to the Prepared state.
4. When the encoder is in the Prepared state, you can call **OH_VideoEncoder_Start** to switch it to the Executing state.
   - When the encoder is in the Executing state, you can call **OH_VideoEncoder_Stop** to switch it back to the Prepared state.

5. In rare cases, the encoder may encounter an error and enter the Error state. If this is the case, an invalid value can be returned or an exception can be thrown through a queue operation.
   - When the encoder is in the Error state, you can either call **OH_VideoEncoder_Reset** to switch it to the Initialized state or call **OH_VideoEncoder_Destroy** to switch it to the Released state.

6. The Executing state has three substates: Flushed, Running, and End-of-Stream.
   - After **OH_VideoEncoder_Start** is called, the encoder enters the Running substate immediately.
   - When the encoder is in the Executing state, you can call **OH_VideoEncoder_Flush** to switch it to the Flushed substate.
   - After all data to be processed is transferred to the encoder, the [AVCODEC_BUFFER_FLAGS_EOS](../../reference/apis-avcodec-kit/_core.md#oh_avcodecbufferflags-1) flag is added to the last input buffer in the input buffer queue. Once this flag is detected, the encoder transitions to the End-of-Stream substate.
     In this state, the encoder does not accept new inputs, but continues to generate outputs until it reaches the tail frame.

7. When the encoder is no longer needed, you must call **OH_VideoEncoder_Destroy** to destroy the encoder instance, which then transitions to the Released state.

## How to Develop

Read [VideoEncoder](../../reference/apis-avcodec-kit/_video_encoder.md) for the API reference.

The figure below shows the call relationship of video encoding.

- The dotted line indicates an optional operation.

- The solid line indicates a mandatory operation.

### Linking the Dynamic Libraries in the CMake Script

```cmake
target_link_libraries(sample PUBLIC libnative_media_codecbase.so)
target_link_libraries(sample PUBLIC libnative_media_core.so)
target_link_libraries(sample PUBLIC libnative_media_venc.so)
```

> **NOTE**
>
> The word **sample** in the preceding code snippet is only an example. Use the actual project directory name.

### Defining the Basic Structure

The sample code provided in this section adheres to the C++17 standard and is for reference only. You can define your own buffer objects by referring to it.

1. Add the header files.

   ```c++
   #include <chrono>
   #include <condition_variable>
   #include <memory>
   #include <mutex>
   #include <queue>
   #include <shared_mutex>
   ```

2. Define the information about the encoder callback buffer.

   ```c++
   struct CodecBufferInfo {
       CodecBufferInfo(uint32_t index, OH_AVBuffer *buffer): index(index), buffer(buffer), isValid(true) {}
       CodecBufferInfo(uint32_t index, OH_AVFormat *parameter): index(index), parameter(parameter), isValid(true) {}
       // Callback buffer.
       OH_AVBuffer *buffer = nullptr;
       // In surface mode, the frame-specific parameter passed by the callback, which can be used only after the frame-specific parameter callback function is registered.
       OH_AVFormat *parameter = nullptr;
       // Index of the callback buffer.
       uint32_t index = 0;
       // Whether the current buffer information is valid.
       bool isValid = true;
   };
   ```

3. Define the input and output queues for encoding.

   ```c++
   class CodecBufferQueue {
   public:
       // Pass the callback buffer information to the queue.
       void Enqueue(const std::shared_ptr<CodecBufferInfo> bufferInfo)
       {
           std::unique_lock<std::mutex> lock(mutex_);
           bufferQueue_.push(bufferInfo);
           cond_.notify_all();
       }

       // Obtain the information about the callback buffer.
       std::shared_ptr<CodecBufferInfo> Dequeue(int32_t timeoutMs = 1000)
       {
           std::unique_lock<std::mutex> lock(mutex_);
           (void)cond_.wait_for(lock, std::chrono::milliseconds(timeoutMs), [this]() { return !bufferQueue_.empty(); });
           if (bufferQueue_.empty()) {
               return nullptr;
           }
           std::shared_ptr<CodecBufferInfo> bufferInfo = bufferQueue_.front();
           bufferQueue_.pop();
           return bufferInfo;
       }

       // Clear the queue. The previous callback buffers become unavailable.
       void Flush()
       {
           std::unique_lock<std::mutex> lock(mutex_);
           while (!bufferQueue_.empty()) {
               std::shared_ptr<CodecBufferInfo> bufferInfo = bufferQueue_.front();
               // After the flush, stop, reset, or destroy operation is performed, the previous callback buffer information becomes invalid.
               bufferInfo->isValid = false;
               bufferQueue_.pop();
           }
       }

   private:
       std::mutex mutex_;
       std::condition_variable cond_;
       std::queue<std::shared_ptr<CodecBufferInfo>> bufferQueue_;
   };
   ```

4. Configure global variables.

   These global variables are for reference only. They can be encapsulated into an object based on service requirements.

   ```c++
   // Video frame width.
   int32_t width = 320;
   // Video frame height.
   int32_t height = 240;
   // Video pixel format.
   OH_AVPixelFormat pixelFormat = AV_PIXEL_FORMAT_NV12;
   // Video width stride.
   int32_t widthStride = 0;
   // Video height stride.
   int32_t heightStride = 0;
   // Pointer to the encoder instance.
   OH_AVCodec *videoEnc = nullptr;
   // Encoder synchronization lock.
   std::shared_mutex codecMutex;
   // Encoder input queue.
   CodecBufferQueue inQueue;
   // Encoder output queue.
   CodecBufferQueue outQueue;
   ```

### Surface Mode

The following walks you through how to implement the entire video encoding process in surface mode and implement data rotation in asynchronous mode. In this example, surface data is input and encoded into an H.264 stream.

1. Add the header files.

   ```c++
   #include <multimedia/player_framework/native_avcodec_videoencoder.h>
   #include <multimedia/player_framework/native_avcapability.h>
   #include <multimedia/player_framework/native_avcodec_base.h>
   #include <multimedia/player_framework/native_avformat.h>
   #include <multimedia/player_framework/native_avbuffer.h>
   #include <fstream>
   ```

2. Create an encoder instance.

   You can create an encoder by name or MIME type. In the code snippet below, the following variables are used:

   - **videoEnc**: pointer to the video encoder instance.
   - **capability**: pointer to the encoder's capability.
   - **OH_AVCODEC_MIMETYPE_VIDEO_AVC**: AVC video codec.

   The following is an example:

   ```c++
   // Create an encoder by name. If your application has special requirements, for example, expecting an encoder that supports a certain resolution, you can call OH_AVCodec_GetCapability to query the capability first.
   OH_AVCapability *capability = OH_AVCodec_GetCapability(OH_AVCODEC_MIMETYPE_VIDEO_AVC, true);
   // To create a hardware encoder instance explicitly, query the capability by category instead:
   // OH_AVCapability *capability = OH_AVCodec_GetCapabilityByCategory(OH_AVCODEC_MIMETYPE_VIDEO_AVC, true, HARDWARE);
   const char *codecName = OH_AVCapability_GetName(capability);
   OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByName(codecName);
   ```

   ```c++
   // Create an encoder by MIME type. Only specific codecs recommended by the system can be created in this way.
   // Only hardware encoders can be created.
   OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByMime(OH_AVCODEC_MIMETYPE_VIDEO_AVC);
   ```

3. Call **OH_VideoEncoder_RegisterCallback()** to register the callback functions.

   Register the **OH_AVCodecCallback** struct that defines the following callback function pointers:

   - **OH_AVCodecOnError**, a callback used to report a codec operation error. For details about the error codes, see [OH_AVCodecOnError](../../reference/apis-avcodec-kit/_codec_base.md#oh_avcodeconerror).
   - **OH_AVCodecOnStreamChanged**, a callback used to report a codec stream change, for example, a format change.
   - **OH_AVCodecOnNeedInputBuffer**, a callback used to report that input data is required. In surface mode, this callback does not take effect, because you input data through the obtained surface.
   - **OH_AVCodecOnNewOutputBuffer**, a callback used to report that output data has been generated, which means that encoding is complete.

   <!--RP2--><!--RP2End-->

   The following is an example:

   ```c++
   // Set the OH_AVCodecOnError callback function, which is used to report a codec operation error.
   static void OnError(OH_AVCodec *codec, int32_t errorCode, void *userData)
   {
       // Process the error code in the callback.
       (void)codec;
       (void)errorCode;
       (void)userData;
   }
   ```

   <!--RP12-->
   ```c++
   // Set the OH_AVCodecOnStreamChanged callback function, which is used to report an encoding stream change.
   static void OnStreamChanged(OH_AVCodec *codec, OH_AVFormat *format, void *userData)
   {
       // In surface mode, this callback function is triggered when the surface resolution changes.
       (void)codec;
       (void)userData;
       bool ret = OH_AVFormat_GetIntValue(format, OH_MD_KEY_VIDEO_WIDTH, &width) &&
                  OH_AVFormat_GetIntValue(format, OH_MD_KEY_VIDEO_HEIGHT, &height);
       if (!ret) {
           // Handle exceptions.
       }
   }
   ```
   <!--RP12End-->

   ```c++
   // Set the OH_AVCodecOnNeedInputBuffer callback function, which is used to send an input frame to the data queue.
   static void OnNeedInputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
   {
       // In surface mode, this callback function does not take effect. Data is input through the obtained surface.
       (void)codec;
       (void)userData;
       (void)index;
       (void)buffer;
   }
   ```

   ```c++
   // Set the OH_AVCodecOnNewOutputBuffer callback function, which is used to send an encoded frame to the output queue.
   static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
   {
       // The data buffer of the finished frame and its index are sent to outQueue.
       (void)codec;
       (void)userData;
       outQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer));
   }
   ```

   ```c++
   // Call OH_VideoEncoder_RegisterCallback() to register the callback functions.
   OH_AVCodecCallback cb = {&OnError, &OnStreamChanged, &OnNeedInputBuffer, &OnNewOutputBuffer};
   OH_AVErrCode ret = OH_VideoEncoder_RegisterCallback(videoEnc, cb, nullptr); // nullptr: userData on which the callback depends is empty.
   if (ret != AV_ERR_OK) {
       // Handle exceptions.
   }
   ```

   > **NOTE**
   >
   > In the callback functions, pay attention to multi-thread synchronization for operations on the data queue.

4. (Optional) Call **OH_VideoEncoder_RegisterParameterCallback()** to register the frame-specific parameter callback function.

   For details, see [Temporally Scalable Video Coding](video-encoding-temporal-scalability.md).

   <!--RP7-->
   ```c++
   // 4.1 Implement the OH_VideoEncoder_OnNeedInputParameter callback function.
   static void OnNeedInputParameter(OH_AVCodec *codec, uint32_t index, OH_AVFormat *parameter, void *userData)
   {
       // The frame-specific parameter of the input frame and its index are sent to inQueue.
       (void)codec;
       (void)userData;
       inQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, parameter));
   }

   // 4.2 Register the frame-specific parameter callback function.
   OH_VideoEncoder_OnNeedInputParameter inParaCb = OnNeedInputParameter;
   OH_VideoEncoder_RegisterParameterCallback(videoEnc, inParaCb, nullptr); // nullptr: userData on which the callback depends is empty.
   ```
   <!--RP7End-->

5. Call **OH_VideoEncoder_Configure()** to configure the encoder.

   For details about the configurable options, see [Video Dedicated Key-Value Pairs](../../reference/apis-avcodec-kit/_codec_base.md#media-data-key-value-pairs).

   For details about the parameter verification rules, see [OH_VideoEncoder_Configure()](../../reference/apis-avcodec-kit/_video_encoder.md#oh_videoencoder_configure).

   The parameter value ranges can be obtained through the capability query interface. For details, see [Obtaining Supported Codecs](obtain-supported-codecs.md).

   Currently, the following options must be configured for all supported formats: video frame width, video frame height, and video pixel format.

   ```c++
   // Configure the video frame rate.
   double frameRate = 30.0;
   // Configure the video YUV range flag.
   bool rangeFlag = false;
   // Configure the video primary color.
   int32_t primary = static_cast<int32_t>(OH_ColorPrimary::COLOR_PRIMARY_BT709);
   // Configure the transfer characteristics.
   int32_t transfer = static_cast<int32_t>(OH_TransferCharacteristic::TRANSFER_CHARACTERISTIC_BT709);
   // Configure the matrix coefficients.
   int32_t matrix = static_cast<int32_t>(OH_MatrixCoefficient::MATRIX_COEFFICIENT_IDENTITY);
   // Configure the encoding profile.
   int32_t profile = static_cast<int32_t>(OH_AVCProfile::AVC_PROFILE_HIGH);
   // Configure the encoding bit rate mode.
   int32_t rateMode = static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_VBR);
   // Configure the key frame interval, in milliseconds.
   int32_t iFrameInterval = 1000;
   // Configure the SQR factor.
   int32_t sqrFactor = 30;
   // Configure the maximum bit rate, in bit/s.
   int64_t maxBitRate = 20000000;
   // Configure the bit rate, in bit/s.
   int64_t bitRate = 5000000;
   // Set the encoding quality.
   int32_t quality = 90;

   auto format = std::shared_ptr<OH_AVFormat>(OH_AVFormat_Create(), OH_AVFormat_Destroy);
   if (format == nullptr) {
       // Handle exceptions.
   }
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_WIDTH, width); // Mandatory.
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_HEIGHT, height); // Mandatory.
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_PIXEL_FORMAT, pixelFormat); // Mandatory.

   OH_AVFormat_SetDoubleValue(format.get(), OH_MD_KEY_FRAME_RATE, frameRate);
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_RANGE_FLAG, rangeFlag);
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_COLOR_PRIMARIES, primary);
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_TRANSFER_CHARACTERISTICS, transfer);
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_MATRIX_COEFFICIENTS, matrix);
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_I_FRAME_INTERVAL, iFrameInterval);
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_PROFILE, profile);
   // Configure OH_MD_KEY_QUALITY only when OH_BitrateMode = BITRATE_MODE_CQ is used.
   if (rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_CQ)) {
       OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_QUALITY, quality);
   } else if (rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_SQR)) {
       // Configure OH_MD_KEY_MAX_BITRATE and OH_MD_KEY_SQR_FACTOR only when OH_BitrateMode is set to BITRATE_MODE_SQR.
       OH_AVFormat_SetLongValue(format.get(), OH_MD_KEY_MAX_BITRATE, maxBitRate);
       OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_SQR_FACTOR, sqrFactor);
   } else if (rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_CBR) ||
              rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_VBR)) {
       OH_AVFormat_SetLongValue(format.get(), OH_MD_KEY_BITRATE, bitRate);
   }
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_VIDEO_ENCODE_BITRATE_MODE, rateMode);
   OH_AVErrCode ret = OH_VideoEncoder_Configure(videoEnc, format.get());
   if (ret != AV_ERR_OK) {
       // Handle exceptions.
   }
   ```

   > **NOTE**
   >
   > If an optional parameter is incorrectly configured, the error code **AV_ERR_INVALID_VAL** is returned. However, **OH_VideoEncoder_Configure()** does not fail. Instead, execution continues with the default value.

6. Obtain a surface.

   Obtain the OHNativeWindow in surface mode. The surface must be obtained before **OH_VideoEncoder_Prepare** is called.

   ```c++
   // Obtain the surface used for data input.
   OHNativeWindow *nativeWindow;
   OH_AVErrCode ret = OH_VideoEncoder_GetSurface(videoEnc, &nativeWindow);
   if (ret != AV_ERR_OK) {
       // Handle exceptions.
   }
   // Use the OHNativeWindow* variable to obtain the address of the data to be filled through the producer interface.
   ```

   For details about how to use the OHNativeWindow* variable type, see [OHNativeWindow](../../reference/apis-arkgraphics2d/capi-nativewindow.md).

7. Call **OH_VideoEncoder_Prepare()** to prepare internal resources for the encoder.

   ```c++
   OH_AVErrCode ret = OH_VideoEncoder_Prepare(videoEnc);
   if (ret != AV_ERR_OK) {
       // Handle exceptions.
   }
   ```

8. Call **OH_VideoEncoder_Start()** to start the encoder.

   ```c++
   // Configure the path of the output file.
   std::string_view outputFilePath = "/*yourpath*.h264";
   std::unique_ptr<std::ofstream> outputFile = std::make_unique<std::ofstream>();
   if (outputFile != nullptr) {
       outputFile->open(outputFilePath.data(), std::ios::out | std::ios::binary | std::ios::ate);
   }
   // Start the encoder.
   OH_AVErrCode ret = OH_VideoEncoder_Start(videoEnc);
   if (ret != AV_ERR_OK) {
       // Handle exceptions.
   }
   ```

9. (Optional) Call **OH_VideoEncoder_SetParameter()** to dynamically configure encoder parameters during running.

   <!--RP8-->
   ```c++
   auto format = std::shared_ptr<OH_AVFormat>(OH_AVFormat_Create(), OH_AVFormat_Destroy);
   if (format == nullptr) {
       // Handle exceptions.
   }
   // Dynamically request IDR frames.
   OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_REQUEST_I_FRAME, true);
   // SQR supports dynamic configuration of the maximum bit rate and SQR factor.
   if (rateMode == static_cast<int32_t>(OH_BitrateMode::BITRATE_MODE_SQR)) {
       int32_t sqrFactor = 25; // SQR factor.
       int64_t maxBitRate = 10000000; // Maximum bit rate, in bit/s.
       OH_AVFormat_SetLongValue(format.get(), OH_MD_KEY_MAX_BITRATE, maxBitRate);
       OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_SQR_FACTOR, sqrFactor);
   }
   OH_AVErrCode ret = OH_VideoEncoder_SetParameter(videoEnc, format.get());
   if (ret != AV_ERR_OK) {
       // Handle exceptions.
   }
   ```
   <!--RP8End-->

10. Write the image to encode.

    In step 6, you have obtained the **OHNativeWindow*** variable returned by **OH_VideoEncoder_GetSurface**. The data required for encoding is continuously input by the surface.
    Therefore, you do not need to process the **OnNeedInputBuffer** callback function or use **OH_VideoEncoder_PushInputBuffer** to input data.
    <!--RP13--><!--RP13End-->

11. (Optional) Call **OH_VideoEncoder_PushInputParameter()** to notify the encoder that the frame-specific parameter configuration is complete.

    In step 4, you have registered the frame-specific parameter callback function.

    In the code snippet below, the following variables are used:

    - **index**: parameter passed by the callback function **OnNeedInputParameter**, which uniquely corresponds to the buffer.

    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = inQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Set the value as required. For example, set it to 1 to request an IDR frame for this frame.
    int32_t isIFrame = 0;
    OH_AVFormat_SetIntValue(bufferInfo->parameter, OH_MD_KEY_REQUEST_I_FRAME, isIFrame);
    OH_AVErrCode ret = OH_VideoEncoder_PushInputParameter(videoEnc, bufferInfo->index);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

12. Call **OH_VideoEncoder_NotifyEndOfStream()** to notify the encoder of EOS.

    ```c++
    // In surface mode, you only need to call this API to notify the encoder of EOS.
    // In buffer mode, you need to set the AVCODEC_BUFFER_FLAGS_EOS flag and then call OH_VideoEncoder_PushInputBuffer to notify the encoder of EOS.
    OH_AVErrCode ret = OH_VideoEncoder_NotifyEndOfStream(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

13. Call **OH_VideoEncoder_FreeOutputBuffer()** to release encoded frames.

    In the following example, the member variables of **bufferInfo** are as follows:

    - **index**: parameter passed by the callback function **OnNewOutputBuffer**, which uniquely corresponds to the buffer.
    - **buffer**: parameter passed by the callback function **OnNewOutputBuffer**.
      You can obtain the pointer to the shared memory address by calling [OH_AVBuffer_GetAddr](../../reference/apis-avcodec-kit/_core.md#oh_avbuffer_getaddr).
    - **isValid**: whether the buffer instance stored in **bufferInfo** is valid.

    <!--RP6-->
    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = outQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Obtain the encoded frame information.
    OH_AVCodecBufferAttr info;
    OH_AVErrCode getBufferRet = OH_AVBuffer_GetBufferAttr(bufferInfo->buffer, &info);
    if (getBufferRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Write the encoded frame data (specified by buffer) to the output file.
    uint8_t *addr = OH_AVBuffer_GetAddr(bufferInfo->buffer);
    if (addr == nullptr) {
        // Handle exceptions.
    }
    if (outputFile != nullptr && outputFile->is_open()) {
        outputFile->write(reinterpret_cast<char *>(addr), info.size);
    }
    // Free the output buffer. index is the index of the buffer.
    OH_AVErrCode freeOutputRet = OH_VideoEncoder_FreeOutputBuffer(videoEnc, bufferInfo->index);
    if (freeOutputRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```
    <!--RP6End-->

14. (Optional) Call **OH_VideoEncoder_Flush()** to refresh the encoder.

    After **OH_VideoEncoder_Flush** is called, the encoder remains in the Running state, but the input and output data and parameter sets (such as the H.264 SPS/PPS) buffered in the encoder are cleared.

    To continue encoding, you must call **OH_VideoEncoder_Start** again.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Refresh the encoder.
    OH_AVErrCode flushRet = OH_VideoEncoder_Flush(videoEnc);
    if (flushRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    // Start encoding again.
    OH_AVErrCode startRet = OH_VideoEncoder_Start(videoEnc);
    if (startRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

15. (Optional) Call **OH_VideoEncoder_Reset()** to reset the encoder.

    After **OH_VideoEncoder_Reset** is called, the encoder returns to the Initialized state. To continue encoding, you must call **OH_VideoEncoder_Configure** and then **OH_VideoEncoder_Prepare**.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Reset the encoder.
    OH_AVErrCode resetRet = OH_VideoEncoder_Reset(videoEnc);
    if (resetRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    // Reconfigure the encoder.
    auto format = std::shared_ptr<OH_AVFormat>(OH_AVFormat_Create(), OH_AVFormat_Destroy);
    if (format == nullptr) {
        // Handle exceptions.
    }
    OH_AVErrCode configRet = OH_VideoEncoder_Configure(videoEnc, format.get());
    if (configRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    // The encoder is ready again.
    OH_AVErrCode prepareRet = OH_VideoEncoder_Prepare(videoEnc);
    if (prepareRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

16. (Optional) Call **OH_VideoEncoder_Stop()** to stop the encoder.

    After **OH_VideoEncoder_Stop** is called, the encoder retains the encoding instance and releases the input and output buffers. You can directly call **OH_VideoEncoder_Start** to continue encoding. The first **buffer** passed after the restart must carry the parameter set, starting from the IDR frame.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Stop the encoder.
    OH_AVErrCode ret = OH_VideoEncoder_Stop(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    ```

17. Call **OH_VideoEncoder_Destroy()** to destroy the encoder instance and release resources.

    > **NOTE**
    >
    > This API cannot be called in the callback function.
    >
    > After the call, you must set the encoder pointer to null to prevent program errors caused by wild pointers.

    ```c++
    std::unique_lock<std::shared_mutex> lock(codecMutex);
    // Release the nativeWindow instance.
    if (nativeWindow != nullptr) {
        OH_NativeWindow_DestroyNativeWindow(nativeWindow);
        nativeWindow = nullptr;
    }
    // Call OH_VideoEncoder_Destroy to destroy the encoder.
    OH_AVErrCode ret = AV_ERR_OK;
    if (videoEnc != nullptr) {
        ret = OH_VideoEncoder_Destroy(videoEnc);
        videoEnc = nullptr;
    }
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    inQueue.Flush();
    outQueue.Flush();
    ```

### Buffer Mode

The following walks you through how to implement the entire video encoding process in buffer mode and implement data rotation in asynchronous mode. It uses YUV file input and the H.264 encoding format as an example.

1. Add the header files.

   ```c++
   #include <multimedia/player_framework/native_avcodec_videoencoder.h>
   #include <multimedia/player_framework/native_avcapability.h>
   #include <multimedia/player_framework/native_avcodec_base.h>
   #include <multimedia/player_framework/native_avformat.h>
   #include <multimedia/player_framework/native_avbuffer.h>
   #include <fstream>
   ```

2. Create an encoder instance.

   The procedure is the same as that in surface mode and is not described here.

   ```c++
   // Create an encoder by name. If your application has special requirements, for example, expecting an encoder that supports a certain resolution, you can call OH_AVCodec_GetCapability to query the capability first.
   OH_AVCapability *capability = OH_AVCodec_GetCapability(OH_AVCODEC_MIMETYPE_VIDEO_AVC, true);
   const char *codecName = OH_AVCapability_GetName(capability);
   OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByName(codecName);
   ```

   ```c++
   // Create an encoder by MIME type.
   // Only specific codecs recommended by the system can be created in this way.
   // If multiple codecs need to be created, create hardware encoder instances first. If the hardware resources are insufficient, create software encoder instances.
   OH_AVCodec *videoEnc = OH_VideoEncoder_CreateByMime(OH_AVCODEC_MIMETYPE_VIDEO_AVC);
   ```

3. Call **OH_VideoEncoder_RegisterCallback()** to register the callback functions.

   Register the **OH_AVCodecCallback** struct that defines the following callback function pointers:

   - **OH_AVCodecOnError**, a callback used to report a codec operation error. For details about the error codes, see [OH_AVCodecOnError](../../reference/apis-avcodec-kit/_codec_base.md#oh_avcodeconerror).
   - **OH_AVCodecOnStreamChanged**, a callback used to report a codec stream change, for example, a format change.
   - **OH_AVCodecOnNeedInputBuffer**, a callback used to report that input data is required, which means that the encoder is ready for receiving YUV/RGB data.
   - **OH_AVCodecOnNewOutputBuffer**, a callback used to report that output data has been generated, which means that encoding is complete.

   You need to process the callback functions to ensure that the encoder runs properly.

   <!--RP2--><!--RP2End-->

   <!--RP9-->
   ```c++
   bool isFirstFrame = true;
   ```
   <!--RP9End-->

   ```c++
   // Implement the OH_AVCodecOnError callback function.
   static void OnError(OH_AVCodec *codec, int32_t errorCode, void *userData)
   {
       // Process the error code in the callback.
       (void)codec;
       (void)errorCode;
       (void)userData;
   }
   ```

   ```c++
   // Implement the OH_AVCodecOnStreamChanged callback function.
   static void OnStreamChanged(OH_AVCodec *codec, OH_AVFormat *format, void *userData)
   {
       // In buffer mode, this callback function does not take effect.
    ```c++
    // Implement the OH_AVCodecOnStreamChanged callback function.
    static void OnStreamChanged(OH_AVCodec *codec, OH_AVFormat *format, void *userData)
    {
        // In buffer mode, this callback function does not take effect.
        (void)codec;
        (void)format;
        (void)userData;
    }
    ```

    ```c++
    // Implement the OH_AVCodecOnNeedInputBuffer callback function.
    static void OnNeedInputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
    {
        // Obtain the video width stride and height stride.
        if (isFirstFrame) {
            auto format = std::shared_ptr<OH_AVFormat>(OH_VideoEncoder_GetInputDescription(codec), OH_AVFormat_Destroy);
            if (format == nullptr) {
                // Handle exceptions.
            }
            bool ret = OH_AVFormat_GetIntValue(format.get(), OH_MD_KEY_VIDEO_STRIDE, &widthStride) &&
                       OH_AVFormat_GetIntValue(format.get(), OH_MD_KEY_VIDEO_SLICE_HEIGHT, &heightStride);
            if (!ret) {
                // Handle exceptions.
            }
            isFirstFrame = false;
        }
        // The data buffer of the input frame and its index are sent to inQueue.
        (void)codec;
        (void)userData;
        inQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer));
    }
    ```

    <!--RP10-->
    ```c++
    // Implement the OH_AVCodecOnNewOutputBuffer callback function.
    static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
    {
        // The data buffer of the encoded frame and its index are sent to outQueue.
        (void)userData;
        outQueue.Enqueue(std::make_shared<CodecBufferInfo>(index, buffer));
    }
    ```
    <!--RP10End-->

    ```c++
    // Call OH_VideoEncoder_RegisterCallback() to register the callback functions.
    OH_AVCodecCallback cb = {&OnError, &OnStreamChanged, &OnNeedInputBuffer, &OnNewOutputBuffer};
    OH_AVErrCode ret = OH_VideoEncoder_RegisterCallback(videoEnc, cb, nullptr);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

    > **NOTE**
    >
    > In the callback functions, pay attention to multi-thread synchronization for operations on the data queue.
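    The note above assumes a thread-safe queue behind **inQueue** and **outQueue**. Such a queue is not part of the AVCodec API; the sketch below shows one way it might look (the class name, the simplified **CodecBufferInfo**, and all member names are illustrative, not real APIs).

    ```c++
    #include <condition_variable>
    #include <cstdint>
    #include <memory>
    #include <mutex>
    #include <queue>

    // Simplified stand-in for the CodecBufferInfo used in this guide.
    // The real struct also stores the OH_AVBuffer pointer from the callback.
    struct CodecBufferInfo {
        uint32_t index = 0;
        bool isValid = true;
    };

    class BufferQueue {
    public:
        void Enqueue(const std::shared_ptr<CodecBufferInfo> &info)
        {
            std::unique_lock<std::mutex> lock(mutex_);
            queue_.push(info);
            cond_.notify_all();
        }

        // Blocks until an element is available or the queue is flushed.
        std::shared_ptr<CodecBufferInfo> Dequeue()
        {
            std::unique_lock<std::mutex> lock(mutex_);
            cond_.wait(lock, [this] { return !queue_.empty() || flushed_; });
            if (queue_.empty()) {
                return nullptr; // Flushed: callers must handle a null result.
            }
            auto info = queue_.front();
            queue_.pop();
            return info;
        }

        // Discards pending buffers, matching the Flush() calls made after
        // the encoder is destroyed; stale indexes must not be reused.
        void Flush()
        {
            std::unique_lock<std::mutex> lock(mutex_);
            flushed_ = true;
            std::queue<std::shared_ptr<CodecBufferInfo>> empty;
            queue_.swap(empty);
            cond_.notify_all();
        }

    private:
        std::mutex mutex_;
        std::condition_variable cond_;
        std::queue<std::shared_ptr<CodecBufferInfo>> queue_;
        bool flushed_ = false;
    };
    ```

    A blocking `Dequeue()` with a flush flag lets the worker threads in the later steps wait for buffers without busy-polling, and unblocks them cleanly during teardown.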
4. Call **OH_VideoEncoder_Configure()** to configure the encoder.

    The procedure is the same as that in surface mode and is not described here.

    ```c++
    auto format = std::shared_ptr<OH_AVFormat>(OH_AVFormat_Create(), OH_AVFormat_Destroy);
    if (format == nullptr) {
        // Handle exceptions.
    }
    // Set the format.
    OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_WIDTH, width); // Mandatory.
    OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_HEIGHT, height); // Mandatory.
    OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_PIXEL_FORMAT, pixelFormat); // Mandatory.
    // Configure the encoder.
    OH_AVErrCode ret = OH_VideoEncoder_Configure(videoEnc, format.get());
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

5. Call **OH_VideoEncoder_Prepare()** to prepare internal resources for the encoder.

    ```c++
    OH_AVErrCode ret = OH_VideoEncoder_Prepare(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

6. Call **OH_VideoEncoder_Start()** to start the encoder.

    As soon as the encoder starts, the callback functions will be triggered to respond to events. Therefore, you must configure the input file and output file first.

    ```c++
    // Configure the paths of the input and output files.
    std::string_view inputFilePath = "/*yourpath*.yuv";
    std::string_view outputFilePath = "/*yourpath*.h264";
    std::unique_ptr<std::ifstream> inputFile = std::make_unique<std::ifstream>();
    std::unique_ptr<std::ofstream> outputFile = std::make_unique<std::ofstream>();
    if (inputFile != nullptr) {
        inputFile->open(inputFilePath.data(), std::ios::in | std::ios::binary);
    }
    if (outputFile != nullptr) {
        outputFile->open(outputFilePath.data(), std::ios::out | std::ios::binary | std::ios::ate);
    }
    // Start the encoder.
    OH_AVErrCode ret = OH_VideoEncoder_Start(videoEnc);
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```
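    With the input file open, you can sanity-check it against the NV12 frame-size formula that step 8 relies on (width x height x 3/2 bytes per frame). A standalone sketch; the helper function names are illustrative, not part of the AVCodec API:

    ```c++
    #include <cstdint>

    // NV12 stores a full-resolution Y plane followed by an interleaved UV plane
    // subsampled 2x in both directions, so one frame occupies
    // width * height * 3 / 2 bytes.
    int64_t Nv12FrameSize(int32_t width, int32_t height)
    {
        return static_cast<int64_t>(width) * height * 3 / 2;
    }

    // Number of whole frames contained in a raw .yuv file of the given size.
    int64_t Nv12FrameCount(int64_t fileSizeBytes, int32_t width, int32_t height)
    {
        return fileSizeBytes / Nv12FrameSize(width, height);
    }
    ```

    For example, a 320 x 240 NV12 frame occupies 115200 bytes, so a file whose size is not a multiple of that value is likely truncated or in a different layout.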
7. (Optional) Dynamically configure encoder parameters during running.

    <!--RP11-->
    ```c++
    auto format = std::shared_ptr<OH_AVFormat>(OH_AVFormat_Create(), OH_AVFormat_Destroy);
    if (format == nullptr) {
        // Handle exceptions.
    }
    // Dynamically request IDR frames.
    OH_AVFormat_SetIntValue(format.get(), OH_MD_KEY_REQUEST_I_FRAME, true);
    OH_AVErrCode ret = OH_VideoEncoder_SetParameter(videoEnc, format.get());
    if (ret != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```
    <!--RP11End-->

8. Call **OH_VideoEncoder_PushInputBuffer()** to push the image to the input queue for encoding.

    In the code snippet below, the following variables are used:
    - **widthStride**: width stride of the obtained buffer data.
    - **heightStride**: height stride of the obtained buffer data.

    The member variables of **bufferInfo** are as follows:
    - **buffer**: parameter passed by the callback function **OnNeedInputBuffer**. You can obtain the pointer to the shared memory address by calling [OH_AVBuffer_GetAddr](../../reference/apis-avcodec-kit/_core.md#oh_avbuffer_getaddr).
    - **index**: parameter passed by the callback function **OnNeedInputBuffer**, which uniquely corresponds to the buffer.
    - **isValid**: whether the buffer instance stored in **bufferInfo** is valid.
    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = inQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Write image data.
    int32_t frameSize = 0;
    if (widthStride == width && heightStride == height) {
        frameSize = width * height * 3 / 2; // Formula for calculating the data size of each frame in NV12 pixel format.
        int32_t capacity = OH_AVBuffer_GetCapacity(bufferInfo->buffer);
        if (frameSize > capacity) {
            // Handle exceptions.
        }
        // Process the file stream to obtain the frame length, and then write the data to encode to the buffer of the specified index.
        uint8_t *addr = OH_AVBuffer_GetAddr(bufferInfo->buffer);
        if (addr == nullptr) {
            // Handle exceptions.
        }
        if (inputFile != nullptr && inputFile->is_open()) {
            inputFile->read(reinterpret_cast<char *>(addr), frameSize);
        }
    } else {
        // If the stride is not equal to the width, perform offset based on the stride. For details, see the following example.
    }
    // Configure the buffer information.
    OH_AVCodecBufferAttr info;
    info.size = frameSize;
    info.offset = 0;
    info.pts = 0;
    OH_AVErrCode setBufferRet = OH_AVBuffer_SetBufferAttr(bufferInfo->buffer, &info);
    if (setBufferRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Configure the buffer frame-specific information.
    int32_t isIFrame = 0; // You can determine the value; set it to 1 to request an I-frame.
    auto parameter = std::shared_ptr<OH_AVFormat>(OH_AVBuffer_GetParameter(bufferInfo->buffer), OH_AVFormat_Destroy);
    if (parameter == nullptr) {
        // Handle exceptions.
    }
    OH_AVFormat_SetIntValue(parameter.get(), OH_MD_KEY_REQUEST_I_FRAME, isIFrame);
    OH_AVErrCode parameterRet = OH_AVBuffer_SetParameter(bufferInfo->buffer, parameter.get());
    if (parameterRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Send the data to the input buffer for encoding. index is the index of the buffer.
    OH_AVErrCode pushInputRet = OH_VideoEncoder_PushInputBuffer(videoEnc, bufferInfo->index);
    if (pushInputRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

    Offset the stride. The following uses an NV12 image as an example, presenting the image layout of **width**, **height**, **wStride**, and **hStride**.

    - **OH_MD_KEY_WIDTH** corresponds to **width**.
    - **OH_MD_KEY_HEIGHT** corresponds to **height**.
    - **OH_MD_KEY_VIDEO_STRIDE** corresponds to **wStride**.
    - **OH_MD_KEY_VIDEO_SLICE_HEIGHT** corresponds to **hStride**.

    
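    As a quick numeric check of this layout: rows of the Y plane are **wStride** bytes apart, and the UV plane starts after **hStride** rows. A sketch with illustrative helper names (not AVCodec APIs), using the destination strides from the copy sample that follows (wStride 320, hStride 256):

    ```c++
    #include <cstdint>

    // Byte offset of row y of the Y plane: rows are wStride bytes apart.
    int64_t YRowOffset(int32_t wStride, int32_t y)
    {
        return static_cast<int64_t>(y) * wStride;
    }

    // The interleaved UV plane starts after hStride rows of Y data.
    int64_t UvPlaneOffset(int32_t wStride, int32_t hStride)
    {
        return static_cast<int64_t>(wStride) * hStride;
    }

    // Total buffer size: Y plane plus the half-height UV plane.
    int64_t TotalBufferSize(int32_t wStride, int32_t hStride)
    {
        return static_cast<int64_t>(wStride) * hStride * 3 / 2;
    }
    ```

    With wStride 320 and hStride 256, the UV plane starts at byte 81920 and the whole buffer occupies 122880 bytes, which is what the destination allocation in the sample below computes.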
    Add the header files.

    ```c++
    #include <string.h>
    ```

    The sample code is as follows:
    ```c++
    struct Rect // Width and height of the source buffer. You can set them as required.
    {
        int32_t width;
        int32_t height;
    };

    struct DstRect // Width stride and height stride of the destination buffer. They are obtained by calling OH_VideoEncoder_GetInputDescription.
    {
        int32_t wStride;
        int32_t hStride;
    };

    struct SrcRect // Width stride and height stride of the source buffer. You can set them as required.
    {
        int32_t wStride;
        int32_t hStride;
    };

    Rect rect = {320, 240};
    DstRect dstRect = {320, 256};
    SrcRect srcRect = {320, 250};
    uint8_t* dst = new uint8_t[dstRect.hStride * dstRect.wStride * 3 / 2]; // Pointer to the destination memory area.
    uint8_t* src = new uint8_t[srcRect.hStride * srcRect.wStride * 3 / 2]; // Pointer to the source memory area.
    uint8_t* dstTemp = dst;
    uint8_t* srcTemp = src;
    rect.height = ((rect.height + 1) / 2) * 2; // This ensures the height is always even.
    rect.width = ((rect.width + 1) / 2) * 2; // This ensures the width is always even.

    // Y: Copy the source data in the Y region to the target data in another region.
    for (int32_t i = 0; i < rect.height; ++i) {
        // Copy a row of data from the source to a row of the target.
        memcpy(dstTemp, srcTemp, rect.width);
        // Update the pointers to the source data and target data to copy the next row. The pointers move downwards by one wStride each time.
        dstTemp += dstRect.wStride;
        srcTemp += srcRect.wStride;
    }
    // Padding.
    // Update the pointers to the source data and target data. The pointers move downwards by one padding.
    dstTemp += (dstRect.hStride - rect.height) * dstRect.wStride;
    srcTemp += (srcRect.hStride - rect.height) * srcRect.wStride;
    rect.height >>= 1;
    // UV: Copy the source data in the UV region to the target data in another region.
    for (int32_t i = 0; i < rect.height; ++i) {
        memcpy(dstTemp, srcTemp, rect.width);
        dstTemp += dstRect.wStride;
        srcTemp += srcRect.wStride;
    }

    delete[] dst;
    dst = nullptr;
    delete[] src;
    src = nullptr;
    ```

    When processing buffer data (before pushing data) during hardware encoding, you must copy the image data, after width and height alignment, to the input callback AVBuffer. Generally, you need to obtain the image width, height, stride, and pixel format to ensure correct processing of the data to encode. For details, see step 3 in [Buffer Mode](#buffer-mode).

9. Notify the encoder of EOS.

    In the following example, the member variables of **bufferInfo** are as follows:
    - **index**: parameter passed by the callback function **OnNeedInputBuffer**, which uniquely corresponds to the buffer.
    - **buffer**: parameter passed by the callback function **OnNeedInputBuffer**. You can obtain the pointer to the shared memory address by calling [OH_AVBuffer_GetAddr](../../reference/apis-avcodec-kit/_core.md#oh_avbuffer_getaddr).
    - **isValid**: whether the buffer instance stored in **bufferInfo** is valid.

    The API **OH_VideoEncoder_PushInputBuffer** is used to notify the encoder of EOS. It is also used in step 8 to push the stream to the input queue for encoding. Therefore, in this step, you must pass in the **AVCODEC_BUFFER_FLAGS_EOS** flag.
    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = inQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    OH_AVCodecBufferAttr info;
    info.size = 0;
    info.offset = 0;
    info.pts = 0;
    info.flags = AVCODEC_BUFFER_FLAGS_EOS;
    OH_AVErrCode setBufferRet = OH_AVBuffer_SetBufferAttr(bufferInfo->buffer, &info);
    if (setBufferRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    OH_AVErrCode pushInputRet = OH_VideoEncoder_PushInputBuffer(videoEnc, bufferInfo->index);
    if (pushInputRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

10. Call **OH_VideoEncoder_FreeOutputBuffer()** to release encoded frames.

    The procedure is the same as that in surface mode and is not described here.

    ```c++
    std::shared_ptr<CodecBufferInfo> bufferInfo = outQueue.Dequeue();
    std::shared_lock<std::shared_mutex> lock(codecMutex);
    if (bufferInfo == nullptr || !bufferInfo->isValid) {
        // Handle exceptions.
    }
    // Obtain the encoded information.
    OH_AVCodecBufferAttr info;
    OH_AVErrCode getBufferRet = OH_AVBuffer_GetBufferAttr(bufferInfo->buffer, &info);
    if (getBufferRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    // Write the encoded frame data (specified by buffer) to the output file.
    uint8_t *addr = OH_AVBuffer_GetAddr(bufferInfo->buffer);
    if (addr == nullptr) {
        // Handle exceptions.
    }
    if (outputFile != nullptr && outputFile->is_open()) {
        outputFile->write(reinterpret_cast<char *>(addr), info.size);
    }
    // Free the output buffer. index is the index of the buffer.
    OH_AVErrCode freeOutputRet = OH_VideoEncoder_FreeOutputBuffer(videoEnc, bufferInfo->index);
    if (freeOutputRet != AV_ERR_OK) {
        // Handle exceptions.
    }
    ```

The subsequent processes (including refreshing, resetting, stopping, and destroying the encoder) are the same as those in surface mode. For details, see steps 14–17 in [Surface Mode](#surface-mode).

<!--no_check-->