# Dual-Channel Preview (ArkTS)
<!--Kit: Camera Kit-->
<!--Subsystem: Multimedia-->
<!--Owner: @qano-->
<!--SE: @leo_ysl-->
<!--TSE: @xchaosioda-->

Before developing a camera application, request the required permissions by following the instructions in [Requesting Camera Development Permissions](camera-preparation.md).

Dual-channel preview means that an application uses two preview streams at the same time: one for on-screen display and the other for operations such as image processing, improving processing efficiency.

The camera application controls the camera device to implement basic operations such as image display (preview), photo saving (photo capture), and video recording. The camera framework is based on the surface model, meaning that an application transfers data through surfaces. Specifically, it obtains photo stream data through the surface of an ImageReceiver object and preview stream data through the surface of an **XComponent**.

To implement dual-channel preview (two preview streams, instead of one preview stream plus one photo stream), create a **previewOutput** object through the surface of an ImageReceiver object. Other processes are the same as those of the photo stream and preview stream.

Read [Module Description](../../reference/apis-camera-kit/arkts-apis-camera.md) for the API reference.

## Constraints

- Currently, streams cannot be added dynamically. In other words, you cannot call [addOutput](../../reference/apis-camera-kit/arkts-apis-camera-Session.md#addoutput11) to add streams without calling [session.stop](../../reference/apis-camera-kit/arkts-apis-camera-Session.md#stop11) first.
- After an ImageReceiver object finishes processing the image data it obtains, it must release the image buffer so that the buffer queue of the surface rotates properly.

## API Calling Process

The figure below shows the recommended API calling process of the dual-channel preview solution.

![dual-preview-streams-instructions](figures/dual-preview-streams-instructions.png)

## How to Develop

- For the first preview stream, used for image processing, create an ImageReceiver object, obtain its surface ID to create the first preview stream, register an image listener, and process each frame of image data in the preview stream as required.
- For the second preview stream, used for image display, create an **XComponent**, obtain its surface ID to create the second preview stream, and render the preview stream data in the component.
- To enable both preview streams to obtain data, configure a camera session with both preview streams, and start the session.

### First Preview Stream Used for Image Processing

1. Import dependencies, including those related to the image and camera frameworks.

    ```ts
    import { image } from '@kit.ImageKit';
    import { camera } from '@kit.CameraKit';
    import { BusinessError } from '@kit.BasicServicesKit';
    ```

2. Obtain the surface ID for the first preview stream. Specifically, create an ImageReceiver object and obtain the surface ID through it.

    ```ts
    let imageWidth: number = 1920; // Use the width in the profile size supported by the device.
    let imageHeight: number = 1080; // Use the height in the profile size supported by the device.

    async function initImageReceiver(): Promise<void> {
      // Create an ImageReceiver object.
      let size: image.Size = { width: imageWidth, height: imageHeight };
      let imageReceiver = image.createImageReceiver(size, image.ImageFormat.JPEG, 8);
      // Obtain the surface ID for the first preview stream.
      let imageReceiverSurfaceId = await imageReceiver.getReceivingSurfaceId();
      console.info(`initImageReceiver imageReceiverSurfaceId:${imageReceiverSurfaceId}`);
    }
    ```

3. Obtain the image and PixelMap formats of the preview stream received by ImageReceiver. For details about the image format, see the **format** parameter in [Image](../../reference/apis-image-kit/arkts-apis-image-Image.md). For details about the [PixelMap](../../reference/apis-image-kit/arkts-apis-image-PixelMap.md) format, see [PixelMapFormat](../../reference/apis-image-kit/arkts-apis-image-e.md#pixelmapformat7).

    ```ts
    // Mappings between image formats and PixelMap formats.
    let formatToPixelMapFormatMap = new Map<number, image.PixelMapFormat>([
      [12, image.PixelMapFormat.RGBA_8888],
      [25, image.PixelMapFormat.NV21],
      [35, image.PixelMapFormat.YCBCR_P010],
      [36, image.PixelMapFormat.YCRCB_P010]
    ]);
    // Mapping from each PixelMapFormat to the size of a single pixel, in bytes.
    let pixelMapFormatToSizeMap = new Map<image.PixelMapFormat, number>([
      [image.PixelMapFormat.RGBA_8888, 4],
      [image.PixelMapFormat.NV21, 1.5],
      [image.PixelMapFormat.YCBCR_P010, 3],
      [image.PixelMapFormat.YCRCB_P010, 3]
    ]);
    ```

4. Register a listener to process each frame of image data in the preview stream. Specifically, use the **imageArrival** event of the ImageReceiver object to obtain the image data returned by the bottom layer. For details, see [Image API Reference](../../reference/apis-image-kit/arkts-apis-image-ImageReceiver.md).

    > **NOTE**
    >
    > - When you create a [PixelMap](../../reference/apis-image-kit/arkts-apis-image-PixelMap.md) instance using the [createPixelMap](../../reference/apis-image-kit/arkts-apis-image-f.md#imagecreatepixelmap8) API, properties such as **size** and **srcPixelFormat** must match **Size** and **Format** configured in the preview output stream's preview profile. For details about the image pixel format of ImageReceiver, see [PixelMapFormat](../../reference/apis-image-kit/arkts-apis-image-e.md#pixelmapformat7). For details about the output format of the preview profile, see [CameraFormat](../../reference/apis-camera-kit/arkts-apis-camera-e.md#cameraformat).
    > - Because capabilities vary across devices, obtain the preview profiles supported by the current device by calling [getSupportedOutputCapability](../../reference/apis-camera-kit/arkts-apis-camera-CameraManager.md#getsupportedoutputcapability11) before creating a preview output stream. Then, based on actual service requirements, select a suitable preview profile with the required [CameraFormat](../../reference/apis-camera-kit/arkts-apis-camera-e.md#cameraformat) and [Size](../../reference/apis-camera-kit/arkts-apis-camera-i.md#size).
    > - The actual format of the preview stream image data received by ImageReceiver is determined by the **format** parameter of the preview profile that you select when creating the preview output stream. For details, see [Enabling a Preview Stream to Obtain Data](camera-dual-channel-preview.md#enabling-a-preview-stream-to-obtain-data).

    ```ts
    function onImageArrival(receiver: image.ImageReceiver): void {
      // Subscribe to the imageArrival event.
      receiver.on('imageArrival', () => {
        // Obtain an image.
        receiver.readNextImage((err: BusinessError, nextImage: image.Image) => {
          if (err || nextImage === undefined) {
            console.error('readNextImage failed');
            return;
          }
          // Parse the image.
          nextImage.getComponent(image.ComponentType.JPEG, async (err: BusinessError, imgComponent: image.Component) => {
            if (err || imgComponent === undefined) {
              console.error('getComponent failed');
              nextImage.release();
              return;
            }
            if (imgComponent.byteBuffer) {
              // For details, see the description of parsing the image buffer data below. This example uses method 1.
              let width = nextImage.size.width; // Obtain the image width.
              let height = nextImage.size.height; // Obtain the image height.
              let stride = imgComponent.rowStride; // Obtain the image stride.
              let imageFormat = nextImage.format; // Obtain the image format.
              let pixelMapFormat = formatToPixelMapFormatMap.get(imageFormat) ?? image.PixelMapFormat.NV21;
              let mSize = pixelMapFormatToSizeMap.get(pixelMapFormat) ?? 1.5;
              console.debug(`getComponent with width:${width} height:${height} stride:${stride}`);
              // The values of size and srcPixelFormat used during PixelMap creation must match size and format in the preview profile of the preview output stream.
              // The value of stride is the same as that of width.
              if (stride == width) {
                let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
                  size: { height: height, width: width },
                  srcPixelFormat: pixelMapFormat,
                });
              } else {
                // The value of stride is different from that of width.
                const dstBufferSize = width * height * mSize;
                const dstArr = new Uint8Array(dstBufferSize);
                for (let j = 0; j < height * mSize; j++) {
                  const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
                  dstArr.set(srcBuf, j * width);
                }
                let pixelMap = await image.createPixelMap(dstArr.buffer, {
                  size: { height: height, width: width },
                  srcPixelFormat: pixelMapFormat,
                });
              }
            } else {
              console.error('byteBuffer is null');
            }
            // Release the resource when the buffer is not in use.
            // If an asynchronous operation is performed on the buffer, call nextImage.release() to release the resource after the asynchronous operation is complete.
            nextImage.release();
          });
        });
      });
    }
    ```

    The following methods are available for parsing the image buffer data by using [image.Component](../../reference/apis-image-kit/arkts-apis-image-i.md#component9).

    > **NOTE**
    >
    > Check whether the width of the image is the same as **rowStride**. If they are different, perform the following operations:

    Method 1: Remove the stride data from **imgComponent.byteBuffer**, obtain a new buffer by copying, and process the buffer by calling an API that does not support stride.

    ```ts
    // The values of size and srcPixelFormat used during PixelMap creation must match size and format in the preview profile of the preview output stream.
    const dstBufferSize = width * height * mSize;
    const dstArr = new Uint8Array(dstBufferSize);
    // Read the buffer data line by line.
    for (let j = 0; j < height * mSize; j++) {
      // Copy the first width bytes of each line of data in imgComponent.byteBuffer to dstArr.
      const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
      dstArr.set(srcBuf, j * width);
    }
    let pixelMap = await image.createPixelMap(dstArr.buffer, {
      size: { height: height, width: width }, srcPixelFormat: pixelMapFormat
    });
    ```

    Method 2: Create a PixelMap based on the value of stride * height, and call **cropSync** of the PixelMap to crop the redundant pixels.

    ```ts
    // Create a PixelMap, with width set to the value of stride.
    let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
      size: { height: height, width: stride }, srcPixelFormat: pixelMapFormat });
    // Crop the extra pixels.
    pixelMap.cropSync({ size: { width: width, height: height }, x: 0, y: 0 });
    ```

    Method 3: Pass **imgComponent.byteBuffer** and **stride** to an API that supports stride.

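Because method 1 is pure buffer arithmetic, its behavior can be checked without a camera. The following plain TypeScript sketch applies the same per-row copy to a synthetic strided buffer; the helper names `packedSize` and `removeStride` are illustrative, not Camera Kit APIs.

```typescript
// Bytes per pixel for the formats listed above; NV21 averages 1.5 bytes per
// pixel (a full-resolution Y plane plus a half-resolution interleaved VU plane).
const bytesPerPixel: Record<string, number> = {
  RGBA_8888: 4,
  NV21: 1.5,
  YCBCR_P010: 3,
  YCRCB_P010: 3,
};

// Illustrative helper: tightly packed buffer size for a frame.
function packedSize(width: number, height: number, format: string): number {
  const factor = bytesPerPixel[format];
  if (factor === undefined) {
    throw new Error(`unsupported format: ${format}`);
  }
  return width * height * factor;
}

// Illustrative helper: drop per-row stride padding, keeping the first
// `width` bytes of each of `rows` rows (for NV21, rows = height * 1.5).
function removeStride(src: ArrayBuffer, stride: number, width: number, rows: number): Uint8Array {
  const dst = new Uint8Array(width * rows);
  for (let j = 0; j < rows; j++) {
    // Each source row occupies `stride` bytes, of which only `width` are pixel data.
    const srcRow = new Uint8Array(src, j * stride, width);
    dst.set(srcRow, j * width);
  }
  return dst;
}

// Synthetic check: 4 rows with stride 8 and payload width 6.
const src = new Uint8Array(8 * 4).map((_, i) => i); // fill with byte offsets
const packed = removeStride(src.buffer, 8, 6, 4);
console.log(packed.length);                  // 24 bytes: 6 * 4
console.log(packed[6]);                      // 8: first byte of source row 1
console.log(packedSize(1920, 1080, 'NV21')); // 3110400 = 1920 * 1080 * 1.5
```

The same loop appears verbatim in the listener above; factoring it out this way makes the stride handling easy to unit-test before wiring it to real frames.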
### Second Preview Stream Used for Image Display

To obtain the surface ID of the second preview stream, first create an **XComponent** for displaying the preview stream. For details about how to obtain the surface ID, see [getXComponentSurfaceId](../../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md#getxcomponentsurfaceid9). The **XComponent** capability is provided by the UI. For details, see [XComponent](../../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md).

```ts
@Component
struct example {
  xComponentCtl: XComponentController = new XComponentController();
  surfaceId: string = '';
  imageWidth: number = 1920;
  imageHeight: number = 1080;
  private uiContext: UIContext = this.getUIContext();

  build() {
    XComponent({
      id: 'componentId',
      type: XComponentType.SURFACE,
      controller: this.xComponentCtl
    })
      .onLoad(async () => {
        console.info('onLoad is called');
        this.surfaceId = this.xComponentCtl.getXComponentSurfaceId(); // Obtain the surface ID of the component.
        // Use the surface ID to create a preview stream and start the camera. The component renders the preview stream data of each frame in real time.
      })
      // The width and height of the surface are swapped relative to those of the XComponent. Alternatively, use .renderFit(RenderFit.RESIZE_CONTAIN) to adjust the display automatically without setting the width and height manually.
      .width(this.uiContext.px2vp(this.imageHeight))
      .height(this.uiContext.px2vp(this.imageWidth))
  }
}
```

### Enabling a Preview Stream to Obtain Data

Create two preview outputs with the two surface IDs, add both outputs to a camera session, and start the session to obtain the preview stream data.

```ts
function createDualPreviewOutput(cameraManager: camera.CameraManager, previewProfile: camera.Profile,
  session: camera.Session, imageReceiverSurfaceId: string, xComponentSurfaceId: string): void {
  // Create the first preview output by using imageReceiverSurfaceId.
  let previewOutput1 = cameraManager.createPreviewOutput(previewProfile, imageReceiverSurfaceId);
  if (!previewOutput1) {
    console.error('createPreviewOutput1 error');
  }
  // Create the second preview output by using xComponentSurfaceId.
  let previewOutput2 = cameraManager.createPreviewOutput(previewProfile, xComponentSurfaceId);
  if (!previewOutput2) {
    console.error('createPreviewOutput2 error');
  }
  // Add the output of the first preview stream.
  session.addOutput(previewOutput1);
  // Add the output of the second preview stream.
  session.addOutput(previewOutput2);
}
```
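Before calling a function like the one above, a preview profile must be chosen from the device's supported list by aspect ratio and format. That selection is plain logic and can be sketched independently of the camera. In the sketch below, the `Profile` interface shape and the numeric value of `CAMERA_FORMAT_YUV_420_SP` are illustrative stand-ins for `camera.Profile` and `camera.CameraFormat`, not the real definitions.

```typescript
// Minimal shapes mirroring camera.Profile, for illustration only.
interface Size { width: number; height: number; }
interface Profile { format: number; size: Size; }

// Assumed numeric stand-in for CameraFormat.CAMERA_FORMAT_YUV_420_SP.
const CAMERA_FORMAT_YUV_420_SP = 1003;

// Illustrative helper: pick the first profile whose format matches NV21 and
// whose aspect ratio is within `tolerance` of the target; fall back to the
// first profile when nothing matches, as the sample below does.
function pickPreviewProfile(profiles: Profile[], targetRatio: number, tolerance: number = 0.1): Profile {
  for (const p of profiles) {
    // Normalize the ratio so portrait and landscape sizes compare alike.
    const ratio = p.size.width >= p.size.height
      ? p.size.width / p.size.height
      : p.size.height / p.size.width;
    if (Math.abs(ratio - targetRatio) <= tolerance && p.format === CAMERA_FORMAT_YUV_420_SP) {
      return p;
    }
  }
  return profiles[0];
}

// Mocked capability list: a 4:3 profile in another format and a 16:9 NV21 profile.
const profiles: Profile[] = [
  { format: 2001, size: { width: 640, height: 480 } },
  { format: CAMERA_FORMAT_YUV_420_SP, size: { width: 1920, height: 1080 } },
];
const chosen = pickPreviewProfile(profiles, 1920 / 1080);
console.log(chosen.size.width); // 1920
```

On a real device, the input list comes from `getSupportedOutputCapability(...).previewProfiles`, and the chosen profile's size should also be written back to the XComponent dimensions, as the sample does.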

## Sample

```ts
import { camera } from '@kit.CameraKit';
import { image } from '@kit.ImageKit';
import { BusinessError } from '@kit.BasicServicesKit';
import { abilityAccessCtrl, Permissions } from '@kit.AbilityKit';

@Entry
@Component
struct Index {
  private imageReceiver: image.ImageReceiver | undefined = undefined;
  private imageReceiverSurfaceId: string = '';
  private xComponentCtl: XComponentController = new XComponentController();
  private xComponentSurfaceId: string = '';
  @State imageWidth: number = 1920;
  @State imageHeight: number = 1080;
  private cameraManager: camera.CameraManager | undefined = undefined;
  private cameras: Array<camera.CameraDevice> = [];
  private cameraInput: camera.CameraInput | undefined = undefined;
  private previewOutput1: camera.PreviewOutput | undefined = undefined;
  private previewOutput2: camera.PreviewOutput | undefined = undefined;
  private session: camera.VideoSession | undefined = undefined;
  private uiContext: UIContext = this.getUIContext();
  private context: Context | undefined = this.uiContext.getHostContext();
  private cameraPermission: Permissions = 'ohos.permission.CAMERA'; // For details about how to request permissions, see the instructions provided at the beginning of this topic.
  @State isShow: boolean = false;

  async requestPermissionsFn(): Promise<void> {
    let atManager = abilityAccessCtrl.createAtManager();
    if (this.context) {
      let res = await atManager.requestPermissionsFromUser(this.context, [this.cameraPermission]);
      for (let i = 0; i < res.permissions.length; i++) {
        if (this.cameraPermission.toString() === res.permissions[i] && res.authResults[i] === 0) {
          this.isShow = true;
        }
      }
    }
  }

  aboutToAppear(): void {
    this.requestPermissionsFn();
  }

  onPageShow(): void {
    console.info('onPageShow');
    this.initImageReceiver();
    if (this.xComponentSurfaceId !== '') {
      this.initCamera();
    }
  }

  onPageHide(): void {
    console.info('onPageHide');
    this.releaseCamera();
  }

  /**
   * Create the ImageReceiver object and obtain its surface ID.
   */
  async initImageReceiver(): Promise<void> {
    if (!this.imageReceiver) {
      // Create an ImageReceiver object.
      let size: image.Size = { width: this.imageWidth, height: this.imageHeight };
      this.imageReceiver = image.createImageReceiver(size, image.ImageFormat.JPEG, 8);
      // Obtain the surface ID for the first preview stream.
      this.imageReceiverSurfaceId = await this.imageReceiver.getReceivingSurfaceId();
      console.info(`initImageReceiver imageReceiverSurfaceId:${this.imageReceiverSurfaceId}`);
      // Register a listener to listen for and process the image data of each frame in the preview stream.
      this.onImageArrival(this.imageReceiver);
    }
  }

  // Mappings between image formats and PixelMap formats.
  private formatToPixelMapFormatMap = new Map<number, image.PixelMapFormat>([
    [12, image.PixelMapFormat.RGBA_8888],
    [25, image.PixelMapFormat.NV21],
    [35, image.PixelMapFormat.YCBCR_P010],
    [36, image.PixelMapFormat.YCRCB_P010]
  ]);
  // Mapping from each PixelMapFormat to the size of a single pixel, in bytes.
  private pixelMapFormatToSizeMap = new Map<image.PixelMapFormat, number>([
    [image.PixelMapFormat.RGBA_8888, 4],
    [image.PixelMapFormat.NV21, 1.5],
    [image.PixelMapFormat.YCBCR_P010, 3],
    [image.PixelMapFormat.YCRCB_P010, 3]
  ]);

  /**
   * Register a listener for the ImageReceiver object.
   * @param receiver
   */
  onImageArrival(receiver: image.ImageReceiver): void {
    // Subscribe to the imageArrival event.
    receiver.on('imageArrival', () => {
      console.info('image arrival');
      // Obtain an image.
      receiver.readNextImage((err: BusinessError, nextImage: image.Image) => {
        if (err || nextImage === undefined) {
          console.error('readNextImage failed');
          return;
        }
        // Parse the image.
        nextImage.getComponent(image.ComponentType.JPEG, async (err: BusinessError, imgComponent: image.Component) => {
          if (err || imgComponent === undefined) {
            console.error('getComponent failed');
            nextImage.release();
            return;
          }
          if (imgComponent.byteBuffer) {
            // Parse the buffer data by referring to the parsing methods described in the development procedure. This example uses method 1.
            let width = nextImage.size.width; // Obtain the image width.
            let height = nextImage.size.height; // Obtain the image height.
            let stride = imgComponent.rowStride; // Obtain the image stride.
            let imageFormat = nextImage.format; // Obtain the image format.
            let pixelMapFormat = this.formatToPixelMapFormatMap.get(imageFormat) ?? image.PixelMapFormat.NV21;
            let mSize = this.pixelMapFormatToSizeMap.get(pixelMapFormat) ?? 1.5;
            console.debug(`getComponent with width:${width} height:${height} stride:${stride}`);
            // The values of size and srcPixelFormat used during PixelMap creation must match size and format in the preview profile of the preview output stream. The NV21 format is used as an example.
            // The value of stride is the same as that of width.
            if (stride == width) {
              let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
                size: { height: height, width: width },
                srcPixelFormat: pixelMapFormat,
              });
            } else {
              // The value of stride is different from that of width.
              // For example, for NV21 (YUV_420_SP format): YUV_420_SP memory = width * height + (width * height) / 2.
              const dstBufferSize = width * height * mSize;
              const dstArr = new Uint8Array(dstBufferSize);
              for (let j = 0; j < height * mSize; j++) {
                const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
                dstArr.set(srcBuf, j * width);
              }
              let pixelMap = await image.createPixelMap(dstArr.buffer, {
                size: { height: height, width: width },
                srcPixelFormat: pixelMapFormat,
              });
            }
          } else {
            console.error('byteBuffer is null');
          }
          // Release the resource when the buffer is not in use.
          // If an asynchronous operation is performed on the buffer, call nextImage.release() to release the resource after the asynchronous operation is complete.
          nextImage.release();
          console.info('image process done');
        });
      });
    });
  }

  build() {
    Column() {
      if (this.isShow) {
        XComponent({
          id: 'componentId',
          type: XComponentType.SURFACE,
          controller: this.xComponentCtl
        })
          .onLoad(async () => {
            console.info('onLoad is called');
            this.xComponentSurfaceId = this.xComponentCtl.getXComponentSurfaceId(); // Obtain the surface ID of the component.
            // Initialize the camera. The component renders the preview stream data of each frame in real time.
            this.initCamera();
          })
          .width(this.uiContext.px2vp(this.imageHeight))
          .height(this.uiContext.px2vp(this.imageWidth))
      }
    }
    .justifyContent(FlexAlign.Center)
    .height('100%')
    .width('100%')
  }

  // Initialize the camera.
  async initCamera(): Promise<void> {
    console.info(`initCamera imageReceiverSurfaceId:${this.imageReceiverSurfaceId} xComponentSurfaceId:${this.xComponentSurfaceId}`);
    try {
      // Obtain a camera manager instance.
      this.cameraManager = camera.getCameraManager(this.context);
      if (!this.cameraManager) {
        console.error('initCamera getCameraManager');
      }
      // Obtain the list of cameras supported by the device.
      this.cameras = this.cameraManager.getSupportedCameras();
      if (!this.cameras) {
        console.error('initCamera getSupportedCameras');
      }
      // Select a camera device and create a CameraInput object.
      this.cameraInput = this.cameraManager.createCameraInput(this.cameras[0]);
      if (!this.cameraInput) {
        console.error('initCamera createCameraInput');
      }
      // Open the camera.
      await this.cameraInput.open().catch((err: BusinessError) => {
        console.error(`initCamera open fail: ${err}`);
      });
      // Obtain the output capability supported by the camera device.
      let capability: camera.CameraOutputCapability =
        this.cameraManager.getSupportedOutputCapability(this.cameras[0], camera.SceneMode.NORMAL_VIDEO);
      if (!capability) {
        console.error('initCamera getSupportedOutputCapability');
      }
      let minRatioDiff: number = 0.1;
      let surfaceRatio: number = this.imageWidth / this.imageHeight; // The target aspect ratio, 16:9 here.
      let previewProfile: camera.Profile = capability.previewProfiles[0];
      // Select a supported preview stream profile based on service requirements.
      // The following uses the preview stream profile with the CAMERA_FORMAT_YUV_420_SP (NV21) format that meets the resolution constraints as an example.
      for (let index = 0; index < capability.previewProfiles.length; index++) {
        const tempProfile = capability.previewProfiles[index];
        let tempRatio = tempProfile.size.width >= tempProfile.size.height ?
          tempProfile.size.width / tempProfile.size.height : tempProfile.size.height / tempProfile.size.width;
        let currentRatio = Math.abs(tempRatio - surfaceRatio);
        if (currentRatio <= minRatioDiff && tempProfile.format == camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP) {
          previewProfile = tempProfile;
          break;
        }
      }
      this.imageWidth = previewProfile.size.width; // Update the width of the XComponent.
      this.imageHeight = previewProfile.size.height; // Update the height of the XComponent.
      console.info(`initCamera imageWidth:${this.imageWidth} imageHeight:${this.imageHeight}`);
      // Create the first preview output by using imageReceiverSurfaceId.
      this.previewOutput1 = this.cameraManager.createPreviewOutput(previewProfile, this.imageReceiverSurfaceId);
      if (!this.previewOutput1) {
        console.error('initCamera createPreviewOutput1');
      }
      // Create the second preview output by using xComponentSurfaceId.
      this.previewOutput2 = this.cameraManager.createPreviewOutput(previewProfile, this.xComponentSurfaceId);
      if (!this.previewOutput2) {
        console.error('initCamera createPreviewOutput2');
      }
      // Create a camera session in recording mode.
      this.session = this.cameraManager.createSession(camera.SceneMode.NORMAL_VIDEO) as camera.VideoSession;
      if (!this.session) {
        console.error('initCamera createSession');
      }
      // Start configuration for the session.
      this.session.beginConfig();
      // Add a camera input.
      this.session.addInput(this.cameraInput);
      // Add the output of the first preview stream.
      this.session.addOutput(this.previewOutput1);
      // Add the output of the second preview stream.
      this.session.addOutput(this.previewOutput2);
      // Commit the session configuration.
      await this.session.commitConfig();
      // Start the configured input and output streams.
      await this.session.start();
    } catch (error) {
      console.error(`initCamera fail: ${error}`);
    }
  }

  // Release the camera.
  async releaseCamera(): Promise<void> {
    console.info('releaseCamera E');
    try {
      // Stop the session.
      await this.session?.stop();
      // Release the camera input stream.
      await this.cameraInput?.close();
      // Release the first preview output stream.
      await this.previewOutput1?.release();
      // Release the second preview output stream.
      await this.previewOutput2?.release();
      // Release the session.
      await this.session?.release();
    } catch (error) {
      console.error(`releaseCamera fail: ${error}`);
    }
  }
}
```
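The grant check in `requestPermissionsFn` above reduces to matching each requested permission against the parallel `authResults` array, where 0 means granted. The standalone sketch below isolates that logic; the `GrantResult` shape and `isGranted` helper are illustrative, mirroring only the fields of the real result object that the sample uses.

```typescript
// Shape mirroring the relevant fields of the permission request result (illustrative).
interface GrantResult {
  permissions: string[];
  authResults: number[]; // 0 means granted
}

// Hypothetical helper: true if the named permission appears in the result
// with an authResults entry of 0, mirroring the loop in the sample above.
function isGranted(res: GrantResult, permission: string): boolean {
  for (let i = 0; i < res.permissions.length; i++) {
    if (res.permissions[i] === permission && res.authResults[i] === 0) {
      return true;
    }
  }
  return false;
}

const res: GrantResult = {
  permissions: ['ohos.permission.CAMERA'],
  authResults: [0],
};
console.log(isGranted(res, 'ohos.permission.CAMERA')); // true
```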