# Dual-Channel Preview (ArkTS)

Before developing a camera application, request permissions by following the instructions in [Camera Development Preparations](camera-preparation.md).

Dual-channel preview means that an application uses two preview streams at the same time: one is displayed on the screen, and the other is used for operations such as image processing, which improves processing efficiency.

The camera application controls the camera device to implement basic operations such as image display (preview), photo saving (photo capture), and video recording. The camera framework is built on the surface model, meaning that an application transfers data through surfaces. Specifically, it obtains photo stream data through the surface of an ImageReceiver object and preview stream data through the surface of an **XComponent**.

To implement dual-channel preview (two preview streams, rather than one preview stream plus one photo stream), create a previewOutput object through the surface of an ImageReceiver object. The other steps are the same as those for the preview stream and photo stream.

For the API reference, see [Camera](../../reference/apis-camera-kit/js-apis-camera.md).

## Constraints

- Currently, streams cannot be added dynamically. That is, you cannot call [addOutput](../../reference/apis-camera-kit/js-apis-camera.md#addoutput11) to add a stream without first calling [session.stop](../../reference/apis-camera-kit/js-apis-camera.md#stop11).
- After an ImageReceiver object has processed the image data it obtains, it must release the image buffer so that the buffer queue of the surface rotates properly.

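The second constraint can be illustrated with a plain TypeScript mock of a fixed-size buffer queue. The `BufferQueue` class below is illustrative only, not an Image Kit API: if the consumer never releases acquired buffers, the producer runs out of free buffers after `capacity` frames and subsequent frames are dropped.

```typescript
// Illustrative mock of a surface buffer queue with a fixed number of buffer slots.
class BufferQueue {
  private free: number[];
  constructor(capacity: number) {
    // Initially, all buffer slots are free.
    this.free = Array.from({ length: capacity }, (_, i) => i);
  }
  // The producer (camera) takes a free buffer to fill; undefined means a dropped frame.
  acquire(): number | undefined {
    return this.free.shift();
  }
  // The consumer must return the buffer so that the queue keeps rotating.
  release(id: number): void {
    this.free.push(id);
  }
}

// Consumer that releases each buffer: the queue rotates indefinitely.
const queue = new BufferQueue(2);
let rotated = 0;
for (let frame = 0; frame < 5; frame++) {
  const id = queue.acquire();
  if (id !== undefined) {
    rotated++;
    queue.release(id); // Analogous to calling nextImage.release() after processing.
  }
}

// Consumer that forgets to release: only `capacity` frames succeed, then frames drop.
const leaky = new BufferQueue(2);
let delivered = 0;
for (let frame = 0; frame < 5; frame++) {
  if (leaky.acquire() !== undefined) delivered++;
}
console.log(rotated, delivered); // 5 2
```

This is why the examples below call `nextImage.release()` on every path, including error paths.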
## API Calling Process

The figure below shows the recommended API calling process for dual-channel preview.

![dual-preview-streams-instructions](figures/dual-preview-streams-instructions.png)

## How to Develop

- For the first preview stream, used for image processing: create an ImageReceiver object, obtain its surface ID to create the first preview stream, register an image listener, and process each frame of the preview stream data as required.
- For the second preview stream, used for image display: create an **XComponent**, obtain its surface ID to create the second preview stream, and render the preview stream data in the component.
- To enable both preview streams to obtain data, configure a camera session with both preview outputs, and start the session.

### First Preview Stream Used for Image Processing

1. Obtain the surface ID for the first preview stream. Specifically, create an ImageReceiver object and obtain the surface ID through it.

    ```ts
    import { image } from '@kit.ImageKit';

    let imageWidth: number = 1920; // Use a width from a profile size supported by the device.
    let imageHeight: number = 1080; // Use a height from a profile size supported by the device.

    async function initImageReceiver(): Promise<void> {
      // Create an ImageReceiver object.
      let size: image.Size = { width: imageWidth, height: imageHeight };
      let imageReceiver = image.createImageReceiver(size, image.ImageFormat.JPEG, 8);
      // Obtain the surface ID for the first preview stream.
      let imageReceiverSurfaceId = await imageReceiver.getReceivingSurfaceId();
      console.info(`initImageReceiver imageReceiverSurfaceId:${imageReceiverSurfaceId}`);
    }
    ```

2. Register a listener to process each frame of image data in the preview stream. Specifically, use the **imageArrival** event of the ImageReceiver object to obtain the image data returned by the bottom layer. For details, see [Image API Reference](../../reference/apis-image-kit/js-apis-image.md).

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';
    import { image } from '@kit.ImageKit';

    function onImageArrival(receiver: image.ImageReceiver): void {
      // Subscribe to the imageArrival event.
      receiver.on('imageArrival', () => {
        // Obtain an image.
        receiver.readNextImage((err: BusinessError, nextImage: image.Image) => {
          if (err || nextImage === undefined) {
            console.error('readNextImage failed');
            return;
          }
          // Parse the image.
          nextImage.getComponent(image.ComponentType.JPEG, async (err: BusinessError, imgComponent: image.Component) => {
            if (err || imgComponent === undefined) {
              console.error('getComponent failed');
              nextImage.release();
              return;
            }
            if (imgComponent.byteBuffer) {
              // For details, see the methods for parsing the image buffer data below. This example uses method 1.
              let width = nextImage.size.width; // Image width.
              let height = nextImage.size.height; // Image height.
              let stride = imgComponent.rowStride; // Image stride.
              console.debug(`getComponent with width:${width} height:${height} stride:${stride}`);
              if (stride == width) {
                // The value of stride is the same as that of width.
                let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
                  size: { height: height, width: width },
                  srcPixelFormat: 8,
                });
              } else {
                // The value of stride is different from that of width.
                const dstBufferSize = width * height * 1.5;
                const dstArr = new Uint8Array(dstBufferSize);
                for (let j = 0; j < height * 1.5; j++) {
                  const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
                  dstArr.set(srcBuf, j * width);
                }
                let pixelMap = await image.createPixelMap(dstArr.buffer, {
                  size: { height: height, width: width },
                  srcPixelFormat: 8,
                });
              }
            } else {
              console.error('byteBuffer is null');
            }
            // Release the resource when the buffer is no longer in use.
            // If an asynchronous operation is performed on the buffer, call nextImage.release() after the asynchronous operation is complete.
            nextImage.release();
          });
        });
      });
    }
    ```

    The following methods are available for parsing the image buffer data by using [image.Component](../../reference/apis-image-kit/js-apis-image.md#component9).

    > **NOTE**
    >
    > Check whether the width of the image is the same as **rowStride**. If they are different, process the data by using one of the following methods:

    Method 1: Remove the stride data from **imgComponent.byteBuffer**, obtain a new buffer by copying, and process the buffer with an API that does not support stride.

    ```ts
    // For example, for NV21 (YUV_420_SP format), the required memory is: YUV_420_SP memory = Width * Height + (Width * Height) / 2.
    const dstBufferSize = width * height * 1.5;
    const dstArr = new Uint8Array(dstBufferSize);
    // Read the buffer data line by line.
    for (let j = 0; j < height * 1.5; j++) {
      // Copy the first width bytes of each line in imgComponent.byteBuffer to dstArr.
      const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
      dstArr.set(srcBuf, j * width);
    }
    let pixelMap = await image.createPixelMap(dstArr.buffer, {
      size: { height: height, width: width }, srcPixelFormat: 8
    });
    ```

    Method 2: Create a PixelMap with the width set to the value of **stride**, and then call **cropSync** of the PixelMap to crop the redundant pixels.

    ```ts
    // Create a PixelMap, with width set to the value of stride.
    let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
      size: { height: height, width: stride }, srcPixelFormat: 8
    });
    // Crop the extra pixels.
    pixelMap.cropSync({ size: { width: width, height: height }, x: 0, y: 0 });
    ```

    Method 3: Pass **imgComponent.byteBuffer** and **stride** to an API that supports stride.

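The stride handling above can be exercised in isolation with plain TypeScript. The sketch below is illustrative only (`removeStride` and `pixelAt` are not Image Kit APIs): `removeStride` performs the row-by-row copy from method 1, and `pixelAt` shows the indexing a stride-aware consumer (method 3) performs, addressing rows by stride rather than by width.

```typescript
// Method 1: remove per-row padding by keeping the first `width` bytes of each `stride`-byte row.
// For NV21 data, `rows` would be height * 1.5 (Y rows plus interleaved UV rows).
function removeStride(src: ArrayBuffer, stride: number, width: number, rows: number): Uint8Array {
  const dst = new Uint8Array(width * rows);
  for (let j = 0; j < rows; j++) {
    dst.set(new Uint8Array(src, j * stride, width), j * width);
  }
  return dst;
}

// Method 3: a stride-aware consumer indexes rows by stride, not by width.
function pixelAt(buf: ArrayBuffer, stride: number, x: number, y: number): number {
  return new Uint8Array(buf)[y * stride + x];
}

// A tiny 4x2 "image" stored with stride 6: payload bytes 1..4 per row, padding 0xFF.
const width = 4, stride = 6, rows = 2;
const src = new Uint8Array(stride * rows);
for (let j = 0; j < rows; j++) {
  for (let i = 0; i < stride; i++) {
    src[j * stride + i] = i < width ? i + 1 : 0xFF;
  }
}

const packed = removeStride(src.buffer, stride, width, rows);
console.log(Array.from(packed).join(',')); // 1,2,3,4,1,2,3,4 (padding removed)
console.log(pixelAt(src.buffer, stride, 3, 1)); // 4 (row 1 starts at byte 6, not byte 4)
```

The same offset arithmetic (`y * stride + x`) is what makes passing the raw `byteBuffer` without a copy safe in method 3, at the cost of requiring every downstream consumer to know the stride.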
### Second Preview Stream Used for Image Display

To obtain the surface ID of the second preview stream, first create an **XComponent** for displaying the preview stream. For details about how to obtain the surface ID, see [getXComponentSurfaceId](../../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md#getxcomponentsurfaceid9). The **XComponent** capability is provided by the UI. For details, see [XComponent](../../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md).

```ts
@Component
struct Example {
  xComponentCtl: XComponentController = new XComponentController();
  surfaceId: string = '';
  imageWidth: number = 1920;
  imageHeight: number = 1080;

  build() {
    XComponent({
      id: 'componentId',
      type: 'surface',
      controller: this.xComponentCtl
    })
      .onLoad(async () => {
        console.info('onLoad is called');
        this.surfaceId = this.xComponentCtl.getXComponentSurfaceId(); // Obtain the surface ID of the component.
        // Use the surface ID to create a preview stream and start the camera. The component renders each frame of the preview stream in real time.
      })
      // The width and height of the surface are swapped relative to those of the XComponent. Alternatively, use .renderFit(RenderFit.RESIZE_CONTAIN) to adjust the display automatically instead of setting the width and height manually.
      .width(px2vp(this.imageHeight))
      .height(px2vp(this.imageWidth))
  }
}
```

### Enabling Both Preview Streams to Obtain Data

Create two preview outputs with the two surface IDs, add both outputs to a camera session, and start the session to obtain the preview stream data.

```ts
function createDualPreviewOutput(cameraManager: camera.CameraManager, previewProfile: camera.Profile,
  session: camera.Session, imageReceiverSurfaceId: string, xComponentSurfaceId: string): void {
  // Create the first preview output by using imageReceiverSurfaceId.
  let previewOutput1 = cameraManager.createPreviewOutput(previewProfile, imageReceiverSurfaceId);
  if (!previewOutput1) {
    console.error('createPreviewOutput1 error');
  }
  // Create the second preview output by using xComponentSurfaceId.
  let previewOutput2 = cameraManager.createPreviewOutput(previewProfile, xComponentSurfaceId);
  if (!previewOutput2) {
    console.error('createPreviewOutput2 error');
  }
  // Add the output of the first preview stream.
  session.addOutput(previewOutput1);
  // Add the output of the second preview stream.
  session.addOutput(previewOutput2);
}
```

## Sample

```ts
import { camera } from '@kit.CameraKit';
import { image } from '@kit.ImageKit';
import { BusinessError } from '@kit.BasicServicesKit';

@Entry
@Component
struct Index {
  private imageReceiver: image.ImageReceiver | undefined = undefined;
  private imageReceiverSurfaceId: string = '';
  private xComponentCtl: XComponentController = new XComponentController();
  private xComponentSurfaceId: string = '';
  @State imageWidth: number = 1920;
  @State imageHeight: number = 1080;
  private cameraManager: camera.CameraManager | undefined = undefined;
  private cameras: Array<camera.CameraDevice> = [];
  private cameraInput: camera.CameraInput | undefined = undefined;
  private previewOutput1: camera.PreviewOutput | undefined = undefined;
  private previewOutput2: camera.PreviewOutput | undefined = undefined;
  private session: camera.VideoSession | undefined = undefined;

  onPageShow(): void {
    console.info('onPageShow');
    this.initImageReceiver();
    if (this.xComponentSurfaceId !== '') {
      this.initCamera();
    }
  }

  onPageHide(): void {
    console.info('onPageHide');
    this.releaseCamera();
  }

  /**
   * Create the ImageReceiver object, obtain its surface ID, and register the image listener.
   */
  async initImageReceiver(): Promise<void> {
    if (!this.imageReceiver) {
      // Create an ImageReceiver object.
      let size: image.Size = { width: this.imageWidth, height: this.imageHeight };
      this.imageReceiver = image.createImageReceiver(size, image.ImageFormat.JPEG, 8);
      // Obtain the surface ID for the first preview stream.
      this.imageReceiverSurfaceId = await this.imageReceiver.getReceivingSurfaceId();
      console.info(`initImageReceiver imageReceiverSurfaceId:${this.imageReceiverSurfaceId}`);
      // Register a listener to process each frame of image data in the preview stream.
      this.onImageArrival(this.imageReceiver);
    }
  }

  /**
   * Register a listener for the ImageReceiver object.
   * @param receiver
   */
  onImageArrival(receiver: image.ImageReceiver): void {
    // Subscribe to the imageArrival event.
    receiver.on('imageArrival', () => {
      console.info('image arrival');
      // Obtain an image.
      receiver.readNextImage((err: BusinessError, nextImage: image.Image) => {
        if (err || nextImage === undefined) {
          console.error('readNextImage failed');
          return;
        }
        // Parse the image.
        nextImage.getComponent(image.ComponentType.JPEG, async (err: BusinessError, imgComponent: image.Component) => {
          if (err || imgComponent === undefined) {
            console.error('getComponent failed');
            nextImage.release();
            return;
          }
          if (imgComponent.byteBuffer) {
            // Parse the buffer data by using one of the methods described above. This example uses method 1.
            let width = nextImage.size.width; // Image width.
            let height = nextImage.size.height; // Image height.
            let stride = imgComponent.rowStride; // Image stride.
            console.debug(`getComponent with width:${width} height:${height} stride:${stride}`);
            if (stride == width) {
              // The value of stride is the same as that of width.
              let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
                size: { height: height, width: width },
                srcPixelFormat: 8,
              });
            } else {
              // The value of stride is different from that of width.
              // For example, for NV21 (YUV_420_SP format): YUV_420_SP memory = Width * Height + (Width * Height) / 2.
              const dstBufferSize = width * height * 1.5;
              const dstArr = new Uint8Array(dstBufferSize);
              for (let j = 0; j < height * 1.5; j++) {
                const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
                dstArr.set(srcBuf, j * width);
              }
              let pixelMap = await image.createPixelMap(dstArr.buffer, {
                size: { height: height, width: width },
                srcPixelFormat: 8,
              });
            }
          } else {
            console.error('byteBuffer is null');
          }
          // Release the resource when the buffer is no longer in use.
          // If an asynchronous operation is performed on the buffer, call nextImage.release() after the asynchronous operation is complete.
          nextImage.release();
          console.info('image process done');
        });
      });
    });
  }

  build() {
    Column() {
      XComponent({
        id: 'componentId',
        type: 'surface',
        controller: this.xComponentCtl
      })
        .onLoad(async () => {
          console.info('onLoad is called');
          this.xComponentSurfaceId = this.xComponentCtl.getXComponentSurfaceId(); // Obtain the surface ID of the component.
          // Initialize the camera. The component renders each frame of the preview stream in real time.
          this.initCamera();
        })
        .width(px2vp(this.imageHeight))
        .height(px2vp(this.imageWidth))
    }.justifyContent(FlexAlign.Center)
    .height('100%')
    .width('100%')
  }

  // Initialize the camera.
  async initCamera(): Promise<void> {
    console.info(`initCamera imageReceiverSurfaceId:${this.imageReceiverSurfaceId} xComponentSurfaceId:${this.xComponentSurfaceId}`);
    try {
      // Obtain a camera manager instance.
      this.cameraManager = camera.getCameraManager(getContext(this));
      if (!this.cameraManager) {
        console.error('initCamera getCameraManager');
      }
      // Obtain the list of cameras supported by the device.
      this.cameras = this.cameraManager.getSupportedCameras();
      if (!this.cameras) {
        console.error('initCamera getSupportedCameras');
      }
      // Select a camera device and create a CameraInput object.
      this.cameraInput = this.cameraManager.createCameraInput(this.cameras[0]);
      if (!this.cameraInput) {
        console.error('initCamera createCameraInput');
      }
      // Open the camera.
      await this.cameraInput.open().catch((err: BusinessError) => {
        console.error(`initCamera open fail: ${JSON.stringify(err)}`);
      });
      // Obtain the profiles supported by the camera device.
      let capability: camera.CameraOutputCapability =
        this.cameraManager.getSupportedOutputCapability(this.cameras[0], camera.SceneMode.NORMAL_VIDEO);
      if (!capability) {
        console.error('initCamera getSupportedOutputCapability');
      }
      // Select a supported preview stream profile based on service requirements.
      let previewProfile: camera.Profile = capability.previewProfiles[0];
      this.imageWidth = previewProfile.size.width; // Update the width of the XComponent.
      this.imageHeight = previewProfile.size.height; // Update the height of the XComponent.
      console.info(`initCamera imageWidth:${this.imageWidth} imageHeight:${this.imageHeight}`);
      // Create the first preview output by using imageReceiverSurfaceId.
      this.previewOutput1 = this.cameraManager.createPreviewOutput(previewProfile, this.imageReceiverSurfaceId);
      if (!this.previewOutput1) {
        console.error('initCamera createPreviewOutput1');
      }
      // Create the second preview output by using xComponentSurfaceId.
      this.previewOutput2 = this.cameraManager.createPreviewOutput(previewProfile, this.xComponentSurfaceId);
      if (!this.previewOutput2) {
        console.error('initCamera createPreviewOutput2');
      }
      // Create a camera session in recording mode.
      this.session = this.cameraManager.createSession(camera.SceneMode.NORMAL_VIDEO) as camera.VideoSession;
      if (!this.session) {
        console.error('initCamera createSession');
      }
      // Start configuration for the session.
      this.session.beginConfig();
      // Add the camera input.
      this.session.addInput(this.cameraInput);
      // Add the output of the first preview stream.
      this.session.addOutput(this.previewOutput1);
      // Add the output of the second preview stream.
      this.session.addOutput(this.previewOutput2);
      // Commit the session configuration.
      await this.session.commitConfig();
      // Start the configured input and output streams.
      await this.session.start();
    } catch (error) {
      console.error(`initCamera fail: ${JSON.stringify(error)}`);
    }
  }

  // Release the camera.
  async releaseCamera(): Promise<void> {
    console.info('releaseCamera');
    try {
      // Stop the session.
      await this.session?.stop();
      // Release the camera input stream.
      await this.cameraInput?.close();
      // Release the first preview output stream.
      await this.previewOutput1?.release();
      // Release the second preview output stream.
      await this.previewOutput2?.release();
      // Release the session.
      await this.session?.release();
    } catch (error) {
      console.error(`releaseCamera fail: ${JSON.stringify(error)}`);
    }
  }
}
```
417