# Multimedia Development
<!--Kit: Camera Kit-->
<!--Subsystem: Multimedia-->
<!--Owner: @qano-->
<!--Designer: @leo_ysl-->
<!--Tester: @xchaosioda-->
<!--Adviser: @zengyawen-->

## How do I obtain the frame data of a camera when using the XComponent to display the preview output stream of the camera? (API version 9)

**Symptom**

Currently, the API does not support real-time preview of the camera frame data. To obtain the frame data, you must bind an action, for example, photographing.

**Solution**

Create a dual-channel preview to obtain the frame data.

1. Use the XComponent to create a preview stream.

   ```
   // Obtain a PreviewOutput instance.
   const surfaceId = globalThis.mXComponentController.getXComponentSurfaceId();
   this.mPreviewOutput = await Camera.createPreviewOutput(surfaceId);
   ```

2. Use **imageReceiver** to listen for image information. (A sketch of reading the frames follows this list.)

   ```
   // Add dual-channel preview.
   const fullWidth = this.mFullScreenSize.width;
   const fullHeight = this.mFullScreenSize.height;
   const imageReceiver = await image.createImageReceiver(fullWidth, fullHeight,
     formatImage, capacityImage);
   const photoSurfaceId = await imageReceiver.getReceivingSurfaceId();
   this.mPreviewOutputDouble = await Camera.createPreviewOutput(photoSurfaceId);
   ```

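After the dual-channel preview is configured, the frame data can be read from **imageReceiver**. The following is a minimal sketch (not part of the original sample) that registers an **imageArrival** listener and reads each frame as it arrives; frame processing is only indicated by a comment.

```
imageReceiver.on('imageArrival', () => {
  imageReceiver.readNextImage((err, nextImage) => {
    if (err || nextImage === undefined) {
      return;
    }
    nextImage.getComponent(image.ComponentType.JPEG, (err, imgComponent) => {
      if (imgComponent !== undefined) {
        // imgComponent.byteBuffer holds the frame data of this preview frame.
      }
      nextImage.release();
    });
  });
});
```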

## How do I obtain the preview image of the front camera? (API version 9)

**Solution**

1. Use the \@ohos.multimedia.camera module to obtain the physical camera information.

   ```
   let cameraManager = await camera.getCameraManager(context);
   let camerasInfo = await cameraManager.getSupportedCameras();
   // Select the front camera, for example, by checking cameraPosition.
   let cameraDevice = camerasInfo[0];
   ```

2. Create and start the input stream channel of the physical camera.

   ```
   let cameraInput = await cameraManager.createCameraInput(cameraDevice);
   await cameraInput.open();
   ```

3. Obtain the output formats supported by the camera, and create a preview output channel based on the surface ID provided by the XComponent.

   ```
   let outputCapability = await cameraManager.getSupportedOutputCapability(cameraDevice);
   let previewProfile = outputCapability.previewProfiles[0];
   // surfaceId is obtained from XComponentController.getXComponentSurfaceId().
   let previewOutput = await cameraManager.createPreviewOutput(previewProfile, surfaceId);
   ```

4. Create a camera session, add the camera input stream and preview output stream to the session, and start the session. The preview image is displayed on the XComponent.

   ```
   let captureSession = await cameraManager.createCaptureSession();
   await captureSession.beginConfig();
   await captureSession.addInput(cameraInput);
   await captureSession.addOutput(previewOutput);
   await captureSession.commitConfig();
   await captureSession.start();
   ```


## How do I set the camera focal length? (API version 9)

**Solution**

1. Check whether the camera is a front camera. A front camera does not support focal length setting.

2. Use **captureSession.getZoomRatioRange()** to obtain the focal length range supported by the device.

3. Check whether the target focal length is within the range obtained. If it is, call **captureSession.setZoomRatio()** to set the focal length. A minimal sketch follows this list.

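The following sketch shows steps 2 and 3, assuming **captureSession** is a configured **CaptureSession** and using the asynchronous API version 9 signatures; **targetZoom** is an example value chosen by the application.

```
let zoomRatioRange: Array<number> = await captureSession.getZoomRatioRange();
let targetZoom = 2; // example value
if (targetZoom >= zoomRatioRange[0] && targetZoom <= zoomRatioRange[1]) {
  await captureSession.setZoomRatio(targetZoom);
}
```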

## What should I do when multiple video components cannot be used for playback? (API version 9)

**Symptom**

When a large number of video components are created, they cannot play media properly, and the application may even crash.

**Solution**

A maximum of 13 media player instances can be created. Keep the number of instances within this limit.


## How do I invoke the image library directly? (API version 9)

**Solution**

```
import common from '@ohos.app.ability.common';

let want = {
  bundleName: 'com.ohos.photos',
  abilityName: 'com.ohos.photos.MainAbility',
  parameters: {
    uri: 'detail'
  }
};
let context = getContext(this) as common.UIAbilityContext;
context.startAbility(want);
```


## How do I apply for the media read/write permission on a device? (API version 9)

Applicable to: stage model

**Solution**

1. Configure the **ohos.permission.READ_MEDIA** and **ohos.permission.WRITE_MEDIA** permissions in the **module.json5** file.

   Example:

   ```
   {
     "module": {
       "requestPermissions": [
         {
           "name": "ohos.permission.READ_MEDIA",
           "reason": "$string:reason"
         },
         {
           "name": "ohos.permission.WRITE_MEDIA",
           "reason": "$string:reason"
         }
       ]
     }
   }
   ```

2. Call **requestPermissionsFromUser** to request the permissions from end users in the form of a dialog box. This step is required because the grant mode of both permissions is **user_grant**.

   ```
   import abilityAccessCtrl from '@ohos.abilityAccessCtrl';
   import common from '@ohos.app.ability.common';

   let context = getContext(this) as common.UIAbilityContext;
   let atManager = abilityAccessCtrl.createAtManager();
   let permissions: Array<string> = ['ohos.permission.READ_MEDIA', 'ohos.permission.WRITE_MEDIA'];
   atManager.requestPermissionsFromUser(context, permissions)
     .then((data) => {
       console.log('Succeeded in requesting permissions from user, data: ' + JSON.stringify(data));
     })
     .catch((error) => {
       console.log('Failed to request permissions from user, error: ' + JSON.stringify(error));
     });
   ```


## How do I obtain the camera status? (API version 9)

Applicable to: stage model

**Solution**

The **CameraManager** class provides the **on('cameraStatus')** API for subscribing to camera status changes.

```
cameraManager.on('cameraStatus', (cameraStatusInfo) => {
  console.log(`camera: ${cameraStatusInfo.camera.cameraId}`);
  console.log(`status: ${cameraStatusInfo.status}`);
})
```

**CameraStatus** enumerates the camera statuses:

- **CAMERA_STATUS_APPEAR** (0): A camera appears.
- **CAMERA_STATUS_DISAPPEAR** (1): The camera disappears.
- **CAMERA_STATUS_AVAILABLE** (2): The camera is available.
- **CAMERA_STATUS_UNAVAILABLE** (3): The camera is unavailable.

**References**

[CameraStatus](../reference/apis-camera-kit/arkts-apis-camera-CameraManager.md#oncamerastatus)

## Does SoundPool support audio in WMV format? Which formats are supported? (API version 10)

**Solution**

Currently, WMV is not supported. The supported formats are AAC, MPEG (MP3), FLAC, and Vorbis.

**References**

The formats supported by **SoundPool** are the same as those supported by the underlying audio decoder. For details about the supported formats, see [Audio Decoding](../media/avcodec/audio-decoding.md).

## How do I read the preview image of the camera? (API version 10)

**Solution**

You can call **ImageReceiver.readLatestImage()** to obtain the preview image of the camera.

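A minimal sketch, assuming **receiver** is the **ImageReceiver** whose surface ID was used to create the camera preview output:

```
let img: image.Image = await receiver.readLatestImage();
let component: image.Component = await img.getComponent(image.ComponentType.JPEG);
// component.byteBuffer holds the preview frame data.
await img.release();
```
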
**References**

[readLatestImage](../reference/apis-image-kit/arkts-apis-image-ImageReceiver.md#readlatestimage9)

## How do I listen for recordings? (API version 10)

**Solution**

System-wide audio listening is provided by **AudioStreamManager**. You can call **on(type: 'audioCapturerChange', callback: Callback\<AudioCapturerChangeInfoArray>): void** to listen for audio capturer changes.

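A minimal sketch of subscribing to audio capturer changes (the log content is an example):

```
import audio from '@ohos.multimedia.audio';

let audioStreamManager = await audio.getAudioManager().getStreamManager();
audioStreamManager.on('audioCapturerChange', (capturerChangeInfoArray) => {
  for (let info of capturerChangeInfoArray) {
    console.info(`stream id: ${info.streamId}, source type: ${info.capturerInfo.source}`);
  }
});
```
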
**References**

[onaudiocapturerchange](../reference/apis-audio-kit/arkts-apis-audio-AudioStreamManager.md#onaudiocapturerchange9)

## In which audio processing scenarios are 3A algorithms (AEC, ANC, and AGC) embedded? If they are embedded, is there any API related to audio 3A processing? How do I call them? Are independent switches provided for the 3A algorithms? Does the system support 3A in recording scenarios? If not, what is the solution? For example, how do I ensure the sound quality of audio recording when playing music? (API version 10)

**Solution**

The embedded 3A processing is automatically enabled for audio streams that use the **STREAM_USAGE_VOICE_COMMUNICATION** configuration. Currently, an independent switch is not provided. 3A is supported in recording scenarios: configure **AudioScene** and **SourceType** to enable 3A processing during recording.

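A sketch of one possible recording-side configuration that uses the voice-communication source type; the stream parameters are example values, not mandated by the API:

```
import audio from '@ohos.multimedia.audio';

let audioCapturerOptions: audio.AudioCapturerOptions = {
  streamInfo: {
    samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000,
    channels: audio.AudioChannel.CHANNEL_2,
    sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
    encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
  },
  capturerInfo: {
    source: audio.SourceType.SOURCE_TYPE_VOICE_COMMUNICATION,
    capturerFlags: 0
  }
};
let audioCapturer = await audio.createAudioCapturer(audioCapturerOptions);
```
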
**References**

[AudioCapturer](../reference/apis-audio-kit/arkts-apis-audio-AudioCapturer.md)

## How do I implement low-latency audio recording? (API version 11)

**Solution**

To implement low-latency audio recording, use the C APIs provided by the **AudioCapturer** class of the OHAudio module. For details, see [Using OHAudio for Audio Recording (C/C++)](../media/audio/using-ohaudio-for-recording.md).

**References**

[OHAudio](../reference/apis-audio-kit/capi-ohaudio.md)

## How do I implement real-time video stream transmission? How do I implement live broadcast? (API version 10)

**Solution**

Currently, the AVPlayer supports HTTP, HTTPS, and HLS for real-time video stream transmission. In the live broadcast scenario, the AVPlayer can play the stream as soon as it receives the live broadcast address. Stream pushing is not supported yet, which means that the AVPlayer cannot be used to broadcast live from the current device.

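A minimal playback sketch, assuming the live broadcast address is an HLS URL (the address below is a placeholder):

```
import media from '@ohos.multimedia.media';

let avPlayer: media.AVPlayer = await media.createAVPlayer();
avPlayer.on('stateChange', (state) => {
  if (state === 'initialized') {
    avPlayer.prepare();
  } else if (state === 'prepared') {
    avPlayer.play();
  }
});
// Placeholder address: replace it with the actual live broadcast URL.
avPlayer.url = 'https://xxx.xxx/live/index.m3u8';
```
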
**References**

- [Media Kit](../media/media/media-kit-intro.md)
- [AVPlayer](../media/media/using-avplayer-for-playback.md)

## How do I enable the AVPlayer to play in the background? (API version 10)

**Solution**

To continue background playback, the application must request a continuous task and register the AVSession with the system for unified management.

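A sketch of the two prerequisites, assuming the code runs in a page whose context is a UIAbility context; the session tag is an example value:

```
import avSession from '@ohos.multimedia.avsession';
import backgroundTaskManager from '@ohos.resourceschedule.backgroundTaskManager';
import wantAgent from '@ohos.app.ability.wantAgent';
import common from '@ohos.app.ability.common';

let context = getContext(this) as common.UIAbilityContext;

// 1. Register an AVSession so that the system can manage the playback.
let session = await avSession.createAVSession(context, 'player-demo', 'audio');
await session.activate();

// 2. Request a continuous task for audio playback.
let wantAgentInfo: wantAgent.WantAgentInfo = {
  wants: [{ bundleName: context.abilityInfo.bundleName, abilityName: context.abilityInfo.name }],
  operationType: wantAgent.OperationType.START_ABILITY,
  requestCode: 0
};
let agent = await wantAgent.getWantAgent(wantAgentInfo);
await backgroundTaskManager.startBackgroundRunning(context,
  backgroundTaskManager.BackgroundMode.AUDIO_PLAYBACK, agent);
```

The continuous task also requires the **ohos.permission.KEEP_BACKGROUND_RUNNING** permission and the **audioPlayback** background mode declared in **module.json5**.
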
**References**

- [Continuous Task](https://gitcode.com/openharmony/applications_app_samples/tree/master/code/BasicFeature/TaskManagement/ContinuousTask)
- [Accessing AVSession](../media/avsession/avsession-access-scene.md)

## Why can't a third-party application create albums? (API version 10)

**Symptom**

The read and write permissions of album resources are set to the system_basic level, and the APIs for creating albums are designed as system APIs. What is the reason for this design?

**Solution**

To protect the privacy of users' images and videos, any operation on these files must be performed with the users' awareness. Therefore, the read and write permissions are not granted to third-party applications. The system generates source albums based on the image and video storage sources. User-defined albums can be created only in Gallery, where images and videos can be dragged into the user-defined album area.

## How do I compress an image to a specified size? What are the factors affecting the size after compression? (API version 10)

**Symptom**

What is the relationship between the **quality** parameter in the image compression APIs and the original size and compressed size of an image? How do I set the target image size? For example, if I want to compress an image to 500 KB, how do I set the parameters?

**Solution**

The **quality** parameter affects the target image size for a lossy compression format (such as JPEG), but not for a lossless compression format (such as PNG).

For lossy compression formats, the compressed size depends on the original image size, the compression quality, and the image content. Therefore, the system does not support setting a target image size directly. If an application needs a specific size, it can adjust the **quality** parameter based on the compression result, or scale the pixel map to a smaller size before compressing it. A sketch of the quality-adjustment approach follows.

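A minimal sketch of the quality-adjustment loop, assuming **pixelMap** is the decoded **PixelMap** and 500 KB is the target size; the starting quality and step are example values:

```
import image from '@ohos.multimedia.image';

const TARGET_SIZE = 500 * 1024; // 500 KB
let imagePacker = image.createImagePacker();
let quality = 98;
let data = await imagePacker.packing(pixelMap, { format: 'image/jpeg', quality: quality });
while (data.byteLength > TARGET_SIZE && quality > 10) {
  quality -= 10;
  data = await imagePacker.packing(pixelMap, { format: 'image/jpeg', quality: quality });
}
```
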
**References**

- [scale](../reference/apis-image-kit/arkts-apis-image-PixelMap.md#scale9)
- [packing](../reference/apis-image-kit/arkts-apis-image-ImagePacker.md#packing13)