# Distributed Camera Development


## Overview

OpenHarmony distributed camera implements collaboration across devices by breaking hardware boundaries. For example, after devices A and B running OpenHarmony are networked, the application on device A can call the camera resources of device B in real time to obtain images (preview stream, photo stream, or video stream) from device B. In addition, in-depth controls such as resolution adjustment and settings synchronization are supported on device A. Distributed camera enables the following scenarios:
- Collaborative creation with multiple users
- Remote collaboration with experts
- Immersive security systems
- Distributed audio and video interaction


### Basic Concepts

Before starting, you are advised to read the following topics to gain a basic understanding of the related features:
- [UIAbility Connection Development](abilityconnectmanager-guidelines.md)
- [Camera Device Management (ArkTS)](../media/camera/camera-device-management.md)
- [Camera Development Preparations](../media/camera/camera-preparation.md)
- [Camera Session Management (ArkTS)](../media/camera/camera-session-management.md)
- [Photo Capture (ArkTS)](../media/camera/camera-shooting.md)
- [Video Recording (ArkTS)](../media/camera/camera-recording.md)


## Preparing the Environment

### Environment Requirements

Device A and device B are successfully networked.


### Environment Setup

1. Install [DevEco Studio](https://developer.huawei.com/consumer/en/download/) 5.0 or later.
2. Update the public SDK to API version 16 or later.
3. Connect device A and device B to the PC using USB cables.
4. Connect device A and device B to the same Wi-Fi network, have them identify each other, and start networking. For details, see [UIAbility Connection Development](abilityconnectmanager-guidelines.md#how-to-develop).

### Environment Verification

Run the following shell commands on the PC:

```shell
hdc shell
hidumper -s 4700 -a "buscenter -l remote_device_info"
```

If the networking is successful, the number of networked devices is displayed, for example, **remote device num = 1**.


## How to Develop

OpenHarmony pools the cameras of multiple devices to provide users with the capability of using cameras across devices.

### Development Process

The figure below shows the recommended development process.


### Development Procedure

#### Importing the Camera and Multimedia Modules

```ts
import { camera } from '@kit.CameraKit';
import { media } from '@kit.MediaKit';
```

#### Granting the Access Permissions to the Application

The application should apply for the required permissions, which include but are not limited to the following:
- For accessing the location information of an image or a video: ohos.permission.MEDIA_LOCATION
- For reading files: ohos.permission.READ_MEDIA
- For writing files: ohos.permission.WRITE_MEDIA
- For using the camera: ohos.permission.CAMERA
- For multi-device collaboration: ohos.permission.DISTRIBUTED_DATASYNC

For example, you can call **requestPermissionsFromUser()** to request the corresponding permissions for the UIAbility.
```ts
// EntryAbility.ets
import { UIAbility, Want, AbilityConstant, abilityAccessCtrl, Permissions } from '@kit.AbilityKit';
import { BusinessError } from '@kit.BasicServicesKit';

export default class EntryAbility extends UIAbility {
  onCreate(want: Want, launchParam: AbilityConstant.LaunchParam) {
    console.info('Sample_VideoRecorder', 'Ability onCreate, requestPermissionsFromUser');
    let permissionNames: Array<Permissions> = ['ohos.permission.MEDIA_LOCATION', 'ohos.permission.READ_MEDIA',
      'ohos.permission.WRITE_MEDIA', 'ohos.permission.CAMERA', 'ohos.permission.MICROPHONE', 'ohos.permission.DISTRIBUTED_DATASYNC'];
    abilityAccessCtrl.createAtManager().requestPermissionsFromUser(this.context, permissionNames)
      .then((data) => {
        console.log('testTag', JSON.stringify(data));
      })
      .catch((err: BusinessError) => {
        console.error('testTag', err.message);
      });
  }
}
```


#### Initiating the Preview Stream and Photo Stream on the Distributed Camera

##### 1. Obtaining the Camera Information of a Remote Device

After the networking is successful, use **getCameraManager()** to obtain the camera manager instance and **getSupportedCameras()** to obtain the supported camera device objects.

```ts
private cameras?: Array<camera.CameraDevice>;
private cameraManager?: camera.CameraManager;
private cameraOutputCapability?: camera.CameraOutputCapability;
private cameraIndex: number = 0;
private curVideoProfiles?: Array<camera.VideoProfile>;

initCamera(): void {
  console.info('init remote camera called');
  if (this.cameraManager) {
    console.info('cameraManager already exists');
    return;
  }
  console.info('[camera] case to get cameraManager');
  this.cameraManager = camera.getCameraManager(globalThis.abilityContext);
  if (this.cameraManager) {
    console.info('[camera] case getCameraManager success');
  } else {
    console.error('[camera] case getCameraManager failed');
    return;
  }
  this.cameras = this.cameraManager.getSupportedCameras();
  if (this.cameras) {
    console.info('[camera] case getCameras success, size', this.cameras.length);
    for (let i = 0; i < this.cameras.length; i++) {
      let cameraDevice: camera.CameraDevice = this.cameras[i];
      console.info('[camera] camera json:', JSON.stringify(cameraDevice));
      // A distributed camera is identified by its remote connection type.
      if (cameraDevice.connectionType == camera.ConnectionType.CAMERA_CONNECTION_REMOTE) {
        this.cameraIndex = i;
        this.cameraOutputCapability = this.cameraManager.getSupportedOutputCapability(cameraDevice);
        this.curVideoProfiles = this.cameraOutputCapability.videoProfiles;
        console.info('init remote camera done'); // The remote camera is successfully initialized.
        break;
      }
    }
  } else {
    console.error('[camera] case getCameras failed');
  }
}
```

##### 2. Creating a CameraInput Instance

After obtaining the **CameraManager** instance and the supported camera device object, call **createCameraInput()** to create a **CameraInput** instance.

```ts
// create camera input
async createCameraInput(): Promise<void> {
  console.log('createCameraInput called');
  if (this.cameras && this.cameras.length > 0) {
    let cameraDevice: camera.CameraDevice = this.cameras[this.cameraIndex];
    console.log('[camera] createCameraInput camera json:', JSON.stringify(cameraDevice));
    this.cameraInput = this.cameraManager?.createCameraInput(cameraDevice);
    if (this.cameraInput) {
      console.log('[camera] case createCameraInput success');
      await this.cameraInput.open().then(() => {
        console.log('[camera] case cameraInput.open() success');
      }).catch((err: Error) => {
        console.error('[camera] cameraInput.open() error:', JSON.stringify(err));
      });
    } else {
      console.error('[camera] case createCameraInput failed');
      return;
    }
  }
}
```

##### 3. Obtaining the PreviewOutput Object

Use **createPreviewOutput()** to create a **PreviewOutput** object.

```ts
private previewOutput?: camera.PreviewOutput;
// this.avProfile is an AVRecorderProfile defined elsewhere in the sample.
private avConfig: media.AVRecorderConfig = {
  videoSourceType: media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV,
  profile: this.avProfile,
  url: 'fd://',
}

// create camera preview
async createPreviewOutput(): Promise<void> {
  console.log('createPreviewOutput called');
  if (this.cameraOutputCapability && this.cameraManager) {
    this.previewProfiles = this.cameraOutputCapability.previewProfiles;
    console.log('[camera] this.previewProfiles json', JSON.stringify(this.previewProfiles));
    if (this.previewProfiles[0].format === camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP) {
      console.log('[camera] case format is VIDEO_SOURCE_TYPE_SURFACE_YUV');
      this.avConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV;
    } else {
      console.log('[camera] case format is VIDEO_SOURCE_TYPE_SURFACE_ES');
      this.avConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_ES;
    }
    // this.surfaceId is obtained from the XComponent that renders the preview.
    this.previewOutput = this.cameraManager.createPreviewOutput(this.previewProfiles[0], this.surfaceId);
    if (!this.previewOutput) {
      console.error('create previewOutput failed!');
    }
    console.log('createPreviewOutput done');
  }
}
```


##### 4. Obtaining the PhotoOutput Object

Use **createPhotoOutput()** to create a **PhotoOutput** object and **createImageReceiver()** to create an **ImageReceiver** instance.

```ts
import { image } from '@kit.ImageKit';
import fileio from '@ohos.fileio';

// SaveCameraAsset is a helper class from the sample project that creates
// and closes media file descriptors.
private photoReceiver?: image.ImageReceiver;
private photoOutput?: camera.PhotoOutput;
private mSaveCameraAsset: SaveCameraAsset = new SaveCameraAsset('Sample_VideoRecorder');

async getImageFileFd(): Promise<void> {
  console.info('getImageFileFd called');
  this.mFileAssetId = await this.mSaveCameraAsset.createImageFd();
  this.fdPath = 'fd://' + this.mFileAssetId.toString();
  this.avConfig.url = this.fdPath;
  console.info('ImageFileFd is: ' + this.fdPath);
  console.info('getImageFileFd done');
}

// close file fd
async closeFd(): Promise<void> {
  console.info('case closeFd called');
  if (this.mSaveCameraAsset) {
    await this.mSaveCameraAsset.closeVideoFile();
    this.mFileAssetId = undefined;
    this.fdPath = undefined;
    console.info('case closeFd done');
  }
}

async createPhotoOutput() {
  const photoProfile: camera.Profile = {
    format: camera.CameraFormat.CAMERA_FORMAT_JPEG,
    size: {
      "width": 1280,
      "height": 720
    }
  }
  if (!this.cameraManager) {
    console.error('createPhotoOutput cameraManager is null')
  }
  if (!this.photoReceiver) {
    this.photoReceiver = image.createImageReceiver(photoProfile.size.width, photoProfile.size.height, photoProfile.format, 8)
    this.photoReceiver.on("imageArrival", () => {
      this.photoReceiver?.readNextImage((err, image) => {
        if (err || image === undefined) {
          console.error('photoReceiver imageArrival on error')
          return
        }
        image.getComponent(4, async (err, img) => {
          if (err || img === undefined) {
            console.error('image getComponent on error')
            return
          }
          await this.getImageFileFd()
          fileio.write(this.mFileAssetId, img.byteBuffer)
          await this.closeFd()
          await image.release()
          console.log('photoReceiver image.getComponent save success')
        })
      })
    })
    await this.photoReceiver.getReceivingSurfaceId().then((surfaceId: string) => {
      this.photoOutput = this.cameraManager?.createPhotoOutput(photoProfile, surfaceId)
      if (!this.photoOutput) {
        console.error('cameraManager.createPhotoOutput on error')
      }
      console.log('cameraManager.createPhotoOutput success')
      this.photoOutput?.on("captureStart", (err, captureId) => {
        console.log('photoOutput.on captureStart')
      })
    }).catch((err: Error) => {
      console.error('photoReceiver.getReceivingSurfaceId on error:' + err)
    })
  }
}
```

##### 5. Creating a CaptureSession Instance

Use **createCaptureSession()** to create a **CaptureSession** instance. Call **beginConfig()** to start configuring the session, call **addInput()** and **addOutput()** to add the **CameraInput** instance and camera outputs to the session, and call **commitConfig()** to commit the configuration; the result is returned by a promise.

```ts
private captureSession?: camera.CaptureSession;

failureCallback(error: BusinessError): void {
  console.error('case failureCallback called, errMessage is', JSON.stringify(error));
}

catchCallback(error: BusinessError): void {
  console.error('case catchCallback called, errMessage is', JSON.stringify(error));
}

// create camera capture session
async createCaptureSession(): Promise<void> {
  console.log('createCaptureSession called');
  if (this.cameraManager) {
    this.captureSession = this.cameraManager.createCaptureSession();
    if (!this.captureSession) {
      console.error('createCaptureSession failed!');
      return;
    }
    try {
      this.captureSession.beginConfig();
      this.captureSession.addInput(this.cameraInput);
    } catch (e) {
      console.error('case addInput error:' + JSON.stringify(e));
    }
    try {
      this.captureSession.addOutput(this.previewOutput);
    } catch (e) {
      console.error('case addOutput error:' + JSON.stringify(e));
    }
    await this.captureSession.commitConfig().then(() => {
      console.log('captureSession commitConfig success');
    }, this.failureCallback).catch(this.catchCallback);
  }
}
```

##### 6. Starting the Session

Use **start()** of the **CaptureSession** instance to start the session; the result is returned by a promise.

```ts
// start captureSession
async startCaptureSession(): Promise<void> {
  console.log('startCaptureSession called');
  if (!this.captureSession) {
    console.error('CaptureSession does not exist!');
    return;
  }
  await this.captureSession.start().then(() => {
    console.log('case start captureSession success');
  }, this.failureCallback).catch(this.catchCallback);
}
```

#### Releasing Distributed Camera Resources

After the service collaboration is complete, end the collaboration in a timely manner to release the distributed camera resources.

```ts
// Release the camera input.
async releaseCameraInput(): Promise<void> {
  console.log('releaseCameraInput called');
  if (this.cameraInput) {
    // Close the camera input before dropping the reference.
    await this.cameraInput.close();
    this.cameraInput = undefined;
  }
  console.log('releaseCameraInput done');
}

// Release the preview output.
async releasePreviewOutput(): Promise<void> {
  console.log('releasePreviewOutput called');
  if (this.previewOutput) {
    await this.previewOutput.release().then(() => {
      console.log('[camera] case main previewOutput release called');
    }, this.failureCallback).catch(this.catchCallback);
    this.previewOutput = undefined;
  }
  console.log('releasePreviewOutput done');
}

// Release the video output.
async releaseVideoOutput(): Promise<void> {
  console.log('releaseVideoOutput called');
  if (this.videoOutput) {
    await this.videoOutput.release().then(() => {
      console.log('[camera] case main videoOutput release called');
    }, this.failureCallback).catch(this.catchCallback);
    this.videoOutput = undefined;
  }
  console.log('releaseVideoOutput done');
}

// Stop the capture session.
async stopCaptureSession(): Promise<void> {
  console.log('stopCaptureSession called');
  if (this.captureSession) {
    await this.captureSession.stop().then(() => {
      console.log('[camera] case main captureSession stop success');
    }, this.failureCallback).catch(this.catchCallback);
  }
  console.log('stopCaptureSession done');
}

// Release the capture session.
async releaseCaptureSession(): Promise<void> {
  console.log('releaseCaptureSession called');
  if (this.captureSession) {
    await this.captureSession.release().then(() => {
      console.log('[camera] case main captureSession release success');
    }, this.failureCallback).catch(this.catchCallback);
    this.captureSession = undefined;
  }
  console.log('releaseCaptureSession done');
}

// Release all camera resources.
async releaseCamera(): Promise<void> {
  console.log('releaseCamera called');
  await this.stopCaptureSession();
  await this.releaseCameraInput();
  await this.releasePreviewOutput();
  await this.releaseVideoOutput();
  await this.releaseCaptureSession();
  console.log('releaseCamera done');
}
```

### Debugging and Verification

After application development is complete, install the application on device A and device B. The test procedure is as follows:

1. Device A starts the distributed camera on device B and initiates a preview. Device A can receive the preview stream.
2. Device A starts the distributed camera on device B and takes a photo. Device A can receive the photo.

## FAQs


### What should I do if the application on device A cannot start the camera on device B?

**Possible Causes**

The devices are not networked, or are disconnected after networking.

**Solution**

Enable USB debugging on device A and device B, and use USB cables to connect the devices to the PC.
Run the following shell commands on the PC:

```shell
hdc shell
hidumper -s 4700 -a "buscenter -l remote_device_info"
```

If **remote device num = 0** is displayed in the command output, the networking has failed. In this case, disable and then enable Wi-Fi, and connect the devices to the same Wi-Fi network again. If the networking is successful, run the shell commands again; the number of networked devices is displayed, for example, **remote device num = 1**.
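
If you script this check (for example, in a test harness that captures the `hdc shell` output), the device count can be extracted from the hidumper output programmatically. The following is a minimal sketch; the helper function is hypothetical and assumes only the output format shown above (`remote device num = N`):

```typescript
// Parse the number of networked devices from hidumper output.
// Returns the count after "remote device num =", or 0 when the
// marker is absent (which also indicates that networking failed).
function parseRemoteDeviceNum(dumpOutput: string): number {
  const match = dumpOutput.match(/remote device num\s*=\s*(\d+)/);
  return match ? Number(match[1]) : 0;
}

// Example: a successful check reports at least one remote device.
console.log(parseRemoteDeviceNum('... remote device num = 1 ...')); // 1
console.log(parseRemoteDeviceNum('no networking info'));            // 0
```

A count of at least 1 means networking succeeded and the distributed camera of the peer device should be discoverable via **getSupportedCameras()**.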