# @ohos.ai.mindSporeLite (Inference)

MindSpore Lite is an AI engine that implements AI model inference for different hardware devices. It has been used in a wide range of fields, such as image classification, target recognition, facial recognition, and character recognition.
The **mindSporeLite** module provides APIs for the MindSpore Lite inference engine to implement model inference.

> **NOTE**
>
> The initial APIs of this module are supported since API version 10. Newly added APIs will be marked with a superscript to indicate their earliest API version. Unless otherwise stated, the MindSpore model is used in the sample code.
>
> The APIs of this module can be used only in the stage model.

## Modules to Import

```ts
import mindSporeLite from '@ohos.ai.mindSporeLite';
```

## Context

Defines the configuration information of the running environment.

### Attributes

**System capability**: SystemCapability.AI.MindSporeLite

| Name  | Type                     | Readable| Writable| Description                                                        |
| ------ | ------------------------- | ---- | ---- | ------------------------------------------------------------ |
| target | string[]                  | Yes  | Yes  | Target backend. The value can be **cpu** or **nnrt**. The default value is **cpu**.                |
| cpu    | [CpuDevice](#cpudevice)   | Yes  | Yes  | CPU backend device option. Set this parameter only when **target** is set to **cpu**. The default value is the combination of the default values of the **CpuDevice** options.|
| nnrt   | [NNRTDevice](#nnrtdevice) | Yes  | Yes  | NNRt backend device option. Set this parameter only when **target** is set to **nnrt**. Currently, this parameter is empty.|

**Example**

```ts
let context: mindSporeLite.Context = {};
context.target = ['cpu', 'nnrt'];
```

## CpuDevice

Defines the CPU backend device option.

### Attributes

**System capability**: SystemCapability.AI.MindSporeLite

| Name                  | Type                                     | Readable| Writable| Description                                                        |
| ---------------------- | ----------------------------------------- | ---- | ---- | ------------------------------------------------------------ |
| threadNum              | number                                    | Yes  | Yes  | Number of runtime threads. The default value is **2**.                             |
| threadAffinityMode     | [ThreadAffinityMode](#threadaffinitymode) | Yes  | Yes  | Affinity mode for binding runtime threads to CPU cores. The default value is **mindSporeLite.ThreadAffinityMode.NO_AFFINITIES**.|
| threadAffinityCoreList | number[]                                  | Yes  | Yes  | List of CPU cores bound to runtime threads. Set this parameter only when **threadAffinityMode** is set. If **threadAffinityMode** is set to **mindSporeLite.ThreadAffinityMode.NO_AFFINITIES**, this parameter is empty. Each number in the list indicates the sequence number of a CPU core. The default value is **[]**.|
| precisionMode          | string                                    | Yes  | Yes  | Whether to enable the Float16 inference mode. The value **preferred_fp16** means to enable half-precision inference, and the default value **enforce_fp32** means to disable half-precision inference. Other settings are not supported.|

**Float16 inference mode**: a mode that uses half-precision inference. Float16 uses 16 bits to represent a number and therefore it is also called half-precision.

**Example**

```ts
let context: mindSporeLite.Context = {};
context.cpu = {};
context.target = ['cpu'];
context.cpu.threadNum = 2;
context.cpu.threadAffinityMode = 0;
context.cpu.precisionMode = 'preferred_fp16';
context.cpu.threadAffinityCoreList = [0, 1, 2];
```

## NNRTDevice

Represents an NNRt device. Neural Network Runtime (NNRt) is a bridge that connects the upper-layer AI inference framework to the bottom-layer acceleration chip to implement cross-chip inference and computing of AI models. An NNRt backend can be configured for MindSpore Lite. Currently, this API is not supported.

**System capability**: SystemCapability.AI.MindSporeLite

## ThreadAffinityMode

Specifies the affinity mode for binding runtime threads to CPU cores.

**System capability**: SystemCapability.AI.MindSporeLite

| Name              | Value  | Description        |
| ------------------ | ---- | ------------ |
| NO_AFFINITIES      | 0    | No affinities.    |
| BIG_CORES_FIRST    | 1    | Big cores first.|
| LITTLE_CORES_FIRST | 2    | Little cores first.|
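
**Example**

The enumerated values are typically assigned to the **threadAffinityMode** attribute of [CpuDevice](#cpudevice); the following is a minimal sketch based on the attributes described above:

```ts
let context: mindSporeLite.Context = {};
context.target = ['cpu'];
context.cpu = {};
// Bind runtime threads to big cores first.
context.cpu.threadAffinityMode = mindSporeLite.ThreadAffinityMode.BIG_CORES_FIRST;
```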

## mindSporeLite.loadModelFromFile

loadModelFromFile(model: string, callback: Callback&lt;Model&gt;): void

Loads the input model from the full path for model inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                     | Mandatory| Description                    |
| -------- | ------------------------- | ---- | ------------------------ |
| model    | string                    | Yes  | Complete path of the input model.    |
| callback | Callback<[Model](#model)> | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
let model_file : string = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```

## mindSporeLite.loadModelFromFile

loadModelFromFile(model: string, context: Context, callback: Callback&lt;Model&gt;): void

Loads the input model from the full path for model inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | string                              | Yes  | Complete path of the input model.  |
| context  | [Context](#context)                 | Yes  | Configuration information of the running environment.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
let context: mindSporeLite.Context = {};
context.target = ['cpu'];
let model_file : string = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file, context, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```

## mindSporeLite.loadModelFromFile

loadModelFromFile(model: string, context?: Context): Promise&lt;Model&gt;

Loads the input model from the full path for model inference. This API uses a promise to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name | Type               | Mandatory| Description                |
| ------- | ------------------- | ---- | -------------------- |
| model   | string              | Yes  | Complete path of the input model.|
| context | [Context](#context) | No  | Configuration information of the running environment.|

**Return value**

| Type                     | Description                        |
| ------------------------- | ---------------------------- |
| Promise<[Model](#model)> | Promise used to return the result, which is a **Model** object.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```

## mindSporeLite.loadModelFromBuffer

loadModelFromBuffer(model: ArrayBuffer, callback: Callback&lt;Model&gt;): void

Loads the input model from the memory for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                     | Mandatory| Description                    |
| -------- | ------------------------- | ---- | ------------------------ |
| model    | ArrayBuffer               | Yes  | Memory that contains the input model.        |
| callback | Callback<[Model](#model)> | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

let modelName = '/path/to/xxx.ms';
getContext(this).resourceManager.getRawFileContent(modelName).then((buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  mindSporeLite.loadModelFromBuffer(modelBuffer, (result : mindSporeLite.Model) => {
    let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
    console.log(modelInputs[0].name);
  })
})
```

## mindSporeLite.loadModelFromBuffer

loadModelFromBuffer(model: ArrayBuffer, context: Context, callback: Callback&lt;Model&gt;): void

Loads the input model from the memory for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | ArrayBuffer                         | Yes  | Memory that contains the input model.|
| context  | [Context](#context)                 | Yes  | Configuration information of the running environment.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

let modelName = '/path/to/xxx.ms';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

globalContext.resourceManager.getRawFileContent(modelName).then((buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  let context: mindSporeLite.Context = {};
  context.target = ['cpu'];
  mindSporeLite.loadModelFromBuffer(modelBuffer, context, (result : mindSporeLite.Model) => {
    let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
    console.log(modelInputs[0].name);
  })
})
```

## mindSporeLite.loadModelFromBuffer

loadModelFromBuffer(model: ArrayBuffer, context?: Context): Promise&lt;Model&gt;

Loads the input model from the memory for inference. This API uses a promise to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name | Type               | Mandatory| Description                |
| ------- | ------------------- | ---- | -------------------- |
| model   | ArrayBuffer         | Yes  | Memory that contains the input model.    |
| context | [Context](#context) | No  | Configuration information of the running environment.|

**Return value**

| Type                           | Description                        |
| ------------------------------- | ---------------------------- |
| Promise<[Model](#model)> | Promise used to return the result, which is a **Model** object.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

let modelName = '/path/to/xxx.ms';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

globalContext.resourceManager.getRawFileContent(modelName).then((buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  mindSporeLite.loadModelFromBuffer(modelBuffer).then((result : mindSporeLite.Model) => {
    let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
    console.log(modelInputs[0].name);
  })
})
```

## mindSporeLite.loadModelFromFd

loadModelFromFd(model: number, callback: Callback&lt;Model&gt;): void

Loads the input model based on the specified file descriptor for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | number                              | Yes  | File descriptor of the input model.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import fs from '@ohos.file.fs';
let model_file = '/path/to/xxx.ms';
let file = fs.openSync(model_file, fs.OpenMode.READ_ONLY);
mindSporeLite.loadModelFromFd(file.fd, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```

## mindSporeLite.loadModelFromFd

loadModelFromFd(model: number, context: Context, callback: Callback&lt;Model&gt;): void

Loads the input model based on the specified file descriptor for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | number                              | Yes  | File descriptor of the input model.|
| context  | [Context](#context)                 | Yes  | Configuration information of the running environment.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import fs from '@ohos.file.fs';
let model_file = '/path/to/xxx.ms';
let context : mindSporeLite.Context = {};
context.target = ['cpu'];
let file = fs.openSync(model_file, fs.OpenMode.READ_ONLY);
mindSporeLite.loadModelFromFd(file.fd, context, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```

## mindSporeLite.loadModelFromFd

loadModelFromFd(model: number, context?: Context): Promise&lt;Model&gt;

Loads the input model based on the specified file descriptor for inference. This API uses a promise to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name | Type               | Mandatory| Description                |
| ------- | ------------------- | ---- | -------------------- |
| model   | number              | Yes  | File descriptor of the input model.  |
| context | [Context](#context) | No  | Configuration information of the running environment.|

**Return value**

| Type                     | Description                        |
| ------------------------- | ---------------------------- |
| Promise<[Model](#model)> | Promise used to return the result, which is a **Model** object.|

**Example**

```ts
import fs from '@ohos.file.fs';
let model_file = '/path/to/xxx.ms';
let file = fs.openSync(model_file, fs.OpenMode.READ_ONLY);
let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFd(file.fd);
let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
console.log(modelInputs[0].name);
```

## Model

Represents a **Model** instance, with properties and APIs defined.

In the following sample code, you first need to use [loadModelFromFile()](#mindsporeliteloadmodelfromfile), [loadModelFromBuffer()](#mindsporeliteloadmodelfrombuffer), or [loadModelFromFd()](#mindsporeliteloadmodelfromfd) to obtain a **Model** instance before calling related APIs.

### getInputs

getInputs(): MSTensor[]

Obtains the model input for inference.

**System capability**: SystemCapability.AI.MindSporeLite

**Return value**

| Type                   | Description              |
| ----------------------- | ------------------ |
| [MSTensor](#mstensor)[] | List of **MSTensor** objects representing the model inputs.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```

### predict

predict(inputs: MSTensor[], callback: Callback&lt;MSTensor[]&gt;): void

Executes model inference. This API uses an asynchronous callback to return the result. Ensure that the model object is not reclaimed while this API is being invoked.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name| Type                   | Mandatory| Description                      |
| ------ | ----------------------- | ---- | -------------------------- |
| inputs | [MSTensor](#mstensor)[] | Yes  | List of model inputs.  |
| callback | Callback<[MSTensor](#mstensor)[]> | Yes  | Callback used to return the result, which is a list of **MSTensor** objects.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  let model_file : string = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();

  modelInputs[0].setData(modelBuffer);
  mindSporeLiteModel.predict(modelInputs, (result : mindSporeLite.MSTensor[]) => {
    let output = new Float32Array(result[0].getData());
    for (let i = 0; i < output.length; i++) {
      console.log(output[i].toString());
    }
  })
})
```

### predict

predict(inputs: MSTensor[]): Promise&lt;MSTensor[]&gt;

Executes model inference. This API uses a promise to return the result. Ensure that the model object is not reclaimed while this API is being invoked.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name| Type                   | Mandatory| Description                          |
| ------ | ----------------------- | ---- | ------------------------------ |
| inputs | [MSTensor](#mstensor)[] | Yes  | List of model inputs.  |

**Return value**

| Type                   | Description                  |
| ----------------------- | ---------------------- |
| Promise<[MSTensor](#mstensor)[]> | Promise used to return the result, which is a list of **MSTensor** objects.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let modelBuffer = buffer.buffer;
  let model_file = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  modelInputs[0].setData(modelBuffer);
  mindSporeLiteModel.predict(modelInputs).then((result : mindSporeLite.MSTensor[]) => {
    let output = new Float32Array(result[0].getData());
    for (let i = 0; i < output.length; i++) {
      console.log(output[i].toString());
    }
  })
})
```

### resize

resize(inputs: MSTensor[], dims: Array&lt;Array&lt;number&gt;&gt;): boolean

Resets the tensor size.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name| Type                 | Mandatory| Description                         |
| ------ | --------------------- | ---- | ----------------------------- |
| inputs | [MSTensor](#mstensor)[]            | Yes  | List of model inputs. |
| dims   | Array&lt;Array&lt;number&gt;&gt; | Yes  | Target tensor size.|

**Return value**

| Type   | Description                                                        |
| ------- | ------------------------------------------------------------ |
| boolean | Result indicating whether the setting is successful. The value **true** indicates that the tensor size is successfully reset, and the value **false** indicates the opposite.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  let new_dim: Array<Array<number>> = [[1, 32, 32, 1]];
  mindSporeLiteModel.resize(modelInputs, new_dim);
})
```

## MSTensor

Represents an **MSTensor** instance, with properties and APIs defined. It is a special data structure similar to arrays and matrices. It is the basic data structure used in MindSpore Lite network operations.

In the following sample code, you first need to use [getInputs()](#getinputs) to obtain an **MSTensor** instance before calling related APIs.

### Attributes

**System capability**: SystemCapability.AI.MindSporeLite

| Name      | Type                 | Readable| Writable| Description                                                |
| ---------- | --------------------- | ---- | ---- | ---------------------------------------------------- |
| name       | string                | Yes  | Yes  | Tensor name. The default value is **null**.                              |
| shape      | number[]              | Yes  | Yes  | Tensor dimension array. The default value is **0**.                           |
| elementNum | number                | Yes  | Yes  | Length of the tensor dimension array. The default value is **0**.                     |
| dataSize   | number                | Yes  | Yes  | Length of tensor data. The default value is **0**.                         |
| dtype      | [DataType](#datatype) | Yes  | Yes  | Tensor data type. The default value is **0**, indicating **TYPE_UNKNOWN**.       |
| format     | [Format](#format)     | Yes  | Yes  | Tensor data format. The default value is **-1**, indicating **DEFAULT_FORMAT**.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  console.log(modelInputs[0].name);
  console.log(modelInputs[0].shape.toString());
  console.log(modelInputs[0].elementNum.toString());
  console.log(modelInputs[0].dtype.toString());
  console.log(modelInputs[0].format.toString());
  console.log(modelInputs[0].dataSize.toString());
})
```

### getData

getData(): ArrayBuffer

Obtains tensor data.

**System capability**: SystemCapability.AI.MindSporeLite

**Return value**

| Type       | Description                |
| ----------- | -------------------- |
| ArrayBuffer | Buffer that holds the tensor data.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;
let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let inputBuffer = buffer.buffer;
  let model_file = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  modelInputs[0].setData(inputBuffer);
  mindSporeLiteModel.predict(modelInputs).then((result : mindSporeLite.MSTensor[]) => {
    let output = new Float32Array(result[0].getData());
    for (let i = 0; i < output.length; i++) {
      console.log(output[i].toString());
    }
  })
})
```

### setData

setData(inputArray: ArrayBuffer): void

Sets the tensor data.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name    | Type       | Mandatory| Description                  |
| ---------- | ----------- | ---- | ---------------------- |
| inputArray | ArrayBuffer | Yes  | Input data buffer of the tensor.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';

export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;
let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let inputBuffer = buffer.buffer;
  let model_file = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  modelInputs[0].setData(inputBuffer);
})
```

## DataType

Tensor data type.

**System capability**: SystemCapability.AI.MindSporeLite

| Name               | Value  | Description               |
| ------------------- | ---- | ------------------- |
| TYPE_UNKNOWN        | 0    | Unknown type.         |
| NUMBER_TYPE_INT8    | 32   | Int8 type.   |
| NUMBER_TYPE_INT16   | 33   | Int16 type.  |
| NUMBER_TYPE_INT32   | 34   | Int32 type.  |
| NUMBER_TYPE_INT64   | 35   | Int64 type.  |
| NUMBER_TYPE_UINT8   | 37   | UInt8 type.  |
| NUMBER_TYPE_UINT16  | 38   | UInt16 type. |
| NUMBER_TYPE_UINT32  | 39   | UInt32 type. |
| NUMBER_TYPE_UINT64  | 40   | UInt64 type. |
| NUMBER_TYPE_FLOAT16 | 42   | Float16 type.|
| NUMBER_TYPE_FLOAT32 | 43   | Float32 type.|
| NUMBER_TYPE_FLOAT64 | 44   | Float64 type.|
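
**Example**

A minimal sketch that checks the data type of the first model input against this enum (assuming the enum is accessed as **mindSporeLite.DataType**, in line with how **ThreadAffinityMode** is referenced above, and that the model expects Float32 input):

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  // Check the declared data type of the first input tensor before filling it.
  if (modelInputs[0].dtype === mindSporeLite.DataType.NUMBER_TYPE_FLOAT32) {
    console.log('input 0 expects Float32 data');
  }
})
```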

## Format

Enumerates tensor data formats.

**System capability**: SystemCapability.AI.MindSporeLite

| Name          | Value  | Description                 |
| -------------- | ---- | --------------------- |
| DEFAULT_FORMAT | -1   | Unknown data format.   |
| NCHW           | 0    | NCHW format. |
| NHWC           | 1    | NHWC format. |
| NHWC4          | 2    | NHWC4 format.|
| HWKC           | 3    | HWKC format. |
| HWCK           | 4    | HWCK format. |
| KCHW           | 5    | KCHW format. |

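**Example**

A minimal sketch that logs the layout declared by the first model input (assuming the enum is accessed as **mindSporeLite.Format**, in line with how **ThreadAffinityMode** is referenced above):

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  // Compare the tensor's declared data format with the Format enum values.
  if (modelInputs[0].format === mindSporeLite.Format.NHWC) {
    console.log('input 0 uses the NHWC layout');
  } else if (modelInputs[0].format === mindSporeLite.Format.NCHW) {
    console.log('input 0 uses the NCHW layout');
  }
})
```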