# @ohos.ai.mindSporeLite (Inference)

MindSpore Lite is an AI engine that implements AI model inference for different hardware devices. It has been used in a wide range of fields, such as image classification, target recognition, facial recognition, and character recognition.
The **mindSporeLite** module provides APIs for the MindSpore Lite inference engine to implement model inference.

> **NOTE**
>
> The initial APIs of this module are supported since API version 10. Newly added APIs will be marked with a superscript to indicate their earliest API version. Unless otherwise stated, the MindSpore model is used in the sample code.
>
> The APIs of this module can be used only in the stage model.

## Modules to Import
```ts
import mindSporeLite from '@ohos.ai.mindSporeLite';
```

## Context

Defines the configuration information of the running environment.

### Attributes

**System capability**: SystemCapability.AI.MindSporeLite

| Name  | Type                     | Readable| Writable| Description                                                        |
| ------ | ------------------------- | ---- | ---- | ------------------------------------------------------------ |
| target | string[]                  | Yes  | Yes  | Target backend. The value can be **cpu** or **nnrt**. The default value is **cpu**.                |
| cpu    | [CpuDevice](#cpudevice)   | Yes  | Yes  | CPU backend device option. Set this parameter only when **target** is set to **cpu**. The default value is the combination of the default values of the **CpuDevice** options.|
| nnrt   | [NNRTDevice](#nnrtdevice) | Yes  | Yes  | NNRt backend device option. Set this parameter only when **target** is set to **nnrt**. Currently, this parameter is empty.|

**Example**

```ts
let context: mindSporeLite.Context = {};
context.target = ['cpu','nnrt'];
```

## CpuDevice

Defines the CPU backend device option.

### Attributes

**System capability**: SystemCapability.AI.MindSporeLite

| Name                  | Type                                     | Readable| Writable| Description                                                        |
| ---------------------- | ----------------------------------------- | ---- | ---- | ------------------------------------------------------------ |
| threadNum              | number                                    | Yes  | Yes  | Number of runtime threads. The default value is **2**.                             |
| threadAffinityMode     | [ThreadAffinityMode](#threadaffinitymode) | Yes  | Yes  | Affinity mode for binding runtime threads to CPU cores. The default value is **mindSporeLite.ThreadAffinityMode.NO_AFFINITIES**.|
| threadAffinityCoreList | number[]                                  | Yes  | Yes  | List of CPU cores bound to runtime threads. Set this parameter only when **threadAffinityMode** is set. If **threadAffinityMode** is set to **mindSporeLite.ThreadAffinityMode.NO_AFFINITIES**, this parameter is empty. Each number in the list indicates the sequence number of a CPU core. The default value is **[]**.|
| precisionMode          | string                                    | Yes  | Yes  | Whether to enable the Float16 inference mode. The value **preferred_fp16** means to enable half-precision inference, and the default value **enforce_fp32** means to disable it. Other settings are not supported.|

**Float16 inference mode**: a mode that uses half-precision inference. Float16 uses 16 bits to represent a number; hence the name half-precision.

**Example**

```ts
let context: mindSporeLite.Context = {};
context.cpu = {};
context.target = ['cpu'];
context.cpu.threadAffinityMode = 0;
context.cpu.precisionMode = 'preferred_fp16';
context.cpu.threadAffinityCoreList = [0, 1, 2];
```

## NNRTDevice

Represents an NNRt device. Neural Network Runtime (NNRt) is a bridge that connects the upper-layer AI inference framework to the bottom-layer acceleration chip to implement cross-chip inference and computing of AI models. An NNRt backend can be configured for MindSpore Lite. Currently, this API is not supported.

**System capability**: SystemCapability.AI.MindSporeLite

## ThreadAffinityMode

Specifies the affinity mode for binding runtime threads to CPU cores.

**System capability**: SystemCapability.AI.MindSporeLite

| Name              | Value  | Description        |
| ------------------ | ---- | ------------ |
| NO_AFFINITIES      | 0    | No affinities.    |
| BIG_CORES_FIRST    | 1    | Big cores first.|
| LITTLE_CORES_FIRST | 2    | Little cores first.|

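To show how these modes interact with **threadAffinityCoreList** (see [CpuDevice](#cpudevice)), here is a plain-TypeScript sketch. The local constants mirror the values in the table above; on-device code would use **mindSporeLite.ThreadAffinityMode** instead, and the `effectiveCoreList` helper is illustrative, not part of the module API:

```typescript
// Values copied from the ThreadAffinityMode table above; on a device,
// use mindSporeLite.ThreadAffinityMode rather than local constants.
const NO_AFFINITIES = 0;
const BIG_CORES_FIRST = 1;
const LITTLE_CORES_FIRST = 2;

// Illustrative helper: per the CpuDevice docs, threadAffinityCoreList is
// only meaningful when a mode other than NO_AFFINITIES is selected.
function effectiveCoreList(mode: number, coreList: number[]): number[] {
  return mode === NO_AFFINITIES ? [] : coreList;
}

console.log(effectiveCoreList(BIG_CORES_FIRST, [4, 5, 6, 7])); // [4, 5, 6, 7]
console.log(effectiveCoreList(NO_AFFINITIES, [0, 1]));         // []
```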
## mindSporeLite.loadModelFromFile

loadModelFromFile(model: string, callback: Callback&lt;Model&gt;): void

Loads the input model from the full path for model inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                     | Mandatory| Description                    |
| -------- | ------------------------- | ---- | ------------------------ |
| model    | string                    | Yes  | Complete path of the input model.    |
| callback | Callback<[Model](#model)> | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
let model_file : string = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```
## mindSporeLite.loadModelFromFile

loadModelFromFile(model: string, context: Context, callback: Callback&lt;Model&gt;): void

Loads the input model from the full path for model inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | string                              | Yes  | Complete path of the input model.  |
| context  | [Context](#context)                 | Yes  | Configuration information of the running environment.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
let context: mindSporeLite.Context = {};
context.target = ['cpu'];
let model_file : string = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file, context, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```
## mindSporeLite.loadModelFromFile

loadModelFromFile(model: string, context?: Context): Promise&lt;Model&gt;

Loads the input model from the full path for model inference. This API uses a promise to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name | Type               | Mandatory| Description                |
| ------- | ------------------- | ---- | -------------------- |
| model   | string              | Yes  | Complete path of the input model.|
| context | [Context](#context) | No  | Configuration information of the running environment.|

**Return value**

| Type                     | Description                        |
| ------------------------- | ---------------------------- |
| Promise<[Model](#model)> | Promise used to return the result, which is a **Model** object.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```
## mindSporeLite.loadModelFromBuffer

loadModelFromBuffer(model: ArrayBuffer, callback: Callback&lt;Model&gt;): void

Loads the input model from the memory for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                     | Mandatory| Description                    |
| -------- | ------------------------- | ---- | ------------------------ |
| model    | ArrayBuffer               | Yes  | Memory that contains the input model.        |
| callback | Callback<[Model](#model)> | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**
```ts
// Construct a singleton object.
export class GlobalContext {
  private constructor() {}
  private static instance: GlobalContext;
  private _objects = new Map<string, Object>();

  public static getContext(): GlobalContext {
    if (!GlobalContext.instance) {
      GlobalContext.instance = new GlobalContext();
    }
    return GlobalContext.instance;
  }

  getObject(value: string): Object | undefined {
    return this._objects.get(value);
  }

  setObject(key: string, objectClass: Object): void {
    this._objects.set(key, objectClass);
  }
}
```

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

let modelName = '/path/to/xxx.ms';
globalContext.resourceManager.getRawFileContent(modelName).then((buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  mindSporeLite.loadModelFromBuffer(modelBuffer, (result : mindSporeLite.Model) => {
    let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
    console.log(modelInputs[0].name);
  })
})
```
## mindSporeLite.loadModelFromBuffer

loadModelFromBuffer(model: ArrayBuffer, context: Context, callback: Callback&lt;Model&gt;): void

Loads the input model from the memory for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | ArrayBuffer                         | Yes  | Memory that contains the input model.|
| context  | [Context](#context)                 | Yes  | Configuration information of the running environment.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
let modelName = '/path/to/xxx.ms';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

globalContext.resourceManager.getRawFileContent(modelName).then((buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  let context: mindSporeLite.Context = {};
  context.target = ['cpu'];
  mindSporeLite.loadModelFromBuffer(modelBuffer, context, (result : mindSporeLite.Model) => {
    let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
    console.log(modelInputs[0].name);
  })
})
```
## mindSporeLite.loadModelFromBuffer

loadModelFromBuffer(model: ArrayBuffer, context?: Context): Promise&lt;Model&gt;

Loads the input model from the memory for inference. This API uses a promise to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name | Type               | Mandatory| Description                |
| ------- | ------------------- | ---- | -------------------- |
| model   | ArrayBuffer         | Yes  | Memory that contains the input model.    |
| context | [Context](#context) | No  | Configuration information of the running environment.|

**Return value**

| Type                           | Description                        |
| ------------------------------- | ---------------------------- |
| Promise<[Model](#model)> | Promise used to return the result, which is a **Model** object.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
let modelName = '/path/to/xxx.ms';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

globalContext.resourceManager.getRawFileContent(modelName).then((buffer : Uint8Array) => {
  let modelBuffer : ArrayBuffer = buffer.buffer;
  mindSporeLite.loadModelFromBuffer(modelBuffer).then((result : mindSporeLite.Model) => {
    let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
    console.log(modelInputs[0].name);
  })
})
```
## mindSporeLite.loadModelFromFd

loadModelFromFd(model: number, callback: Callback&lt;Model&gt;): void

Loads the input model based on the specified file descriptor for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | number                              | Yes  | File descriptor of the input model.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import fs from '@ohos.file.fs';
let model_file = '/path/to/xxx.ms';
let file = fs.openSync(model_file, fs.OpenMode.READ_ONLY);
mindSporeLite.loadModelFromFd(file.fd, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```
## mindSporeLite.loadModelFromFd

loadModelFromFd(model: number, context: Context, callback: Callback&lt;Model&gt;): void

Loads the input model based on the specified file descriptor for inference. This API uses an asynchronous callback to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name  | Type                               | Mandatory| Description                  |
| -------- | ----------------------------------- | ---- | ---------------------- |
| model    | number                              | Yes  | File descriptor of the input model.|
| context  | [Context](#context)                 | Yes  | Configuration information of the running environment.|
| callback | Callback<[Model](#model)>           | Yes  | Callback used to return the result, which is a **Model** object.|

**Example**

```ts
import fs from '@ohos.file.fs';
let model_file = '/path/to/xxx.ms';
let context : mindSporeLite.Context = {};
context.target = ['cpu'];
let file = fs.openSync(model_file, fs.OpenMode.READ_ONLY);
mindSporeLite.loadModelFromFd(file.fd, context, (result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```
## mindSporeLite.loadModelFromFd

loadModelFromFd(model: number, context?: Context): Promise&lt;Model&gt;

Loads the input model based on the specified file descriptor for inference. This API uses a promise to return the result.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name | Type               | Mandatory| Description                |
| ------- | ------------------- | ---- | -------------------- |
| model   | number              | Yes  | File descriptor of the input model.  |
| context | [Context](#context) | No  | Configuration information of the running environment.|

**Return value**

| Type                     | Description                        |
| ------------------------- | ---------------------------- |
| Promise<[Model](#model)> | Promise used to return the result, which is a **Model** object.|

**Example**

```ts
import fs from '@ohos.file.fs';
let model_file = '/path/to/xxx.ms';
let file = fs.openSync(model_file, fs.OpenMode.READ_ONLY);
mindSporeLite.loadModelFromFd(file.fd).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  console.log(modelInputs[0].name);
})
```
## Model

Represents a **Model** instance, with properties and APIs defined.

In the following sample code, you first need to use [loadModelFromFile()](#mindsporeliteloadmodelfromfile), [loadModelFromBuffer()](#mindsporeliteloadmodelfrombuffer), or [loadModelFromFd()](#mindsporeliteloadmodelfromfd) to obtain a **Model** instance before calling related APIs.

### getInputs

getInputs(): MSTensor[]

Obtains the model inputs for inference.

**System capability**: SystemCapability.AI.MindSporeLite

**Return value**

| Type                   | Description              |
| ----------------------- | ------------------ |
| [MSTensor](#mstensor)[] | List of **MSTensor** objects.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((result : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = result.getInputs();
  console.log(modelInputs[0].name);
})
```
### predict

predict(inputs: MSTensor[], callback: Callback&lt;MSTensor[]&gt;): void

Executes model inference. This API uses an asynchronous callback to return the result. Ensure that the model object is not reclaimed while inference is in progress.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name| Type                   | Mandatory| Description                      |
| ------ | ----------------------- | ---- | -------------------------- |
| inputs | [MSTensor](#mstensor)[] | Yes  | List of model inputs.  |
| callback | Callback<[MSTensor](#mstensor)[]> | Yes  | Callback used to return the result, which is a list of **MSTensor** objects.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;

let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let inputBuffer : ArrayBuffer = buffer.buffer;
  let model_file : string = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();

  modelInputs[0].setData(inputBuffer);
  mindSporeLiteModel.predict(modelInputs, (result : mindSporeLite.MSTensor[]) => {
    let output = new Float32Array(result[0].getData());
    for (let i = 0; i < output.length; i++) {
      console.log(output[i].toString());
    }
  })
})
```
### predict

predict(inputs: MSTensor[]): Promise&lt;MSTensor[]&gt;

Executes model inference. This API uses a promise to return the result. Ensure that the model object is not reclaimed while inference is in progress.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name| Type                   | Mandatory| Description                          |
| ------ | ----------------------- | ---- | ------------------------------ |
| inputs | [MSTensor](#mstensor)[] | Yes  | List of model inputs.  |

**Return value**

| Type                   | Description                  |
| ----------------------- | ---------------------- |
| [MSTensor](#mstensor)[] | List of **MSTensor** objects.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;
let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let inputBuffer = buffer.buffer;
  let model_file = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  modelInputs[0].setData(inputBuffer);
  mindSporeLiteModel.predict(modelInputs).then((result : mindSporeLite.MSTensor[]) => {
    let output = new Float32Array(result[0].getData());
    for (let i = 0; i < output.length; i++) {
      console.log(output[i].toString());
    }
  })
})
```

### resize

resize(inputs: MSTensor[], dims: Array&lt;Array&lt;number&gt;&gt;): boolean

Resets the tensor size.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name| Type                 | Mandatory| Description                         |
| ------ | --------------------- | ---- | ----------------------------- |
| inputs | [MSTensor](#mstensor)[]            | Yes  | List of model inputs. |
| dims   | Array&lt;Array&lt;number&gt;&gt; | Yes  | Target tensor size.|

**Return value**

| Type   | Description                                                        |
| ------- | ------------------------------------------------------------ |
| boolean | Result indicating whether the setting is successful. The value **true** indicates that the tensor size is successfully reset, and the value **false** indicates the opposite.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  let new_dim = new Array([1,32,32,1]);
  mindSporeLiteModel.resize(modelInputs, new_dim);
})
```

## MSTensor

Represents an **MSTensor** instance, with properties and APIs defined. It is a special data structure similar to arrays and matrices. It is the basic data structure used in MindSpore Lite network operations.

In the following sample code, you first need to use [getInputs()](#getinputs) to obtain an **MSTensor** instance before calling related APIs.

### Attributes

**System capability**: SystemCapability.AI.MindSporeLite

| Name      | Type                 | Readable| Writable| Description                                                |
| ---------- | --------------------- | ---- | ---- | ---------------------------------------------------- |
| name       | string                | Yes  | Yes  | Tensor name. The default value is **null**.                              |
| shape      | number[]              | Yes  | Yes  | Tensor dimension array. The default value is **0**.                           |
| elementNum | number                | Yes  | Yes  | Length of the tensor dimension array. The default value is **0**.                     |
| dataSize   | number                | Yes  | Yes  | Length of tensor data. The default value is **0**.                         |
| dtype      | [DataType](#datatype) | Yes  | Yes  | Tensor data type. The default value is **0**, indicating **TYPE_UNKNOWN**.       |
| format     | [Format](#format)     | Yes  | Yes  | Tensor data format. The default value is **-1**, indicating **DEFAULT_FORMAT**.|

**Example**

```ts
let model_file = '/path/to/xxx.ms';
mindSporeLite.loadModelFromFile(model_file).then((mindSporeLiteModel : mindSporeLite.Model) => {
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  console.log(modelInputs[0].name);
  console.log(modelInputs[0].shape.toString());
  console.log(modelInputs[0].elementNum.toString());
  console.log(modelInputs[0].dtype.toString());
  console.log(modelInputs[0].format.toString());
  console.log(modelInputs[0].dataSize.toString());
})
```

### getData

getData(): ArrayBuffer

Obtains tensor data.

**System capability**: SystemCapability.AI.MindSporeLite

**Return value**

| Type       | Description                |
| ----------- | -------------------- |
| ArrayBuffer | Buffer that contains the tensor data.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;
let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let inputBuffer = buffer.buffer;
  let model_file = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  modelInputs[0].setData(inputBuffer);
  mindSporeLiteModel.predict(modelInputs).then((result : mindSporeLite.MSTensor[]) => {
    let output = new Float32Array(result[0].getData());
    for (let i = 0; i < output.length; i++) {
      console.log(output[i].toString());
    }
  })
})
```

### setData

setData(inputArray: ArrayBuffer): void

Sets the tensor data.

**System capability**: SystemCapability.AI.MindSporeLite

**Parameters**

| Name    | Type       | Mandatory| Description                  |
| ---------- | ----------- | ---- | ---------------------- |
| inputArray | ArrayBuffer | Yes  | Input data buffer of the tensor.|

**Example**

```ts
import resourceManager from '@ohos.resourceManager';
import { GlobalContext } from '../GlobalContext';
import mindSporeLite from '@ohos.ai.mindSporeLite';
import common from '@ohos.app.ability.common';
export class Test {
  value:number = 0;
  foo(): void {
    GlobalContext.getContext().setObject("value", this.value);
  }
}
let globalContext = GlobalContext.getContext().getObject("value") as common.UIAbilityContext;
let inputName = 'input_data.bin';
globalContext.resourceManager.getRawFileContent(inputName).then(async (buffer : Uint8Array) => {
  let inputBuffer = buffer.buffer;
  let model_file = '/path/to/xxx.ms';
  let mindSporeLiteModel : mindSporeLite.Model = await mindSporeLite.loadModelFromFile(model_file);
  let modelInputs : mindSporeLite.MSTensor[] = mindSporeLiteModel.getInputs();
  modelInputs[0].setData(inputBuffer);
})
```

## DataType

Tensor data type.

**System capability**: SystemCapability.AI.MindSporeLite

| Name               | Value  | Description               |
| ------------------- | ---- | ------------------- |
| TYPE_UNKNOWN        | 0    | Unknown type.         |
| NUMBER_TYPE_INT8    | 32   | Int8 type.   |
| NUMBER_TYPE_INT16   | 33   | Int16 type.  |
| NUMBER_TYPE_INT32   | 34   | Int32 type.  |
| NUMBER_TYPE_INT64   | 35   | Int64 type.  |
| NUMBER_TYPE_UINT8   | 37   | UInt8 type.  |
| NUMBER_TYPE_UINT16  | 38   | UInt16 type. |
| NUMBER_TYPE_UINT32  | 39   | UInt32 type. |
| NUMBER_TYPE_UINT64  | 40   | UInt64 type. |
| NUMBER_TYPE_FLOAT16 | 42   | Float16 type.|
| NUMBER_TYPE_FLOAT32 | 43   | Float32 type.|
| NUMBER_TYPE_FLOAT64 | 44   | Float64 type.|

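These values appear in the **dtype** attribute of [MSTensor](#mstensor). As an illustration of how they can be interpreted, for example when choosing a typed-array view over the buffer returned by **getData()**, the plain-TypeScript sketch below mirrors a few values from the table; the local constants and the `bytesPerElement` helper are illustrative, not part of the module API:

```typescript
// Numeric values copied from the DataType table above.
const NUMBER_TYPE_INT8 = 32;
const NUMBER_TYPE_INT32 = 34;
const NUMBER_TYPE_FLOAT16 = 42;
const NUMBER_TYPE_FLOAT32 = 43;
const NUMBER_TYPE_FLOAT64 = 44;

// Illustrative helper: bytes per element for a few common types.
function bytesPerElement(dtype: number): number {
  switch (dtype) {
    case NUMBER_TYPE_INT8: return 1;
    case NUMBER_TYPE_FLOAT16: return 2;
    case NUMBER_TYPE_INT32:
    case NUMBER_TYPE_FLOAT32: return 4;
    case NUMBER_TYPE_FLOAT64: return 8;
    default: throw new Error('unsupported dtype: ' + dtype);
  }
}

console.log(bytesPerElement(NUMBER_TYPE_FLOAT32)); // 4
```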
## Format

Enumerates tensor data formats.

**System capability**: SystemCapability.AI.MindSporeLite

| Name          | Value  | Description                 |
| -------------- | ---- | --------------------- |
| DEFAULT_FORMAT | -1   | Unknown data format.   |
| NCHW           | 0    | NCHW format. |
| NHWC           | 1    | NHWC format. |
| NHWC4          | 2    | NHWC4 format.|
| HWKC           | 3    | HWKC format. |
| HWCK           | 4    | HWCK format. |
| KCHW           | 5    | KCHW format. |

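As an illustration of what the two most common layouts mean for the **shape** attribute of [MSTensor](#mstensor): under NHWC a 4-D shape is read as batch, height, width, channels; under NCHW it is read as batch, channels, height, width. The `labelShape` helper below is an illustrative plain-TypeScript sketch, not part of the module API:

```typescript
// Illustrative: label the dimensions of a 4-D shape under NHWC or NCHW.
function labelShape(shape: number[], format: 'NHWC' | 'NCHW'): Record<string, number> {
  if (shape.length !== 4) throw new Error('expected a 4-D shape');
  const order = format === 'NHWC' ? ['N', 'H', 'W', 'C'] : ['N', 'C', 'H', 'W'];
  const labeled: Record<string, number> = {};
  order.forEach((dim, i) => { labeled[dim] = shape[i]; });
  return labeled;
}

console.log(labelShape([1, 224, 224, 3], 'NHWC')); // { N: 1, H: 224, W: 224, C: 3 }
```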