# Development Self-Test Framework User Guide


## Overview

OpenHarmony provides a comprehensive development self-test framework, **developer_test**, as part of the OpenHarmony test toolset. The framework enables you to test your own development work: you can write test cases based on your test requirements to discover defects at the development phase, which greatly improves code quality.

This document describes how to use the development self-test framework of OpenHarmony.


### Introduction

After adding or modifying code, OpenHarmony developers want to quickly verify that the modified code works properly, and the system already has a large number of automated test cases for existing functions, such as TDD and XTS cases. The development self-test framework aims to improve your self-test efficiency so that you can quickly execute the specified automated test cases and conduct development tests at the development phase.


### Constraints

When executing test cases with the framework, you must connect to the OpenHarmony device in advance.


## Environment Preparations

The development self-test framework depends on Python 3.7.5 or later. Before using the framework, configure the environment as described below.

Click [here](https://gitee.com/openharmony/docs/blob/master/en/device-dev/get-code/sourcecode-acquire.md) to obtain the source code.

### Basic Self-Test Framework Environment

| Environment Dependency| Version                                                      | Description                                                  |
| ----------------- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| Operating system  | Ubuntu 18.04 or later                                         | Code compilation environment.                                 |
| Linux extension component| libreadline-dev                                        | Plugin used to read commands.                                 |
| Python            | 3.7.5 or later                                                | Language used by the test framework.                          |
| Python plugins    | pyserial 3.3 or later, paramiko 2.7.1 or later, setuptools 40.8.0 or later, and rsa 4.0 or later| - **pyserial**: supports Python serial port communication.<br>- **paramiko**: allows Python to use SSH.<br>- **setuptools**: allows Python packages to be created and distributed easily.<br>- **rsa**: implements RSA encryption in Python.|
| NFS server        | haneWIN NFS Server 1.2.50 or later, or NFS v4 or later        | Used when devices can be connected only through serial ports, for example, mini- and small-system devices.|
| HDC               | 1.1.0                                                         | Tool for connecting devices through the OpenHarmony Device Connector (HDC).|



1. Run the following command to install the Linux extension component readline:

    ```bash
    sudo apt-get install libreadline-dev
    ```
    The installation is successful if the following information is displayed:
    ```
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    libreadline-dev is already the newest version (7.0-3).
    0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
    ```

2. Run the following command to install the **setuptools** plugin:
    ```bash
    pip3 install setuptools
    ```
    The installation is successful if the following information is displayed:
    ```
    Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)
    ```

3. Run the following command to install the **paramiko** plugin:
    ```bash
    pip3 install paramiko
    ```
    The installation is successful if the following information is displayed:
    ```
    Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko
    Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0
    ```

4. Run the following command to install the **rsa** plugin:
    ```bash
    pip3 install rsa
    ```
    The installation is successful if the following information is displayed:
    ```
    Installing collected packages: pyasn1, rsa
    Successfully installed pyasn1-0.4.8 rsa-4.7
    ```

5. Run the following command to install the **pyserial** plugin:
    ```bash
    pip3 install pyserial
    ```
    The installation is successful if the following information is displayed:
    ```
    Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
    ```
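    The four Python plugins above can also be installed with a single command, as an optional shortcut:
    ```bash
    # Optional: install all required Python plugins at once.
    pip3 install setuptools paramiko rsa pyserial
    ```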

6. Install the NFS server if the device outputs results only through the serial port.

    > This step applies to small-system and mini-system devices only.

    - Windows OS: Install the **haneWIN NFS Server 1.2.50** package.
    - Linux OS: Run the following command to install the NFS server:
    ```bash
    sudo apt install nfs-kernel-server
    ```
    The installation is successful if the following information is displayed:
    ```
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    nfs-kernel-server is already the newest version (1:1.3.4-2.1ubuntu5.3).
    0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
    ```
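    On Linux, the directory shared with the device typically also needs to be exported after the NFS server is installed. The following is a minimal sketch; the shared path **/home/user/nfs** and the export options are placeholders to adapt to your environment:
    ```bash
    # Minimal sketch: export a shared directory for the development board (path is illustrative).
    echo "/home/user/nfs *(rw,sync,no_root_squash,no_subtree_check)" | sudo tee -a /etc/exports
    sudo exportfs -ra                          # reload the export table
    sudo systemctl restart nfs-kernel-server   # restart the NFS service
    ```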

7. If the device supports HDC connection, install the HDC tool. For details about the installation process, see [HDC-OpenHarmony device connector](https://gitee.com/openharmony/developtools_hdc/blob/master/README.md).


### Environment Dependency Check

| Check Item                                          | Operation                                            | Requirement               |
| -------------------------------------------------- | --------------------------------------------------- | ------------------------- |
| Check whether Python is installed successfully.     | Run the **python --version** command.                | The Python version is 3.7.5 or later.|
| Check whether the Python plugins are installed successfully.| Go to the **test/developertest** directory and run **start.bat** or **start.sh**.| The **>>>** prompt is displayed.|
| Check the NFS server status (for devices that support only serial port output).| Log in to the development board through the serial port and run the **mount** command to mount the NFS directory.| The file directory can be mounted.|
| Check whether the HDC is installed successfully.    | Run the **hdc_std -v** command.                      | The HDC version is 1.1.0 or later.|

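As a quick sanity check, the version queries from the table can be run directly in a terminal:
```bash
# Check the Python version (3.7.5 or later is required).
python --version
# Check the HDC version (1.1.0 or later is required).
hdc_std -v
```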

## Test Case Preparation

The test framework supports multiple types of tests and provides different test case templates for them.

**TDD Test (C++)**

Naming rules for source files

The source file name of a test case must be the same as that of the test suite. The file name must use lowercase letters and be in the *Function*_*Sub-function*_**test** format. More specific sub-functions can be added as required.
Example:
```
calculator_sub_test.cpp
```

Test case example
```c++
/*
 * Copyright (c) 2021 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#include "calculator.h"
#include <gtest/gtest.h>

using namespace testing::ext;

class CalculatorSubTest : public testing::Test {
public:
    static void SetUpTestCase(void);
    static void TearDownTestCase(void);
    void SetUp();
    void TearDown();
};

void CalculatorSubTest::SetUpTestCase(void)
{
    // Set a setup function, which will be called before all test cases.
}

void CalculatorSubTest::TearDownTestCase(void)
{
    // Set a teardown function, which will be called after all test cases.
}

void CalculatorSubTest::SetUp(void)
{
    // Set a setup function, which will be called before each test case.
}

void CalculatorSubTest::TearDown(void)
{
    // Set a teardown function, which will be called after each test case.
}

/**
 * @tc.name: integer_sub_001
 * @tc.desc: Verify the sub function.
 * @tc.type: FUNC
 * @tc.require: issueNumber
 */
HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1)
{
    // Step 1 Call the function to obtain the test result.
    int actual = Sub(4, 0);

    // Step 2 Use an assertion to compare the obtained result with the expected result.
    EXPECT_EQ(4, actual);
}
```
The procedure is as follows:

1. Add comment information to the test case file header.

```
/*
 * Copyright (c) 2021 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
```

2. Add the test framework header file and namespace.

```c++
#include <gtest/gtest.h>

using namespace testing::ext;
```

3. Add the header file of the test class.

```c++
#include "calculator.h"
```

4. Define the test suite (test class).

```c++
class CalculatorSubTest : public testing::Test {
public:
    static void SetUpTestCase(void);
    static void TearDownTestCase(void);
    void SetUp();
    void TearDown();
};

void CalculatorSubTest::SetUpTestCase(void)
{
    // Set a setup function, which will be called before all test cases.
}

void CalculatorSubTest::TearDownTestCase(void)
{
    // Set a teardown function, which will be called after all test cases.
}

void CalculatorSubTest::SetUp(void)
{
    // Set a setup function, which will be called before each test case.
}

void CalculatorSubTest::TearDown(void)
{
    // Set a teardown function, which will be called after each test case.
}
```
> **NOTE**
>
> When defining a test suite, ensure that the test suite name is the same as the target to build and uses the upper camel case style.

5. Add implementation of the test cases, including test case comments and logic.

```c++
/**
 * @tc.name: integer_sub_001
 * @tc.desc: Verify the sub function.
 * @tc.type: FUNC
 * @tc.require: issueNumber
 */
HWTEST_F(CalculatorSubTest, integer_sub_001, TestSize.Level1)
{
    // Step 1 Call the function to obtain the test result.
    int actual = Sub(4, 0);

    // Step 2 Use an assertion to compare the obtained result with the expected result.
    EXPECT_EQ(4, actual);
}
```
> **NOTE**
>
> @tc.require: The value must start with **AR/SR** or **issue**, for example, **issueI56WJ7**.

The following test case templates are provided for your reference.

| Type            | Description                                      |
| --------------- | ------------------------------------------------ |
| HWTEST(A,B,C)   | Use this template if the test case execution does not depend on setup or teardown.|
| HWTEST_F(A,B,C) | Use this template if the test case execution (excluding parameters) depends on setup or teardown.|
| HWTEST_P(A,B,C) | Use this template if the test case execution (including parameters) depends on setup or teardown.|

In the template names:

- **A** indicates the test suite name.

- **B** indicates the test case name, which is in the *Function*_*No.* format. The *No.* is a three-digit number starting from **001**.

- **C** indicates the test case level. There are five test case levels: guard-control level 0 and non-guard-control levels 1 to 4. Of levels 1 to 4, a smaller value indicates a more important function verified by the test case.

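For illustration, the following is a hedged sketch of how the same calculator check could be written with the fixture-free **HWTEST** template; the suite name **CalculatorStandaloneTest** is made up for this example.

```c++
/**
 * Sketch only: an HWTEST case that does not rely on SetUp/TearDown.
 * The suite name CalculatorStandaloneTest is illustrative.
 */
HWTEST(CalculatorStandaloneTest, integer_sub_002, TestSize.Level1)
{
    int actual = Sub(6, 2);   // call the tested function directly
    EXPECT_EQ(4, actual);     // assert against the expected result
}
```
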
**NOTE**

- The expected result of each test case must have an assertion.

- The test case level must be specified.

- It is recommended that the test be implemented step by step according to the template.

- The comment must contain the test case name, description, type, and requirement number, which are in the @tc.*xxx*: *value* format. The test case type **@tc.type** can be any of the following:


| Test Case Type   | Code     |
| ---------------- | -------- |
| Function test    | FUNC     |
| Performance test | PERF     |
| Reliability test | RELI     |
| Security test    | SECU     |
| Fuzzing test     | FUZZ     |

**TDD Test (JS)**

- Naming rules for source files


The source file name of a test case must combine the function, the sub-function, and the suffix **Test**, and each part must use the upper camel case style. More specific sub-functions can be added as required.
Example:
```
AppInfoTest.js
```

- Test case example

```js
/*
 * Copyright (C) 2021 XXXX Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
import app from '@system.app'

import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'

describe("AppInfoTest", function () {
    beforeAll(function() {
        // Set a setup function, which will be called before all test cases.
        console.info('beforeAll called')
    })

    afterAll(function() {
        // Set a teardown function, which will be called after all test cases.
        console.info('afterAll called')
    })

    beforeEach(function() {
        // Set a setup function, which will be called before each test case.
        console.info('beforeEach called')
    })

    afterEach(function() {
        // Set a teardown function, which will be called after each test case.
        console.info('afterEach called')
    })

    /*
     * @tc.name:appInfoTest001
     * @tc.desc:verify app info is not null
     * @tc.type: FUNC
     * @tc.require: issueNumber
     */
    it("appInfoTest001", 0, function () {
        // Step 1 Call the function to obtain the test result.
        var info = app.getInfo()

        // Step 2 Use an assertion to compare the obtained result with the expected result.
        expect(info != null).assertEqual(true)
    })
})
```
The procedure is as follows:
1. Add comment information to the test case file header.
    ```
    /*
     * Copyright (C) 2021 XXXX Device Co., Ltd.
     * Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     *     http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    ```
2. Import the APIs to test and the JSUnit test library.
    ```js
    import app from '@system.app'

    import {describe, beforeAll, beforeEach, afterEach, afterAll, it, expect} from 'deccjsunit/index'
    ```
3. Define the test suite (test class).
    ```js
    describe("AppInfoTest", function () {
        beforeAll(function() {
            // Set a setup function, which will be called before all test cases.
            console.info('beforeAll called')
        })

        afterAll(function() {
            // Set a teardown function, which will be called after all test cases.
            console.info('afterAll called')
        })

        beforeEach(function() {
            // Set a setup function, which will be called before each test case.
            console.info('beforeEach called')
        })

        afterEach(function() {
            // Set a teardown function, which will be called after each test case.
            console.info('afterEach called')
        })
    ```
4. Add implementation of the test cases.
    ```js
    /*
     * @tc.name:appInfoTest001
     * @tc.desc:verify app info is not null
     * @tc.type: FUNC
     * @tc.require: issueNumber
     */
    it("appInfoTest001", 0, function () {
        // Step 1 Call the function to obtain the test result.
        var info = app.getInfo()

        // Step 2 Use an assertion to compare the obtained result with the expected result.
        expect(info != null).assertEqual(true)
    })
    ```
    > **NOTE**
    >
    > @tc.require: The value must start with **issue**, for example, **issueI56WJ7**.

**Fuzzing Test**

[Fuzzing case specifications](https://gitee.com/openharmony/testfwk_developer_test/blob/master/libs/fuzzlib/README_zh.md)


**Benchmark Test**

[Benchmark case specifications](https://gitee.com/openharmony/testfwk_developer_test/blob/master/libs/benchmark/README_zh.md)

## Test Case Building

When a test case is executed, the test framework searches the test case directory for the build file of the test case and builds the located test cases. The following describes how to write build files (GN files) for test cases in different programming languages.

**TDD Test**

The following provides templates for different languages for your reference.

- **Test case build file example (C++)**

```
# Copyright (c) 2021 XXXX Device Co., Ltd.

import("//build/test.gni")

module_output_path = "developertest/calculator"

config("module_private_config") {
  visibility = [ ":*" ]

  include_dirs = [ "../../../include" ]
}

ohos_unittest("CalculatorSubTest") {
  module_out_path = module_output_path

  sources = [
    "../../../include/calculator.h",
    "../../../src/calculator.cpp",
  ]

  sources += [ "calculator_sub_test.cpp" ]

  configs = [ ":module_private_config" ]

  deps = [ "//third_party/googletest:gtest_main" ]
}

group("unittest") {
  testonly = true
  deps = [":CalculatorSubTest"]
}
```
The procedure is as follows:

1. Add comment information for the file header.
    ```
    # Copyright (c) 2021 XXXX Device Co., Ltd.
    ```
2. Import the build template.
    ```
    import("//build/test.gni")
    ```
3. Specify the file output path.
    ```
    module_output_path = "developertest/calculator"
    ```
    > **NOTE**<br>The output path is ***Part name*/*Module name***.

4. Configure the directories for dependencies.

    ```
    config("module_private_config") {
      visibility = [ ":*" ]

      include_dirs = [ "../../../include" ]
    }
    ```
    > **NOTE**
    >
    > Generally, the dependency directories are configured here and directly referenced in the build script of the test case.

5. Set the output build file for the test cases.

    ```
    ohos_unittest("CalculatorSubTest") {
    }
    ```
6. Write the build script (add the source files, configuration, and dependencies) for the test cases.
    ```
    ohos_unittest("CalculatorSubTest") {
      module_out_path = module_output_path
      sources = [
        "../../../include/calculator.h",
        "../../../src/calculator.cpp",
      ]
      sources += [ "calculator_sub_test.cpp" ]
      configs = [ ":module_private_config" ]
      deps = [ "//third_party/googletest:gtest_main" ]
    }
    ```

    > **NOTE**
    >
    > Set the test type based on actual requirements. The following test types are available:
    >
    > - **ohos_unittest**: unit test
    > - **ohos_moduletest**: module test
    > - **ohos_systemtest**: system test
    > - **ohos_performancetest**: performance test
    > - **ohos_securitytest**: security test
    > - **ohos_reliabilitytest**: reliability test
    > - **ohos_distributedtest**: distributed test

7. Group the test case files by test type.

    ```
    group("unittest") {
      testonly = true
      deps = [ ":CalculatorSubTest" ]
    }
    ```
    > **NOTE**
    >
    > Grouping test cases by test type allows you to execute a specific type of test cases when required.

- **Test case build file example (JavaScript)**

```
# Copyright (C) 2021 XXXX Device Co., Ltd.

import("//build/test.gni")

module_output_path = "developertest/app_info"

ohos_js_unittest("GetAppInfoJsTest") {
  module_out_path = module_output_path

  hap_profile = "./config.json"
  certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
}

group("unittest") {
  testonly = true
  deps = [ ":GetAppInfoJsTest" ]
}
```

The procedure is as follows:

1. Add comment information for the file header.

```
# Copyright (C) 2021 XXXX Device Co., Ltd.
```

2. Import the build template.

```
import("//build/test.gni")
```

3. Specify the file output path.

```
module_output_path = "developertest/app_info"
```
> **NOTE**
>
> The output path is ***Part name*/*Module name***.

4. Set the output build file for the test cases.

```
ohos_js_unittest("GetAppInfoJsTest") {
}
```
> **NOTE**
> - Use the **ohos_js_unittest** template to define the JavaScript test suite. Pay attention to the difference between JavaScript and C++.
> - The file generated for the JavaScript test suite must be in .hap format and named after the test suite name defined here. The test suite name must end with **JsTest**.

5. Configure the **config.json** file and signature file, which are mandatory.

```
ohos_js_unittest("GetAppInfoJsTest") {
  module_out_path = module_output_path

  hap_profile = "./config.json"
  certificate_profile = "//test/developertest/signature/openharmony_sx.p7b"
}
```
**config.json** is the configuration file required for HAP build. You need to set **target** based on the tested SDK version. Default values can be retained for other items. The following is an example:

```json
{
  "app": {
    "bundleName": "com.example.myapplication",
    "vendor": "example",
    "version": {
      "code": 1,
      "name": "1.0"
    },
    "apiVersion": {
      "compatible": 4,
      "target": 5     // Set it based on the tested SDK version. In this example, SDK5 is used.
    }
  },
  "deviceConfig": {},
  "module": {
    "package": "com.example.myapplication",
    "name": ".MyApplication",
    "deviceType": [
      "phone"
    ],
    "distro": {
      "deliveryWithInstall": true,
      "moduleName": "entry",
      "moduleType": "entry"
    },
    "abilities": [
      {
        "skills": [
          {
            "entities": [
              "entity.system.home"
            ],
            "actions": [
              "action.system.home"
            ]
          }
        ],
        "name": "com.example.myapplication.MainAbility",
        "icon": "$media:icon",
        "description": "$string:mainability_description",
        "label": "MyApplication",
        "type": "page",
        "launchType": "multiton"
      }
    ],
    "js": [
      {
        "pages": [
          "pages/index/index"
        ],
        "name": "default",
        "window": {
          "designWidth": 720,
          "autoDesignWidth": false
        }
      }
    ]
  }
}
```

6. Group the test case files by test type.

```
group("unittest") {
  testonly = true
  deps = [ ":GetAppInfoJsTest" ]
}
```
> **NOTE**
>
> Grouping test cases by test type allows you to execute a specific type of test cases when required.

**Fuzzing Test**

[Fuzzing case specifications](https://gitee.com/openharmony/test_developertest/blob/master/libs/fuzzlib/README_zh.md)

**Benchmark Test**

[Benchmark case specifications](https://gitee.com/openharmony/test_developertest/blob/master/libs/benchmark/README_zh.md)


**Configuring ohos.build**

Configure the part build file to associate with specific test cases.
```
"partA": {
    "module_list": [

    ],
    "inner_list": [

    ],
    "system_kits": [

    ],
    "test_list": [ // Test cases of the calculator module
      "//system/subsystem/partA/calculator/test:unittest",
      "//system/subsystem/partA/calculator/test:fuzztest",
      "//system/subsystem/partA/calculator/test:benchmarktest"
    ]
}
```
> **NOTE**<br>**test_list** contains the test cases of the corresponding module.

## Configuring Test Resources

Test resources include the external file resources required for test case execution, such as image files, video files, and third-party libraries.

Perform the following steps:

1. Create a **resource** directory under the **test** directory of the part, create a corresponding module directory under the **resource** directory, and store the required resource files in the module directory.

2. In the module directory under **resource**, create the **ohos_test.xml** file in the following format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration ver="2.0">
    <target name="CalculatorSubTest">
        <preparer>
            <option name="push" value="test.jpg -> /data/test/resource" src="res"/>
            <option name="push" value="libc++.z.so -> /data/test/resource" src="out"/>
        </preparer>
    </target>
</configuration>
```

3. In the build file of the test cases, configure **resource_config_file** to point to the resource file **ohos_test.xml**.

```
ohos_unittest("CalculatorSubTest") {
  resource_config_file = "//system/subsystem/partA/test/resource/calculator/ohos_test.xml"
}
```
>**NOTE**
>- **target_name** indicates the test suite name defined in the **BUILD.gn** file in the **test** directory. **preparer** indicates the action to perform before the test suite is executed.
>- **src="res"** indicates that the test resources are in the **resource** directory under the **test** directory. **src="out"** indicates that the test resources are in the **out/release/$(*part*)** directory.
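
For illustration, the sketch below shows one way a test case might consume a resource pushed by the **preparer** step above; the file name **test.jpg** comes from the example XML, while the test case name is made up.

```c++
#include <fstream>

/**
 * Sketch only: verify that a resource pushed to /data/test/resource is accessible.
 */
HWTEST_F(CalculatorSubTest, resource_access_001, TestSize.Level1)
{
    // test.jpg is pushed by the preparer option in ohos_test.xml.
    std::ifstream resourceFile("/data/test/resource/test.jpg", std::ios::binary);
    EXPECT_TRUE(resourceFile.is_open());
}
```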

## Test Case Execution

### Configuration File

Before executing test cases, you need to modify the configuration based on the device used.

#### Modifying user_config.xml
```xml
<user_config>
  <build>
    <!-- Whether to build a demo case. The default value is false. If a demo case is required, change the value to true. -->
    <example>false</example>
    <!-- Whether to build the version. The default value is false. -->
    <version>false</version>
    <!-- Whether to build the test cases. The default value is true. If the build is already complete, change the value to false before executing the test cases.-->
    <testcase>true</testcase>
    <!-- Whether the target CPU for test case compilation is 64-bit or 32-bit. The default value is empty (32-bit). You can set it to arm64. -->
    <parameter>
       <target_cpu></target_cpu>
    </parameter>
  </build>
  <environment>
    <!-- Configure the IP address and port number of the remote server to support connection to the device through the OpenHarmony Device Connector (HDC).-->
    <device type="usb-hdc">
      <ip></ip>
      <port></port>
      <sn></sn>
    </device>
    <!-- Configure the serial port information of the device to enable connection through the serial port.-->
    <device type="com" label="ipcamera">
      <serial>
        <com></com>
        <type>cmd</type>
        <baud_rate>115200</baud_rate>
        <data_bits>8</data_bits>
        <stop_bits>1</stop_bits>
        <timeout>1</timeout>
      </serial>
    </device>
  </environment>
  <!-- Configure the test case path. If the test cases have not been built (<testcase> is true), leave this parameter blank. If the build is complete, enter the path of the test cases.-->
  <test_cases>
    <dir></dir>
  </test_cases>
  <!-- Configure the coverage output path.-->
  <coverage>
    <outpath></outpath>
  </coverage>
  <!-- Configure the NFS mount information when the tested device supports only the serial port connection. Specify the NFS mapping path. host_dir indicates the NFS directory on the PC, and board_dir indicates the directory created on the board. -->
  <NFS>
    <host_dir></host_dir>
    <mnt_cmd></mnt_cmd>
    <board_dir></board_dir>
  </NFS>
</user_config>
```
>**NOTE**
>
>If HDC is connected to the device before the test cases are executed, you only need to configure the device IP address and port number, and retain the default settings for other parameters.
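
For a typical HDC connection, the configuration can therefore be reduced to the device address; the IP address and port below are placeholders:

```xml
<!-- Minimal sketch for an HDC-connected device; replace the IP address and port with your own. -->
<environment>
  <device type="usb-hdc">
    <ip>192.168.1.10</ip>
    <port>8710</port>
    <sn></sn>
  </device>
</environment>
```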

### Command Description

1. Start the test framework.
    ```
    start.bat
    ```
2. Select the product.

    After the test framework starts, you are asked to select a product. Select the development board to test.

    If you need to manually add a product, add it within the **\<productform\>** tag in **config/framework_config.xml**.

3. Execute test cases.

    Run the following command to execute test cases:
    ```
    run -t UT -ts CalculatorSubTest -tc integer_sub_001
    ```
    In the command:

    - **-t [TESTTYPE]**: specifies the test type, which can be **UT**, **MST**, **ST**, **PERF**, **FUZZ**, or **BENCHMARK**. This parameter is mandatory.
    - **-tp [TESTPART]**: specifies the part to test. This parameter can be used independently.
    - **-tm [TESTMODULE]**: specifies the module to test. This parameter must be specified together with **-tp**.
    - **-ts [TESTSUITE]**: specifies a test suite. This parameter can be used independently.
    - **-tc [TESTCASE]**: specifies a test case. This parameter must be specified together with **-ts**.
    - **-h**: displays help information.
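    A few more invocation examples, which execute all unit test cases, the unit test cases of a specific module, and all cases of a single test suite, respectively (the part and module names are illustrative, taken from the earlier build example):
    ```
    run -t UT
    run -t UT -tp partA -tm calculator
    run -t UT -ts CalculatorSubTest
    ```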

#### Executing Test Cases on Windows

Test cases cannot be built on Windows. You need to run the following command to build the test cases on Linux:
```
./build.sh --product-name {product_name} --build-target make_test
```

>**NOTE**
>- **product-name**: specifies the name of the product to build.
>- **build-target**: specifies the test cases to build. **make_test** indicates all test cases. You can specify the test cases based on requirements.
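
For example, to build only the test suite used throughout this guide (the product name below is illustrative; replace it with your own):
```
./build.sh --product-name rk3568 --build-target CalculatorSubTest
```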

After the build is complete, the test cases are automatically saved in **out/ohos-arm-release/packages/phone/tests**.

##### Setting Up the Execution Environment
1. On Windows, create the **Test** directory in the test framework and then create the **testcase** directory in the **Test** directory.

2. Copy **developertest** and **xdevice** from the Linux environment to the **Test** directory on Windows, and copy the test cases to the **testcase** directory.

    >**NOTE**
    >
    >Port the test framework and test cases from the Linux environment to the Windows environment for subsequent execution.

3. Modify the **user_config.xml** file.
    ```xml
    <build>
      <!-- Because the test cases have been built, change the value to false. -->
      <testcase>false</testcase>
    </build>
    <test_cases>
      <!-- The test cases have been copied to the Windows environment. Change the test case output path to the path of the test cases in the Windows environment.-->
      <dir>D:\Test\testcase\tests</dir>
    </test_cases>
    ```
    >**NOTE**
    >
    >**\<testcase>** indicates whether to build test cases. **\<dir>** indicates the path for searching for test cases.

#### Executing Test Cases on Linux

If the device is directly connected to a Linux host, you can directly run commands to execute test cases.

##### Mapping the Remote Port
To execute test cases on a remote Linux server or a Linux VM, map the port to enable communication between the device and the remote server or VM. Configure port mapping as follows:
1. On the HDC server, run the following commands:
    ```
    hdc_std kill
    hdc_std -m -s 0.0.0.0:8710
    ```
    >**NOTE**
    >
    >The IP address and port number are default values.

2. On the HDC client, run the following command:
    ```
    hdc_std -s xx.xx.xx.xx:8710 list targets
    ```
    >**NOTE**
    >
    >Enter the IP address of the device to test.

## Viewing the Test Result

### Test Report Logs

After the test cases are executed, the test result is automatically generated. You can view the detailed result in the related directory.

### Test Result
You can obtain the test result in the following directory:
```
test/developertest/reports/xxxx_xx_xx_xx_xx_xx
```
>**NOTE**
>
>The folder for test reports is automatically generated.

The folder contains the following files:
| Type                                 | Description          |
| ------------------------------------ | -------------------- |
| result/                              | Test results in standard format.|
| log/plan_log_xxxx_xx_xx_xx_xx_xx.log | Test case logs.      |
| summary_report.html                  | Test report summary. |
| details_report.html                  | Detailed test report.|

### Test Framework Logs
```
reports/platform_log_xxxx_xx_xx_xx_xx_xx.log
```

### Latest Test Report
```
reports/latest
```