# Autotest Best Practices

When the Chrome OS team started using autotest, we tried our best to figure out
how to fit our code and our tests into the upstream style with little guidance
and poor documentation.  This went poorly.  With the benefit of hindsight,
we’re going to lay out some best practices that we’d like to enforce going
forward.  In many cases, there is legacy code that contradicts this style; we
should go through and refactor that code to fit these guidelines as time
allows.

## Upstream Documentation

There is a sizeable volume of general Autotest documentation available on
github:
https://github.com/autotest/autotest/wiki

## Coding style

Basically PEP-8.  See [docs/coding-style.md](coding-style.md).

## Where should my code live?

| Type of Code              | Relative Path            |
|---------------------------|--------------------------|
| client-side tests         | client/site_tests/       |
| server-side tests         | server/site_tests/       |
| common library code       | client/common_lib/cros/  |
| server-only library code  | server/cros/             |

## Writing tests

An autotest is really defined by its control file.  A control file contains
important metadata about the test (name, author, description, duration, what
suite it’s in, etc.) and then pulls in and executes the actual test code.  This
test code can be shared among multiple distinct test cases by parameterizing it
and passing those parameters in from separate control files.

Autotests *must*:

 * Be self-contained: assume nothing about the condition of the device.
 * Be hermetic: requiring the Internet to be reachable in order for your test
   to succeed is unacceptable.
 * Be automatic: avoid user interaction and run-time specification of input
   values.
 * Be integration tests: if you can test the feature in a unit test (or a
   chrome browser test), do so.
 * Prefer object composition to inheritance: avoid subclassing test.test to
   implement common functionality for multiple tests.  Instead, create a class
   that your tests can instantiate to perform common operations.  This enables
   us to write tests that use both PyAuto and Servo without dealing with
   multiple inheritance, for example.
 * Be deterministic: a test should not validate the timing of some operation.
   Instead, write a test that records the timing in performance keyvals so that
   we can track the numbers over time.

Autotests *must not*:

 * Put significant logic in the control file: control files are really just
   python, so one can put arbitrary logic in there.  Don’t.  Run your test
   code, perhaps with some parameters.

Autotests *may*:

 * Share parameterized fixtures: a test is defined by a control file.  Control
   files import and run test code, and can pass simple parameters to the code
   they run through a well-specified interface.

Autotest has a notion of both client-side tests and server-side tests.  Code in
a client-side test runs only on the device under test (DUT), and as such isn’t
capable of maintaining state across reboots or handling a failed suspend/resume
and the like.  If possible, an autotest should be written as a client-side
test.  A ‘server’ test runs on the autotest server, but gets assigned a DUT
just like a client-side test.  It can use various autotest primitives (and
library code written by the CrOS team) to manipulate that device.  For
example, most, if not all, tests that use Servo or remote power management
should be server-side tests.

Adding a test involves putting a control file and a properly-written test
wrapper in the right place in the source tree.  There are conventions that must
be followed, and a variety of primitives available for use.  When writing any
code, whether client-side test, server-side test, or library, have a strong
bias towards using autotest utility code.  This keeps the codebase consistent.

## Writing a test

This section explains considerations and requirements for any autotest, whether
client or server.

### Control files

Our local conventions for autotest control files deviate a bit from the
upstream documentation, but its indication of which fields are mandatory
still holds.

| Variable     | Required | Value |
|--------------|----------|-------|
| AUTHOR       | Yes      | A comma-delimited string of at least one responsible engineer and a backup engineer -- or at worst a backup mailing list, e.g. AUTHOR = 'msb, snanda' |
| DEPENDENCIES | No       | List of tags known to the HW test lab. |
| DOC          | Yes      | Long description of the test and its pass/fail criteria. |
| NAME         | Yes      | Display name of the test.  Generally this is the directory where your test lives, e.g. hardware_TPMCheck.  If you are using multiple run_test calls in the same control file or multiple control files with one test wrapper in the same suite, problems arise with the displaying of your test name: crosbug.com/35795.  When in doubt, ask. |
| SYNC\_COUNT  | No       | Integer >= 1.  Number of simultaneous devices needed for a test run. |
| TIME         | Yes      | Test duration: 'FAST' (<1m), 'MEDIUM' (<10m), 'LONG' (<20m), 'LENGTHY' (>30m) |
| TEST\_TYPE   | Yes      | Client or Server |
| ATTRIBUTES   | No       | Comma-separated list of attribute tags to apply to this control file, used in composing suites, e.g. 'suite:foo, suite:bar, subsystem:baz'. |
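
Putting these fields together, a minimal control file might look something
like the sketch below (hedged: the test name, authors, and suite are
invented):

```
AUTHOR = 'msb, snanda'
NAME = 'platform_ExampleTest'
TIME = 'FAST'
TEST_TYPE = 'Client'
ATTRIBUTES = 'suite:example-suite'
DOC = """
Exercises the example feature; fails if the expected behavior is not
observed.
"""

job.run_test('platform_ExampleTest')
```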

### Running tests in suites

Make sure that the suite name is listed in `site_utils/attribute_whitelist.txt`,
then add the appropriate attribute to the ATTRIBUTES field in tests that make
up the test suite.  For instance:

```
...
ATTRIBUTES = 'suite:suite-a, suite:suite-b'
...
```

would indicate that the control file above should be run as part of both
`suite-a` and `suite-b`.

### Pure python

Lie, cheat and steal to keep your tests in pure python.  It will be easier to
debug failures, it will be easier to generate meaningful error output, it will
be simpler to get your tests installed and run, and it will be simpler for the
lab team to build tools that allow you to quickly iterate.

Shelling out to existing command-line tools is done fairly often, and isn’t a
terrible thing.  The test author can wind up having to do a lot of output
parsing, which is often brittle, but this can be a decent tradeoff in lieu of
having to reimplement large pieces of functionality in python.

Note that you will need to be sure that any commands you use are installed on
the host.  For a client-side test, “the host” means “the DUT”.  For a
server-side test, “the host” typically means “the system running autoserv”;
however, if you use SiteHost.run(), the command will run on the DUT.  On the
server, your tests will have access to all tools common to both a typical CrOS
chroot environment and standard Goobuntu.
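
To make the client/server distinction concrete, here is a hedged sketch of
shelling out on each side (`cat /proc/version` is just a stand-in command, and
`host` is assumed to be the Host object handed to a server-side test):

```
import logging

from autotest_lib.client.common_lib import utils

# Client-side test: the command runs on the DUT itself.
result = utils.run('cat /proc/version')
logging.info('DUT kernel: %s', result.stdout.strip())

# Server-side test: the same command, routed through the Host object so
# that it executes on the DUT rather than on the autotest server.
result = host.run('cat /proc/version')
logging.info('DUT kernel: %s', result.stdout.strip())
```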

If you want to use a tool on the DUT, it may be appropriate to include it as a
dependency of the chromeos-base/chromeos-test package.  This ensures that the
tool is pre-installed on every test image for every device, and will always be
available for use.  Otherwise, the tool must be installed as an autotest “dep”.

_Never install your own shell scripts and call them._  Anything you can do in
shell, you can do in python.
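
For instance (a hedged sketch; the log path and pattern are arbitrary), a
shell one-liner like `grep -c 'some pattern' /var/log/messages` translates
directly into python:

```
# Count matching lines in python instead of installing a grep wrapper.
with open('/var/log/messages') as f:
    hits = sum(1 for line in f if 'some pattern' in line)
```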

### Reporting failures

Autotest supports several kinds of failure statuses:

| Status   | Exception         | Reason |
|----------|-------------------|--------|
| WARN     | error.TestWarn    | Use error.TestWarn when the test run encounters side effects that are not directly related to what is being tested; for example, powerd crashing while you are testing WiFi.  *Currently* there are no clear use cases for this, and error.TestWarn should generally be avoided until further notice. |
| TEST\_NA | error.TestNAError | This test does not apply in the current environment. |
| ERROR    | error.TestError   | The test was unable to validate the desired behavior. |
| FAIL     | error.TestFail    | The test determined the desired behavior failed to occur. |
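
These statuses map onto exceptions raised from test code; a minimal hedged
sketch (the two helper methods are invented for illustration):

```
from autotest_lib.client.common_lib import error

    # ... inside run_once()
    if not self._feature_is_present():  # hypothetical helper
        raise error.TestNAError('Feature not available in this environment.')
    if not self._desired_behavior_occurred():  # hypothetical helper
        raise error.TestFail('Desired behavior failed to occur.')
```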

### Considerations when writing client-side tests

All client-side tests authored at Google must live in the client/site\_tests
sub-directory of the autotest source tree.

### Compiling and executing binaries

It is possible to compile source that’s included with your test and use the
products at test runtime.  The build infrastructure will compile this code for
the appropriate target architecture and package it up along with the rest of
your test’s resources, but this increases your development iteration time as
you need to actually re-build and re-package your test to deploy it to the
device.  While we hope to improve tooling support for this use case in the
future, avoiding this issue is the ideal.

If you can’t avoid this, here’s how to get your code compiled and installed as
a part of your test:

1. Create a src/ directory next to your control file.
2. Put your source, including its Makefile, in src/.
3. Define a method in your test class called “setup(self)” that takes no
   arguments.
4. setup(self) should perform all tasks necessary to build your tool.  There
   are some helpful utility functions in client/common_lib/utils.py.  Trivial
   example:

```
    def setup(self):
        # Build in the test's src/ directory; the build products are
        # packaged up with the rest of the test.
        os.chdir(self.srcdir)
        utils.make('OUT_DIR=.')
```

### Reusing code (“fixtures”)

Any autotest is, essentially, a single usage of a re-usable test fixture.  This
is because run\_once() in your test wrapper can take any arguments you want.  As
such, multiple control files can re-use the same wrapper -- and should, where
it makes sense.
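
A hedged sketch of the pattern (the test name `platform_ExampleTest` and its
`iterations` parameter are invented): two control files reuse one wrapper
whose run\_once(self, iterations=1) consumes the parameter.

```
# control.quick:
job.run_test('platform_ExampleTest', iterations=1, tag='quick')

# control.stress:
job.run_test('platform_ExampleTest', iterations=100, tag='stress')
```

The `tag` argument keeps the results of the two control files distinct.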

### Considerations when writing server-side tests

All server-side tests authored at Google must live in the server/site\_tests
sub-directory of the autotest source tree.

It should be even easier to keep the server side of a test in pure python, as
you should simply be driving the DUT and verifying state.

### When/why to write a server-side test

Server-side tests are appropriate when some operation in the test can't be
executed on the DUT.  The prototypical example is rebooting the DUT.  Other
examples include tests that manipulate the network around the DUT (e.g. WiFi
tests), tests that power off the DUT, and tests that rely on a Servo attached
to the DUT.

One simple criterion for whether to write a server-side test is this:  Is the
DUT an object that the test must manipulate?  If the answer is “yes”, then a
server-side test makes sense.

### Control files for server-side tests

Server-side tests commonly operate on the DUT as an object.  Autotest
represents the DUT with an instance of class Host; the instance is constructed
and passed to the test from the control file.  Creating the host object in the
control file can be done using certain definitions present in the global
environment of every control file:

 * The function hosts.create\_host() will create a host object from a string
   with the name of the host (an IP address as a string is also acceptable).
 * The variable machines is a list of the host names available to the test.

Below is a sample fragment for a control file that runs a simple server-side
test in parallel on all the hosts specified for the test.  The fragment is a
complete control file, except for the missing boilerplate comments and
documentation definitions required in all control files.

```
def run(machine):
    host = hosts.create_host(machine)
    job.run_test("platform_ServerTest", host=host)

parallel_simple(run, machines)
```

Note:  The sample above relies on a common convention that the run\_once()
method of a server-side test defines an argument named `host` with a default
value, e.g.

```
def run_once(self, host=None):
    # … test code goes here.
```

### Operations on Host objects

A Host object supports various methods to operate on a DUT.  Below is a short
list of important methods supported by instances of Host:

 * run(command) - run a shell command on the host.
 * reboot() - reboot the host, and wait for it to be back on the network.
 * wait_up() - wait for the host to be active on the network.
 * wait_down() - wait until the host is no longer on the network, or until it
   is known to have rebooted.
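
As a hedged illustration of these methods together (the marker-file check is
invented), a server-side run\_once() might verify that a file in /tmp does not
survive a reboot:

```
from autotest_lib.client.common_lib import error

    # ... inside run_once(self, host=None)
    host.run('touch /tmp/marker')
    host.reboot()  # blocks until the DUT is back on the network
    result = host.run('test -e /tmp/marker', ignore_status=True)
    if result.exit_status == 0:
        raise error.TestFail('/tmp/marker unexpectedly survived the reboot')
```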

More details, including a longer list of available methods and more about how
they work, can be found in the Autotest documentation for autoserv and the
Autotest documentation for Host.

### Servo-based tests

For server-side tests that use a servo-attached DUT, the host object has a
`servo` attribute.  If Autotest determines that the DUT has a Servo attached,
the `servo` attribute will be a valid instance of a Servo client object;
otherwise the attribute will be None.
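
A minimal defensive check using that attribute might look like the sketch
below (hedged; whether to skip or fail here is the test author's call):

```
from autotest_lib.client.common_lib import error

    # ... inside run_once(self, host=None)
    if host.servo is None:
        raise error.TestNAError('This test requires a Servo-attached DUT.')
```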

For a DUT in the lab, Autotest will automatically determine whether there is a
servo available; however, if a test requires Servo, its control file must have
additional code to guarantee a properly initialized servo object on the host.

Below is a code snippet outlining the requirements; portions of the control
file have been omitted for brevity:

```
# ... Standard boilerplate variable assignments...
DEPENDENCIES = "servo"
# ... more standard boilerplate...

args_dict = utils.args_to_dict(args)
servo_args = hosts.SiteHost.get_servo_arguments(args_dict)

def run(machine):
    host = hosts.create_host(machine, servo_args=servo_args)
    job.run_test("platform_SampleServoTest", host=host)

parallel_simple(run, machines)
```

The `DEPENDENCIES` setting guarantees that if the test is scheduled in the lab,
it will be assigned to a DUT that has a servo.

The setting of `servo_args` guarantees two distinct things:  First, it forces
checks that ensure the Servo is functioning properly; this guarantees that the
host's `servo` attribute will not be None.  Second, it allows you to pass
necessary servo-specific command-line arguments to `test_that`.

If the test control file follows the formula above, the test can be reliably
called in a variety of ways:

 * When used for hosts in the lab, the host’s servo object will use the servo
   attached to the host, and the test can assume that the servo object is not
   None.
 * If you start servod manually on your desktop using the default port, you
   can use test_that without any special options.
 * If you need to specify a non-default host or port number (e.g. because
   servod is remote, or because you have more than one servo board), you can
   specify them with commands like these:

```
test_that --args="servo_host=..." ...
test_that --args="servo_port=..." ...
test_that --args="servo_host=... servo_port=..." ...
```

### Calling client-side tests from a server-side test

Commonly, server-side tests need to do more on the DUT than simply run short
shell commands.  In those cases, a client-side test should be written and
invoked from the server-side test.  In particular, a client-side test allows
the client-side code to be written in Python against standard Autotest
infrastructure, such as the various utility modules and the logging
infrastructure.

Below is a short snippet showing the standard form for calling a client-side
test from server-side code:

```
from autotest_lib.server import autotest

    # ... inside some function, e.g. in run_once()
    client_at = autotest.Autotest(host)
    client_at.run_test("platform_ClientTest")
```
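
run\_test() also forwards extra keyword arguments to the client test's
run\_once(), so the server side can parameterize the client side; a small
sketch (the `iterations` parameter is invented):

```
    client_at.run_test("platform_ClientTest", iterations=10)
```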

### Writing library code

There is a large quantity of Chromium OS-specific code in the autotest
codebase.  Much of this exists to provide re-usable modules that enable tests
to talk to system services.  The guidelines from above apply here as well.
This code should be as pure python as possible, though it is reasonable to
shell out to command-line tools from time to time.  In some cases we’ve done
this where we could (now) use the service’s DBus APIs directly.  If you’re
adding code to allow tests to communicate with your service, it is strongly
recommended that you use DBus where possible, instead of munging config files
directly or using command-line tools.
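
As a hedged illustration (the service, object path, and method names here are
all invented), library code that queries a system service over DBus might look
like:

```
import dbus

def get_frobnicator_state():
    """Query a hypothetical system service over DBus."""
    bus = dbus.SystemBus()
    proxy = bus.get_object('org.chromium.Frobnicator',
                           '/org/chromium/Frobnicator')
    interface = dbus.Interface(proxy, 'org.chromium.Frobnicator')
    return interface.GetState()
```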

Currently, our library code lives in a concerning variety of places in the
autotest tree.  This is due to a poor initial understanding of how to do
things; new code should follow these conventions instead:

 * Used only in server-side tests: server/cros
 * Used in both server- and client-side tests, or only client:
   client/common\_lib/cros

### Adding test deps

This does not refer to the optional `DEPENDENCIES` field in test control files.
Rather, this section discusses how and when to use code/data/tools that are not
pre-installed on test images and cannot (or should not) be included directly
with the test source.

Unfortunately, there is no hard-and-fast rule here.  Generally, if this is some
small tool or blob of data you need for a single test, you should include it as
discussed above in Writing client-side tests.  If you’re writing the tool, and
it has use for developers as well as in one or more tests that you’re writing,
then make it a first-class CrOS project.  Write an ebuild, write unit tests,
and then add it to the test image by default.  This can be done by RDEPENDing
on your new test package from the chromeos-test ebuild.

If your code/data falls in the middle (useful to several tests, not to devs),
and/or is large (hundreds of megabytes as opposed to tens), then using an
autotest ‘dep’ may be the right choice.  Conceptually, an autotest dep is
simply another kind of archive that the autotest infrastructure knows how to
fetch and unpack.  There are two components to including a dependency from an
autotest test -- setup during build time and installing it on your DUT when
running a test.  The setup phase must be run from your test's setup() method
like so:

```
def setup(self):
    self.job.setup_dep(['mydep'])
    logging.debug('mydep is at %s' %
                  os.path.join(self.autodir, 'deps/mydep'))
```

The above gets run when you “build” the test.

The other half of this equation is actually installing the dependency so you
can use it while running a test.  To do this, add the following to either your
run\_once or initialize methods:

```
        dep = 'mydep'
        dep_dir = os.path.join(self.autodir, 'deps', dep)
        self.job.install_pkg(dep, 'dep', dep_dir)
```

You can now reference the content of your dep using `dep_dir`.
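
For example (a sketch; the tool path within the dep is invented), a test could
then invoke a binary shipped inside the dep:

```
        tool = os.path.join(dep_dir, 'bin', 'mytool')
        utils.system(tool + ' --selftest')
```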

Now that you know how to include a dep, the next question is how to write one.
Before you read further, you should check out client/deps/\* for many examples
of deps in our autotest tree.

### Create a dep from a third-party package

There are many examples of how to do this in the client/deps directory already.
The key component is to check in a tarball of the version of the dependency
you’d like to include under client/deps/your\_dep.

All deps require a control file and an actual python module by the same name.
They will also need a copy of common.py to import utils.update\_version.  Both
the control file and common.py are straightforward; the python module does all
the magic.

The deps python module follows a standard convention: a setup function and a
call to utils.update\_version.  update\_version is used instead of directly
calling setup because it maintains additional versioning logic ensuring that
setup is only done once per dep.  The following is its method signature:

```
def update_version(srcdir, preserve_srcdir, new_version, install,
                   *args, **dargs)
```

Notably, `install` should be a pointer to your setup function and `*args`
should be filled in with params to said setup function.

If you are using a tarball, your setup function should look something like:

```
def setup(tarball, my_dir):
    utils.extract_tarball_to_dir(tarball, my_dir)
    os.chdir(my_dir)
    utils.make()  # This assumes your tarball has a Makefile.
```

And you would invoke this with:

```
utils.update_version(os.getcwd(), True, version, setup, tarball_path,
                     os.getcwd())
```

Note: The developer needs to call update\_version directly because setup is a
function they define; it can take any number of arguments and install the dep
in any way they see fit.  The above example uses a tarball, but some deps are
distributed as straight source under the src dir, so their setup functions
take only a top-level path.  We could avoid this variation by forcing a
convention, but that would artificially constrain the deps mechanism.

Once you’ve created the dep, you will also have to add the dep to the
autotest-deps package in chromiumos-overlay/chromeos-base/autotest-deps,
`cros_workon start` it, and re-emerge it.

### Create a dep from other chrome-os packages

One can also create autotest deps from code that lives in other CrOS packages,
or from build products generated by other packages.  This is similar to the
above, but you can reference code using the `CHROMEOS_ROOT` env var (which
points to the root of the CrOS source checkout) or the `SYSROOT` env var
(which points to /build/<board>) to refer to build products.  Here’s an
example of the former, with the files we want under
chromeos\_tree/chromite/my\_dep/\*; the following would be the python code in
the autotest/files/client/deps/my\_dep/my\_dep.py module.

```
import common, os, shutil
from autotest_lib.client.bin import utils

version = 1

def setup(setup_dir):
    # Copy the code we need out of the CrOS source tree.
    my_dep_dir = os.path.join(os.environ['CHROMEOS_ROOT'], 'chromite',
                              'buildbot')
    shutil.copytree(my_dep_dir, setup_dir)


# update_version() runs setup() only when the dep needs to be (re)built.
work_dir = os.path.join(os.getcwd(), 'src')
utils.update_version(os.getcwd(), True, version, setup, work_dir)
```