# Node.js Core Benchmarks

This folder contains code and data used to measure performance
of different Node.js implementations and different ways of
writing JavaScript run by the built-in JavaScript engine.

For a detailed guide on how to write and run benchmarks in this
directory, see [the guide on benchmarks](../doc/contributing/writing-and-running-benchmarks.md).

## Table of Contents

* [File tree structure](#file-tree-structure)
* [Common API](#common-api)

## File tree structure

### Directories

Benchmarks testing the performance of a single Node.js submodule are placed
into a directory with the corresponding name, so that they can be executed by
submodule or individually.
Benchmarks that span multiple submodules may either be placed into the `misc`
directory or into a directory named after the feature they benchmark.
For example, benchmarks for various new ECMAScript features and their
pre-ES2015 counterparts are placed in a directory named `es`.
Fixtures that are not specific to a certain benchmark but can be reused
throughout the benchmark suite should be placed in the `fixtures` directory.

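A typical layout looks roughly like the following sketch (illustrative, not
exhaustive; the directory names are examples drawn from the descriptions
above):

```text
benchmark/
├── buffers/     # benchmarks for the buffer submodule
├── es/          # ECMAScript feature benchmarks
├── fixtures/    # shared fixtures reused across benchmarks
├── misc/        # benchmarks spanning multiple submodules
├── common.js
├── compare.js
└── run.js
```
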
### Other top-level files

The top-level files include common dependencies of the benchmarks
and the tools for launching benchmarks and visualizing their output.
The actual benchmark scripts should be placed in their corresponding
directories.

* `_benchmark_progress.js`: implements the progress bar displayed
  when running `compare.js`.
* `_cli.js`: parses the command line arguments passed to `compare.js`,
  `run.js` and `scatter.js`.
* `_cli.R`: parses the command line arguments passed to `compare.R`.
* `_http-benchmarkers.js`: selects and runs external tools for benchmarking
  the `http` subsystem.
* `common.js`: see [Common API](#common-api).
* `compare.js`: command line tool for comparing performance between different
  Node.js binaries.
* `compare.R`: R script for statistically analyzing the output of
  `compare.js`.
* `run.js`: command line tool for running individual benchmark suite(s).
* `scatter.js`: command line tool for comparing the performance
  between different parameters in benchmark configurations,
  for example to analyze the time complexity.
* `scatter.R`: R script for visualizing the output of `scatter.js` with
  scatter plots.

## Common API

The `common.js` module is used by benchmarks for consistency across repeated
tasks. It provides a number of functions and properties that help with
writing benchmarks.

### `createBenchmark(fn, configs[, options])`

See [the guide on writing benchmarks](../doc/contributing/writing-and-running-benchmarks.md#basics-of-a-benchmark).

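For orientation, a minimal benchmark built on `createBenchmark` usually looks
like the sketch below. The parameter name `n`, its values, and the loop body
are placeholders; the guide above describes the full API.

```js
'use strict';

const common = require('../common.js');

// Every property of the configuration object becomes a benchmark parameter;
// the benchmark runs once for each combination of values.
const bench = common.createBenchmark(main, {
  n: [1e5, 1e6],  // placeholder parameter values
});

function main({ n }) {
  bench.start();
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum += i;  // stand-in for the operation under test
  }
  bench.end(n);  // report how many operations were performed
}
```
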
### `default_http_benchmarker`

The default benchmarker used to run HTTP benchmarks.
See [the guide on writing HTTP benchmarks](../doc/contributing/writing-and-running-benchmarks.md#creating-an-http-benchmark).

### `PORT`

The default port used to run HTTP benchmarks.
See [the guide on writing HTTP benchmarks](../doc/contributing/writing-and-running-benchmarks.md#creating-an-http-benchmark).

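As an illustration, an HTTP benchmark typically starts a server on
`common.PORT` and drives it through the selected external benchmarker via
`bench.http()`. The sketch below is abridged and its configuration values are
placeholders; see the guide above for the details.

```js
'use strict';

const http = require('http');
const common = require('../common.js');

const bench = common.createBenchmark(main, {
  connections: [100],  // placeholder values
  duration: [5],
});

function main({ connections, duration }) {
  const server = http.createServer((req, res) => {
    res.end('Hello, world!');
  });

  server.listen(common.PORT, () => {
    // bench.http() runs the configured (or default) HTTP benchmarker
    // against the given path and reports the measured request rate.
    bench.http({ path: '/', connections, duration }, () => {
      server.close();
    });
  });
}
```
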
### `sendResult(data)`

Used by special benchmarks that cannot use `createBenchmark` and the object
it returns to accomplish what they need. This function reports timing
data to the parent process (usually created by running `compare.js`, `run.js` or
`scatter.js`).

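For illustration only, a standalone benchmark might report its timing along
the following lines. The object shape shown here (`name`, `conf`, `rate`,
`time`, `type`) mirrors what `createBenchmark` reports, but treat it as an
assumption and check `common.js` for the authoritative fields; the benchmark
name is hypothetical.

```js
'use strict';

const { sendResult } = require('../common.js');

const n = 1e6;
const start = process.hrtime.bigint();
for (let i = 0; i < n; i++) {
  // ... operation under test ...
}
const elapsedSeconds = Number(process.hrtime.bigint() - start) / 1e9;

// Assumed report shape, mirroring what createBenchmark() sends:
sendResult({
  name: 'misc/my-standalone-benchmark',  // hypothetical benchmark name
  conf: { n },
  rate: n / elapsedSeconds,  // operations per second
  time: elapsedSeconds,      // elapsed wall-clock time in seconds
  type: 'report',
});
```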