# Stream

<!--introduced_in=v0.10.0-->

> Stability: 2 - Stable

<!-- source_link=lib/stream.js -->

A stream is an abstract interface for working with streaming data in Node.js.
The `stream` module provides an API for implementing the stream interface.

There are many stream objects provided by Node.js. For instance, a
[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
are both stream instances.

Streams can be readable, writable, or both. All streams are instances of
[`EventEmitter`][].

To access the `stream` module:

```js
const stream = require('stream');
```

The `stream` module is useful for creating new types of stream instances. It is
usually not necessary to use the `stream` module to consume streams.

## Organization of this document

This document contains two primary sections and a third section for notes. The
first section explains how to use existing streams within an application. The
second section explains how to create new types of streams.

## Types of streams

There are four fundamental stream types within Node.js:

* [`Writable`][]: streams to which data can be written (for example,
  [`fs.createWriteStream()`][]).
* [`Readable`][]: streams from which data can be read (for example,
  [`fs.createReadStream()`][]).
* [`Duplex`][]: streams that are both `Readable` and `Writable` (for example,
  [`net.Socket`][]).
* [`Transform`][]: `Duplex` streams that can modify or transform the data as it
  is written and read (for example, [`zlib.createDeflate()`][]).

Additionally, this module includes the utility functions
[`stream.pipeline()`][], [`stream.finished()`][] and
[`stream.Readable.from()`][].

### Object mode

All streams created by Node.js APIs operate exclusively on strings and `Buffer`
(or `Uint8Array`) objects. It is possible, however, for stream implementations
to work with other types of JavaScript values (with the exception of `null`,
which serves a special purpose within streams). Such streams are considered to
operate in "object mode".

Stream instances are switched into object mode using the `objectMode` option
when the stream is created. Attempting to switch an existing stream into
object mode is not safe.

### Buffering

<!--type=misc-->

Both [`Writable`][] and [`Readable`][] streams will store data in an internal
buffer that can be retrieved using `writable.writableBuffer` or
`readable.readableBuffer`, respectively.

The amount of data potentially buffered depends on the `highWaterMark` option
passed into the stream's constructor. For normal streams, the `highWaterMark`
option specifies a [total number of bytes][hwm-gotcha]. For streams operating
in object mode, the `highWaterMark` specifies a total number of objects.

Data is buffered in `Readable` streams when the implementation calls
[`stream.push(chunk)`][stream-push]. If the consumer of the stream does not
call [`stream.read()`][stream-read], the data will sit in the internal
queue until it is consumed.

Once the total size of the internal read buffer reaches the threshold specified
by `highWaterMark`, the stream will temporarily stop reading data from the
underlying resource until the data currently buffered can be consumed (that is,
the stream will stop calling the internal [`readable._read()`][] method that is
used to fill the read buffer).

Data is buffered in `Writable` streams when the
[`writable.write(chunk)`][stream-write] method is called repeatedly. While the
total size of the internal write buffer is below the threshold set by
`highWaterMark`, calls to `writable.write()` will return `true`. Once
the size of the internal buffer reaches or exceeds the `highWaterMark`, `false`
will be returned.

A key goal of the `stream` API, particularly the [`stream.pipe()`][] method,
is to limit the buffering of data to acceptable levels such that sources and
destinations of differing speeds will not overwhelm the available memory.

The `highWaterMark` option is a threshold, not a limit: it dictates the amount
of data that a stream buffers before it stops asking for more data. It does not
enforce a strict memory limitation in general. Specific stream implementations
may choose to enforce stricter limits but doing so is optional.

Because [`Duplex`][] and [`Transform`][] streams are both `Readable` and
`Writable`, each maintains *two* separate internal buffers used for reading and
writing, allowing each side to operate independently of the other while
maintaining an appropriate and efficient flow of data. For example,
[`net.Socket`][] instances are [`Duplex`][] streams whose `Readable` side allows
consumption of data received *from* the socket and whose `Writable` side allows
writing data *to* the socket. Because data may be written to the socket at a
faster or slower rate than data is received, each side should
operate (and buffer) independently of the other.

## API for stream consumers

<!--type=misc-->

Almost all Node.js applications, no matter how simple, use streams in some
manner. The following is an example of using streams in a Node.js application
that implements an HTTP server:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  // `req` is an http.IncomingMessage, which is a readable stream.
  // `res` is an http.ServerResponse, which is a writable stream.

  let body = '';
  // Get the data as utf8 strings.
  // If an encoding is not set, Buffer objects will be received.
  req.setEncoding('utf8');

  // Readable streams emit 'data' events once a listener is added.
  req.on('data', (chunk) => {
    body += chunk;
  });

  // The 'end' event indicates that the entire body has been received.
  req.on('end', () => {
    try {
      const data = JSON.parse(body);
      // Write back something interesting to the user:
      res.write(typeof data);
      res.end();
    } catch (er) {
      // uh oh! bad json!
      res.statusCode = 400;
      return res.end(`error: ${er.message}`);
    }
  });
});

server.listen(1337);

// $ curl localhost:1337 -d "{}"
// object
// $ curl localhost:1337 -d "\"foo\""
// string
// $ curl localhost:1337 -d "not json"
// error: Unexpected token o in JSON at position 1
```

[`Writable`][] streams (such as `res` in the example) expose methods such as
`write()` and `end()` that are used to write data onto the stream.

[`Readable`][] streams use the [`EventEmitter`][] API for notifying application
code when data is available to be read off the stream. That available data can
be read from the stream in multiple ways.

Both [`Writable`][] and [`Readable`][] streams use the [`EventEmitter`][] API in
various ways to communicate the current state of the stream.

[`Duplex`][] and [`Transform`][] streams are both [`Writable`][] and
[`Readable`][].

Applications that are either writing data to or consuming data from a stream
are not required to implement the stream interfaces directly and will generally
have no reason to call `require('stream')`.

Developers wishing to implement new types of streams should refer to the
section [API for stream implementers][].

### Writable streams

Writable streams are an abstraction for a *destination* to which data is
written.

Examples of [`Writable`][] streams include:

* [HTTP requests, on the client][]
* [HTTP responses, on the server][]
* [fs write streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdin][]
* [`process.stdout`][], [`process.stderr`][]

Some of these examples are actually [`Duplex`][] streams that implement the
[`Writable`][] interface.

All [`Writable`][] streams implement the interface defined by the
`stream.Writable` class.

While specific instances of [`Writable`][] streams may differ in various ways,
all `Writable` streams follow the same fundamental usage pattern as illustrated
in the example below:

```js
const myStream = getWritableStreamSomehow();
myStream.write('some data');
myStream.write('some more data');
myStream.end('done writing data');
```

#### Class: `stream.Writable`
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

##### Event: `'close'`
<!-- YAML
added: v0.9.4
changes:
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/18438
    description: Add `emitClose` option to specify if `'close'` is emitted on
                 destroy.
-->

The `'close'` event is emitted when the stream and any of its underlying
resources (a file descriptor, for example) have been closed. The event indicates
that no more events will be emitted, and no further computation will occur.

A [`Writable`][] stream will always emit the `'close'` event if it is
created with the `emitClose` option.

##### Event: `'drain'`
<!-- YAML
added: v0.9.4
-->

If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
`'drain'` event will be emitted when it is appropriate to resume writing data
to the stream.

```js
// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 1000000;
  write();
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // Last time!
        writer.write(data, encoding, callback);
      } else {
        // See if we should continue, or wait.
        // Don't pass the callback, because we're not done yet.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // Had to stop early!
      // Write some more once it drains.
      writer.once('drain', write);
    }
  }
}
```

##### Event: `'error'`
<!-- YAML
added: v0.9.4
-->

* {Error}

The `'error'` event is emitted if an error occurred while writing or piping
data. The listener callback is passed a single `Error` argument when called.

The stream is not closed when the `'error'` event is emitted unless the
[`autoDestroy`][writable-new] option was set to `true` when creating the
stream.

##### Event: `'finish'`
<!-- YAML
added: v0.9.4
-->

The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
has been called, and all data has been flushed to the underlying system.

```js
const writer = getWritableStreamSomehow();
for (let i = 0; i < 100; i++) {
  writer.write(`hello, #${i}!\n`);
}
writer.on('finish', () => {
  console.log('All writes are now complete.');
});
writer.end('This is the end\n');
```

##### Event: `'pipe'`
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} source stream that is piping to this writable

The `'pipe'` event is emitted when the [`stream.pipe()`][] method is called on
a readable stream, adding this writable to its set of destinations.

```js
const assert = require('assert');
const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('pipe', (src) => {
  console.log('Something is piping into the writer.');
  assert.strictEqual(src, reader);
});
reader.pipe(writer);
```

##### Event: `'unpipe'`
<!-- YAML
added: v0.9.4
-->

* `src` {stream.Readable} The source stream that
  [unpiped][`stream.unpipe()`] this writable

The `'unpipe'` event is emitted when the [`stream.unpipe()`][] method is called
on a [`Readable`][] stream, removing this [`Writable`][] from its set of
destinations.

This is also emitted in case this [`Writable`][] stream emits an error when a
[`Readable`][] stream pipes into it.

```js
const assert = require('assert');
const writer = getWritableStreamSomehow();
const reader = getReadableStreamSomehow();
writer.on('unpipe', (src) => {
  console.log('Something has stopped piping into the writer.');
  assert.strictEqual(src, reader);
});
reader.pipe(writer);
reader.unpipe(writer);
```

##### `writable.cork()`
<!-- YAML
added: v0.11.2
-->

The `writable.cork()` method forces all written data to be buffered in memory.
The buffered data will be flushed when either the [`stream.uncork()`][] or
[`stream.end()`][stream-end] methods are called.

The primary intent of `writable.cork()` is to accommodate a situation in which
several small chunks are written to the stream in rapid succession. Instead of
immediately forwarding them to the underlying destination, `writable.cork()`
buffers all the chunks until `writable.uncork()` is called, which will pass them
all to `writable._writev()`, if present. This prevents a head-of-line blocking
situation where data is being buffered while waiting for the first small chunk
to be processed. However, use of `writable.cork()` without implementing
`writable._writev()` may have an adverse effect on throughput.

See also: [`writable.uncork()`][], [`writable._writev()`][stream-_writev].

##### `writable.destroy([error])`
<!-- YAML
added: v8.0.0
-->

* `error` {Error} Optional, an error to emit with the `'error'` event.
* Returns: {this}

Destroy the stream. Optionally emit an `'error'` event, and emit a `'close'`
event (unless `emitClose` is set to `false`). After this call, the writable
stream has ended and subsequent calls to `write()` or `end()` will result in
an `ERR_STREAM_DESTROYED` error.

This is a destructive and immediate way to destroy a stream. Previous calls to
`write()` may not have drained, and may trigger an `ERR_STREAM_DESTROYED` error.
Use `end()` instead of `destroy()` if data should flush before close, or wait
for the `'drain'` event before destroying the stream.

Implementors should not override this method, but instead implement
[`writable._destroy()`][writable-_destroy].

##### `writable.destroyed`
<!-- YAML
added: v8.0.0
-->

* {boolean}

Is `true` after [`writable.destroy()`][writable-destroy] has been called.

##### `writable.end([chunk[, encoding]][, callback])`
<!-- YAML
added: v0.9.4
changes:
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/18780
    description: This method now returns a reference to `writable`.
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding if `chunk` is a string
* `callback` {Function} Optional callback for when the stream is finished
* Returns: {this}

Calling the `writable.end()` method signals that no more data will be written
to the [`Writable`][]. The optional `chunk` and `encoding` arguments allow one
final additional chunk of data to be written immediately before closing the
stream. If provided, the optional `callback` function is attached as a listener
for the [`'finish'`][] event.

Calling the [`stream.write()`][stream-write] method after calling
[`stream.end()`][stream-end] will raise an error.

```js
// Write 'hello, ' and then end with 'world!'.
const fs = require('fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
// Writing more now is not allowed!
```

##### `writable.setDefaultEncoding(encoding)`
<!-- YAML
added: v0.11.15
changes:
  - version: v6.1.0
    pr-url: https://github.com/nodejs/node/pull/5040
    description: This method now returns a reference to `writable`.
-->

* `encoding` {string} The new default encoding
* Returns: {this}

The `writable.setDefaultEncoding()` method sets the default `encoding` for a
[`Writable`][] stream.

##### `writable.uncork()`
<!-- YAML
added: v0.11.2
-->

The `writable.uncork()` method flushes all data buffered since
[`stream.cork()`][] was called.

When using [`writable.cork()`][] and `writable.uncork()` to manage the buffering
of writes to a stream, it is recommended that calls to `writable.uncork()` be
deferred using `process.nextTick()`. Doing so allows batching of all
`writable.write()` calls that occur within a given Node.js event loop phase.

```js
stream.cork();
stream.write('some ');
stream.write('data ');
process.nextTick(() => stream.uncork());
```

If the [`writable.cork()`][] method is called multiple times on a stream, the
same number of calls to `writable.uncork()` must be called to flush the buffered
data.

```js
stream.cork();
stream.write('some ');
stream.cork();
stream.write('data ');
process.nextTick(() => {
  stream.uncork();
  // The data will not be flushed until uncork() is called a second time.
  stream.uncork();
});
```

See also: [`writable.cork()`][].

##### `writable.writable`
<!-- YAML
added: v11.4.0
-->

* {boolean}

Is `true` if it is safe to call [`writable.write()`][stream-write].

##### `writable.writableEnded`
<!-- YAML
added: v12.9.0
-->

* {boolean}

Is `true` after [`writable.end()`][] has been called. This property
does not indicate whether the data has been flushed; for that, use
[`writable.writableFinished`][] instead.

##### `writable.writableCorked`
<!-- YAML
added: v12.16.0
-->

* {integer}

Number of times [`writable.uncork()`][stream-uncork] needs to be
called in order to fully uncork the stream.

##### `writable.writableFinished`
<!-- YAML
added: v12.6.0
-->

* {boolean}

Is set to `true` immediately before the [`'finish'`][] event is emitted.

##### `writable.writableHighWaterMark`
<!-- YAML
added: v9.3.0
-->

* {number}

Returns the value of the `highWaterMark` passed when constructing this
`Writable`.

##### `writable.writableLength`
<!-- YAML
added: v9.4.0
-->

* {number}

This property contains the number of bytes (or objects) in the queue
ready to be written. The value provides introspection data regarding
the status of the `highWaterMark`.

##### `writable.writableObjectMode`
<!-- YAML
added: v12.3.0
-->

* {boolean}

Getter for the property `objectMode` of a given `Writable` stream.

##### `writable.write(chunk[, encoding][, callback])`
<!-- YAML
added: v0.9.4
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
  - version: v6.0.0
    pr-url: https://github.com/nodejs/node/pull/6170
    description: Passing `null` as the `chunk` parameter will always be
                 considered invalid now, even in object mode.
-->

* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
  not operating in object mode, `chunk` must be a string, `Buffer` or
  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
  other than `null`.
* `encoding` {string} The encoding, if `chunk` is a string. **Default:**
  `'utf8'`
* `callback` {Function} Callback for when this chunk of data is flushed
* Returns: {boolean} `false` if the stream wishes for the calling code to
  wait for the `'drain'` event to be emitted before continuing to write
  additional data; otherwise `true`.

The `writable.write()` method writes some data to the stream, and calls the
supplied `callback` once the data has been fully handled. If an error
occurs, the `callback` *may or may not* be called with the error as its
first argument. To reliably detect write errors, add a listener for the
`'error'` event.

The return value is `true` if, after admitting `chunk`, the internal buffer is
still below the `highWaterMark` configured when the stream was created. If
`false` is returned, further attempts to write data to the stream should
stop until the [`'drain'`][] event is emitted.

While a stream is not draining, calls to `write()` will buffer `chunk`, and
return `false`. Once all currently buffered chunks are drained (accepted for
delivery by the operating system), the `'drain'` event will be emitted.
It is recommended that once `write()` returns `false`, no more chunks be
written until the `'drain'` event is emitted. While calling `write()` on a
stream that is not draining is allowed, Node.js will buffer all written chunks
until maximum memory usage occurs, at which point it will abort
unconditionally. Even before it aborts, high memory usage will cause poor
garbage collector performance and high RSS (which is not typically released
back to the system, even after the memory is no longer required). Since TCP
sockets may never drain if the remote peer does not read the data, writing to a
socket that is not draining may lead to a remotely exploitable vulnerability.

Writing data while the stream is not draining is particularly
problematic for a [`Transform`][], because `Transform` streams are paused
by default until they are piped or a `'data'` or `'readable'` event handler
is added.

If the data to be written can be generated or fetched on demand, it is
recommended to encapsulate the logic into a [`Readable`][] and use
[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
possible to respect backpressure and avoid memory issues using the
[`'drain'`][] event:

```js
function write(data, cb) {
  if (!stream.write(data)) {
    stream.once('drain', cb);
  } else {
    process.nextTick(cb);
  }
}

// Wait for cb to be called before doing any other write.
write('hello', () => {
  console.log('Write completed, do more writes now.');
});
```

A `Writable` stream in object mode will always ignore the `encoding` argument.

### Readable streams

Readable streams are an abstraction for a *source* from which data is
consumed.

Examples of `Readable` streams include:

* [HTTP responses, on the client][http-incoming-message]
* [HTTP requests, on the server][http-incoming-message]
* [fs read streams][]
* [zlib streams][zlib]
* [crypto streams][crypto]
* [TCP sockets][]
* [child process stdout and stderr][]
* [`process.stdin`][]

All [`Readable`][] streams implement the interface defined by the
`stream.Readable` class.

#### Two reading modes

`Readable` streams effectively operate in one of two modes: flowing and
paused. These modes are separate from [object mode][object-mode].
A [`Readable`][] stream can be in object mode or not, regardless of whether
it is in flowing mode or paused mode.

* In flowing mode, data is read from the underlying system automatically
  and provided to an application as quickly as possible using events via the
  [`EventEmitter`][] interface.

* In paused mode, the [`stream.read()`][stream-read] method must be called
  explicitly to read chunks of data from the stream.

All [`Readable`][] streams begin in paused mode but can be switched to flowing
mode in one of the following ways:

* Adding a [`'data'`][] event handler.
* Calling the [`stream.resume()`][stream-resume] method.
* Calling the [`stream.pipe()`][] method to send the data to a [`Writable`][].

The `Readable` can switch back to paused mode using one of the following:

* If there are no pipe destinations, by calling the
  [`stream.pause()`][stream-pause] method.
* If there are pipe destinations, by removing all pipe destinations.
  Multiple pipe destinations may be removed by calling the
  [`stream.unpipe()`][] method.

The important concept to remember is that a `Readable` will not generate data
until a mechanism for either consuming or ignoring that data is provided. If
the consuming mechanism is disabled or taken away, the `Readable` will *attempt*
to stop generating the data.

For backward compatibility reasons, removing [`'data'`][] event handlers will
**not** automatically pause the stream. Also, if there are piped destinations,
then calling [`stream.pause()`][stream-pause] will not guarantee that the
stream will *remain* paused once those destinations drain and ask for more data.

If a [`Readable`][] is switched into flowing mode and there are no consumers
available to handle the data, that data will be lost. This can occur, for
instance, when the `readable.resume()` method is called without a listener
attached to the `'data'` event, or when a `'data'` event handler is removed
from the stream.

Adding a [`'readable'`][] event handler automatically makes the stream
stop flowing, and the data has to be consumed via
[`readable.read()`][stream-read]. If the [`'readable'`][] event handler is
removed, then the stream will start flowing again if there is a
[`'data'`][] event handler.

#### Three states

The "two modes" of operation for a `Readable` stream are a simplified
abstraction for the more complicated internal state management that is happening
within the `Readable` stream implementation.

Specifically, at any given point in time, every `Readable` is in one of three
possible states:

* `readable.readableFlowing === null`
* `readable.readableFlowing === false`
* `readable.readableFlowing === true`

When `readable.readableFlowing` is `null`, no mechanism for consuming the
stream's data is provided. Therefore, the stream will not generate data.
While in this state, attaching a listener for the `'data'` event, calling the
`readable.pipe()` method, or calling the `readable.resume()` method will switch
`readable.readableFlowing` to `true`, causing the `Readable` to begin actively
emitting events as data is generated.

Calling `readable.pause()`, `readable.unpipe()`, or receiving backpressure
will cause `readable.readableFlowing` to be set to `false`,
temporarily halting the flowing of events but *not* halting the generation of
data. While in this state, attaching a listener for the `'data'` event
will not switch `readable.readableFlowing` to `true`.

```js
const { PassThrough, Writable } = require('stream');
const pass = new PassThrough();
const writable = new Writable();

pass.pipe(writable);
pass.unpipe(writable);
// readableFlowing is now false.

pass.on('data', (chunk) => { console.log(chunk.toString()); });
pass.write('ok');  // Will not emit 'data'.
pass.resume();     // Must be called to make stream emit 'data'.
```

While `readable.readableFlowing` is `false`, data may be accumulating
within the stream's internal buffer.

#### Choose one API style

The `Readable` stream API evolved across multiple Node.js versions and provides
multiple methods of consuming stream data. In general, developers should choose
*one* of the methods of consuming data and *should never* use multiple methods
to consume data from a single stream. Specifically, using a combination
of `on('data')`, `on('readable')`, `pipe()`, or async iterators could
lead to unintuitive behavior.

Use of the `readable.pipe()` method is recommended for most users as it has been
implemented to provide the easiest way of consuming stream data. Developers that
require more fine-grained control over the transfer and generation of data can
use the [`EventEmitter`][] and `readable.on('readable')`/`readable.read()`
or the `readable.pause()`/`readable.resume()` APIs.
766
767#### Class: `stream.Readable`
768<!-- YAML
769added: v0.9.4
770-->
771
772<!--type=class-->
773
774##### Event: `'close'`
775<!-- YAML
776added: v0.9.4
777changes:
778  - version: v10.0.0
779    pr-url: https://github.com/nodejs/node/pull/18438
780    description: Add `emitClose` option to specify if `'close'` is emitted on
781                 destroy.
782-->
783
784The `'close'` event is emitted when the stream and any of its underlying
785resources (a file descriptor, for example) have been closed. The event indicates
786that no more events will be emitted, and no further computation will occur.
787
788A [`Readable`][] stream will always emit the `'close'` event if it is
789created with the `emitClose` option.
790
791##### Event: `'data'`
792<!-- YAML
793added: v0.9.4
794-->
795
796* `chunk` {Buffer|string|any} The chunk of data. For streams that are not
797  operating in object mode, the chunk will be either a string or `Buffer`.
798  For streams that are in object mode, the chunk can be any JavaScript value
799  other than `null`.
800
801The `'data'` event is emitted whenever the stream is relinquishing ownership of
802a chunk of data to a consumer. This may occur whenever the stream is switched
803in flowing mode by calling `readable.pipe()`, `readable.resume()`, or by
804attaching a listener callback to the `'data'` event. The `'data'` event will
805also be emitted whenever the `readable.read()` method is called and a chunk of
806data is available to be returned.
807
808Attaching a `'data'` event listener to a stream that has not been explicitly
809paused will switch the stream into flowing mode. Data will then be passed as
810soon as it is available.
811
812The listener callback will be passed the chunk of data as a string if a default
813encoding has been specified for the stream using the
814`readable.setEncoding()` method; otherwise the data will be passed as a
815`Buffer`.
816
817```js
818const readable = getReadableStreamSomehow();
819readable.on('data', (chunk) => {
820  console.log(`Received ${chunk.length} bytes of data.`);
821});
822```
823
824##### Event: `'end'`
825<!-- YAML
826added: v0.9.4
827-->
828
829The `'end'` event is emitted when there is no more data to be consumed from
830the stream.
831
832The `'end'` event **will not be emitted** unless the data is completely
833consumed. This can be accomplished by switching the stream into flowing mode,
834or by calling [`stream.read()`][stream-read] repeatedly until all data has been
835consumed.
836
837```js
838const readable = getReadableStreamSomehow();
839readable.on('data', (chunk) => {
840  console.log(`Received ${chunk.length} bytes of data.`);
841});
842readable.on('end', () => {
843  console.log('There will be no more data.');
844});
845```
846
847##### Event: `'error'`
848<!-- YAML
849added: v0.9.4
850-->
851
852* {Error}
853
854The `'error'` event may be emitted by a `Readable` implementation at any time.
855Typically, this may occur if the underlying stream is unable to generate data
856due to an underlying internal failure, or when a stream implementation attempts
857to push an invalid chunk of data.
858
859The listener callback will be passed a single `Error` object.
860
861##### Event: `'pause'`
862<!-- YAML
863added: v0.9.4
864-->
865
866The `'pause'` event is emitted when [`stream.pause()`][stream-pause] is called
867and `readableFlowing` is not `false`.

##### Event: `'readable'`
<!-- YAML
added: v0.9.4
changes:
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/17979
    description: The `'readable'` event is always emitted in the next tick
                 after `.push()` is called.
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/18994
    description: Using `'readable'` requires calling `.read()`.
-->

The `'readable'` event is emitted when there is data available to be read from
the stream. In some cases, attaching a listener for the `'readable'` event will
cause some amount of data to be read into an internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('readable', function() {
  // There is some data to read now.
  let data;

  while (data = this.read()) {
    console.log(data);
  }
});
```

The `'readable'` event will also be emitted once the end of the stream data
has been reached but before the `'end'` event is emitted.

Effectively, the `'readable'` event indicates that the stream has new
information: either new data is available or the end of the stream has been
reached. In the former case, [`stream.read()`][stream-read] will return the
available data. In the latter case, [`stream.read()`][stream-read] will return
`null`. For instance, in the following example, `foo.txt` is an empty file:

```js
const fs = require('fs');
const rr = fs.createReadStream('foo.txt');
rr.on('readable', () => {
  console.log(`readable: ${rr.read()}`);
});
rr.on('end', () => {
  console.log('end');
});
```

The output of running this script is:

```console
$ node test.js
readable: null
end
```

In general, the `readable.pipe()` and `'data'` event mechanisms are easier to
understand than the `'readable'` event. However, handling `'readable'` might
result in increased throughput.

If both `'readable'` and [`'data'`][] are used at the same time, `'readable'`
takes precedence in controlling the flow, i.e. `'data'` will be emitted
only when [`stream.read()`][stream-read] is called. The
`readableFlowing` property will become `false`.
If there are `'data'` listeners when `'readable'` is removed, the stream
will start flowing, i.e. `'data'` events will be emitted without calling
`.resume()`.

##### Event: `'resume'`
<!-- YAML
added: v0.9.4
-->

The `'resume'` event is emitted when [`stream.resume()`][stream-resume] is
called and `readableFlowing` is not `true`.

##### `readable.destroy([error])`
<!-- YAML
added: v8.0.0
-->

* `error` {Error} Error which will be passed as payload in `'error'` event
* Returns: {this}

Destroy the stream. Optionally emit an `'error'` event, and emit a `'close'`
event (unless `emitClose` is set to `false`). After this call, the readable
stream will release any internal resources and subsequent calls to `push()`
will be ignored.
Implementors should not override this method, but instead implement
[`readable._destroy()`][readable-_destroy].

##### `readable.destroyed`
<!-- YAML
added: v8.0.0
-->

* {boolean}

Is `true` after [`readable.destroy()`][readable-destroy] has been called.

##### `readable.isPaused()`
<!-- YAML
added: v0.11.14
-->

* Returns: {boolean}

The `readable.isPaused()` method returns the current operating state of the
`Readable`. This is used primarily by the mechanism that underlies the
`readable.pipe()` method. In most typical cases, there will be no reason to
use this method directly.

```js
const readable = new stream.Readable();

readable.isPaused(); // === false
readable.pause();
readable.isPaused(); // === true
readable.resume();
readable.isPaused(); // === false
```

##### `readable.pause()`
<!-- YAML
added: v0.9.4
-->

* Returns: {this}

The `readable.pause()` method will cause a stream in flowing mode to stop
emitting [`'data'`][] events, switching out of flowing mode. Any data that
becomes available will remain in the internal buffer.

```js
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  readable.pause();
  console.log('There will be no additional data for 1 second.');
  setTimeout(() => {
    console.log('Now data will start flowing again.');
    readable.resume();
  }, 1000);
});
```

The `readable.pause()` method has no effect if there is a `'readable'`
event listener.

##### `readable.pipe(destination[, options])`
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} The destination for writing data
* `options` {Object} Pipe options
  * `end` {boolean} End the writer when the reader ends. **Default:** `true`.
* Returns: {stream.Writable} The *destination*, allowing for a chain of pipes if
  it is a [`Duplex`][] or a [`Transform`][] stream

The `readable.pipe()` method attaches a [`Writable`][] stream to the `readable`,
causing it to switch automatically into flowing mode and push all of its data
to the attached [`Writable`][]. The flow of data will be automatically managed
so that the destination `Writable` stream is not overwhelmed by a faster
`Readable` stream.

The following example pipes all of the data from the `readable` into a file
named `file.txt`:

```js
const fs = require('fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt'.
readable.pipe(writable);
```

It is possible to attach multiple `Writable` streams to a single `Readable`
stream.

The `readable.pipe()` method returns a reference to the *destination* stream
making it possible to set up chains of piped streams:

```js
const fs = require('fs');
const zlib = require('zlib');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
r.pipe(z).pipe(w);
```

By default, [`stream.end()`][stream-end] is called on the destination `Writable`
stream when the source `Readable` stream emits [`'end'`][], so that the
destination is no longer writable. To disable this default behavior, the `end`
option can be passed as `false`, causing the destination stream to remain open:

```js
reader.pipe(writer, { end: false });
reader.on('end', () => {
  writer.end('Goodbye\n');
});
```

One important caveat is that if the `Readable` stream emits an error during
processing, the `Writable` destination *is not closed* automatically. If an
error occurs, it will be necessary to *manually* close each stream in order
to prevent memory leaks.

The [`process.stderr`][] and [`process.stdout`][] `Writable` streams are never
closed until the Node.js process exits, regardless of the specified options.

##### `readable.read([size])`
<!-- YAML
added: v0.9.4
-->

* `size` {number} Optional argument to specify how much data to read.
* Returns: {string|Buffer|null|any}

The `readable.read()` method pulls some data out of the internal buffer and
returns it. If no data is available to be read, `null` is returned. By default,
the data will be returned as a `Buffer` object unless an encoding has been
specified using the `readable.setEncoding()` method or the stream is operating
in object mode.

The optional `size` argument specifies a specific number of bytes to read. If
`size` bytes are not available to be read, `null` will be returned *unless*
the stream has ended, in which case all of the data remaining in the internal
buffer will be returned.

If the `size` argument is not specified, all of the data contained in the
internal buffer will be returned.

The `size` argument must be less than or equal to 1 GB.

The `readable.read()` method should only be called on `Readable` streams
operating in paused mode. In flowing mode, `readable.read()` is called
automatically until the internal buffer is fully drained.

```js
const readable = getReadableStreamSomehow();

// 'readable' may be triggered multiple times as data is buffered in
readable.on('readable', () => {
  let chunk;
  console.log('Stream is readable (new data received in buffer)');
  // Use a loop to make sure we read all currently available data
  while (null !== (chunk = readable.read())) {
    console.log(`Read ${chunk.length} bytes of data...`);
  }
});

// 'end' will be triggered once when there is no more data available
readable.on('end', () => {
  console.log('Reached end of stream.');
});
```

Each call to `readable.read()` returns a chunk of data, or `null`. The chunks
are not concatenated. A `while` loop is necessary to consume all data
currently in the buffer. When reading a large file, `.read()` may return
`null` once all buffered content has been consumed, even though more data is
still to come. In that case, a new `'readable'` event will be emitted when
there is more data in the buffer. Finally, the `'end'` event will be emitted
when there is no more data to come.

Therefore, to read a file's whole contents from a `readable`, it is necessary
to collect chunks across multiple `'readable'` events:

```js
const chunks = [];

readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {
    chunks.push(chunk);
  }
});

readable.on('end', () => {
  const content = chunks.join('');
});
```

A `Readable` stream in object mode will always return a single item from
a call to [`readable.read(size)`][stream-read], regardless of the value of the
`size` argument.

If the `readable.read()` method returns a chunk of data, a `'data'` event will
also be emitted.

Calling [`stream.read([size])`][stream-read] after the [`'end'`][] event has
been emitted will return `null`. No runtime error will be raised.

##### `readable.readable`
<!-- YAML
added: v11.4.0
-->

* {boolean}

Is `true` if it is safe to call [`readable.read()`][stream-read].

##### `readable.readableEncoding`
<!-- YAML
added: v12.7.0
-->

* {null|string}

Getter for the property `encoding` of a given `Readable` stream. The `encoding`
property can be set using the [`readable.setEncoding()`][] method.

##### `readable.readableEnded`
<!-- YAML
added: v12.9.0
-->

* {boolean}

Becomes `true` when the [`'end'`][] event is emitted.

##### `readable.readableFlowing`
<!-- YAML
added: v9.4.0
-->

* {boolean}

This property reflects the current state of a `Readable` stream as described
in the [Three states][] section.

##### `readable.readableHighWaterMark`
<!-- YAML
added: v9.3.0
-->

* {number}

Returns the value of `highWaterMark` passed when constructing this
`Readable`.

##### `readable.readableLength`
<!-- YAML
added: v9.4.0
-->

* {number}

This property contains the number of bytes (or objects) in the queue
ready to be read. The value provides introspection data regarding
the status of the `highWaterMark`.

##### `readable.readableObjectMode`
<!-- YAML
added: v12.3.0
-->

* {boolean}

Getter for the property `objectMode` of a given `Readable` stream.

##### `readable.resume()`
<!-- YAML
added: v0.9.4
changes:
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/18994
    description: The `resume()` has no effect if there is a `'readable'` event
                 listening.
-->

* Returns: {this}

The `readable.resume()` method causes an explicitly paused `Readable` stream to
resume emitting [`'data'`][] events, switching the stream into flowing mode.

The `readable.resume()` method can be used to fully consume the data from a
stream without actually processing any of that data:

```js
getReadableStreamSomehow()
  .resume()
  .on('end', () => {
    console.log('Reached the end, but did not read anything.');
  });
```

The `readable.resume()` method has no effect if there is a `'readable'`
event listener.

##### `readable.setEncoding(encoding)`
<!-- YAML
added: v0.9.4
-->

* `encoding` {string} The encoding to use.
* Returns: {this}

The `readable.setEncoding()` method sets the character encoding for
data read from the `Readable` stream.

By default, no encoding is assigned and stream data will be returned as
`Buffer` objects. Setting an encoding causes the stream data
to be returned as strings of the specified encoding rather than as `Buffer`
objects. For instance, calling `readable.setEncoding('utf8')` will cause the
output data to be interpreted as UTF-8 data, and passed as strings. Calling
`readable.setEncoding('hex')` will cause the data to be encoded in hexadecimal
string format.

The `Readable` stream will properly handle multi-byte characters delivered
through the stream that would otherwise become improperly decoded if simply
pulled from the stream as `Buffer` objects.

```js
const assert = require('assert');
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
  assert.equal(typeof chunk, 'string');
  console.log('Got %d characters of string data:', chunk.length);
});
```

##### `readable.unpipe([destination])`
<!-- YAML
added: v0.9.4
-->

* `destination` {stream.Writable} Optional specific stream to unpipe
* Returns: {this}

The `readable.unpipe()` method detaches a `Writable` stream previously attached
using the [`stream.pipe()`][] method.

If the `destination` is not specified, then *all* pipes are detached.

If the `destination` is specified, but no pipe is set up for it, then
the method does nothing.

```js
const fs = require('fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt',
// but only for the first second.
readable.pipe(writable);
setTimeout(() => {
  console.log('Stop writing to file.txt.');
  readable.unpipe(writable);
  console.log('Manually close the file stream.');
  writable.end();
}, 1000);
```

##### `readable.unshift(chunk[, encoding])`
<!-- YAML
added: v0.9.11
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
-->

* `chunk` {Buffer|Uint8Array|string|null|any} Chunk of data to unshift onto the
  read queue. For streams not operating in object mode, `chunk` must be a
  string, `Buffer`, `Uint8Array` or `null`. For object mode streams, `chunk`
  may be any JavaScript value.
* `encoding` {string} Encoding of string chunks. Must be a valid
  `Buffer` encoding, such as `'utf8'` or `'ascii'`.

Passing `chunk` as `null` signals the end of the stream (EOF) and behaves the
same as `readable.push(null)`, after which no more data can be written. The EOF
signal is put at the end of the buffer and any buffered data will still be
flushed.

The `readable.unshift()` method pushes a chunk of data back into the internal
buffer. This is useful in certain situations where a stream is being consumed by
code that needs to "un-consume" some amount of data that it has optimistically
pulled out of the source, so that the data can be passed on to some other party.

The `stream.unshift(chunk)` method cannot be called after the [`'end'`][] event
has been emitted or a runtime error will be thrown.

Developers using `stream.unshift()` should often consider switching to
use of a [`Transform`][] stream instead. See the [API for stream implementers][]
section for more information.

```js
// Pull off a header delimited by \n\n.
// Use unshift() if we get too much.
// Call the callback with (error, header, stream).
const { StringDecoder } = require('string_decoder');
function parseHeader(stream, callback) {
  stream.on('error', callback);
  stream.on('readable', onReadable);
  const decoder = new StringDecoder('utf8');
  let header = '';
  function onReadable() {
    let chunk;
    while (null !== (chunk = stream.read())) {
      const str = decoder.write(chunk);
      if (str.match(/\n\n/)) {
        // Found the header boundary.
        const split = str.split(/\n\n/);
        header += split.shift();
        const remaining = split.join('\n\n');
        const buf = Buffer.from(remaining, 'utf8');
        stream.removeListener('error', callback);
        // Remove the 'readable' listener before unshifting.
        stream.removeListener('readable', onReadable);
        if (buf.length)
          stream.unshift(buf);
        // Now the body of the message can be read from the stream.
        callback(null, header, stream);
      } else {
        // Still reading the header.
        header += str;
      }
    }
  }
}
```

Unlike [`stream.push(chunk)`][stream-push], `stream.unshift(chunk)` will not
end the reading process by resetting the internal reading state of the stream.
This can cause unexpected results if `readable.unshift()` is called during a
read (i.e. from within a [`stream._read()`][stream-_read] implementation on a
custom stream). Following the call to `readable.unshift()` with an immediate
[`stream.push('')`][stream-push] will reset the reading state appropriately,
however it is best to simply avoid calling `readable.unshift()` while in the
process of performing a read.

##### `readable.wrap(stream)`
<!-- YAML
added: v0.9.4
-->

* `stream` {Stream} An "old style" readable stream
* Returns: {this}

Prior to Node.js 0.10, streams did not implement the entire `stream` module API
as it is currently defined. (See [Compatibility][] for more information.)

When using an older Node.js library that emits [`'data'`][] events and has a
[`stream.pause()`][stream-pause] method that is advisory only, the
`readable.wrap()` method can be used to create a [`Readable`][] stream that uses
the old stream as its data source.

It will rarely be necessary to use `readable.wrap()` but the method has been
provided as a convenience for interacting with older Node.js applications and
libraries.

```js
const { OldReader } = require('./old-api-module.js');
const { Readable } = require('stream');
const oreader = new OldReader();
const myReader = new Readable().wrap(oreader);

myReader.on('readable', () => {
  myReader.read(); // etc.
});
```

##### `readable[Symbol.asyncIterator]()`
<!-- YAML
added: v10.0.0
changes:
  - version: v11.14.0
    pr-url: https://github.com/nodejs/node/pull/26989
    description: Symbol.asyncIterator support is no longer experimental.
-->

* Returns: {AsyncIterator} to fully consume the stream.

```js
const fs = require('fs');

async function print(readable) {
  readable.setEncoding('utf8');
  let data = '';
  for await (const chunk of readable) {
    data += chunk;
  }
  console.log(data);
}

print(fs.createReadStream('file')).catch(console.error);
```

If the loop terminates with a `break` or a `throw`, the stream will be
destroyed. In other terms, iterating over a stream will consume the stream
fully. The stream will be read in chunks of size equal to the `highWaterMark`
option. In the code example above, data will be in a single chunk if the file
has less than 64 KB of data because no `highWaterMark` option is provided to
[`fs.createReadStream()`][].

### Duplex and transform streams

#### Class: `stream.Duplex`
<!-- YAML
added: v0.9.4
changes:
  - version: v6.8.0
    pr-url: https://github.com/nodejs/node/pull/8834
    description: Instances of `Duplex` now return `true` when
                 checking `instanceof stream.Writable`.
-->

<!--type=class-->

Duplex streams are streams that implement both the [`Readable`][] and
[`Writable`][] interfaces.

Examples of `Duplex` streams include:

* [TCP sockets][]
* [zlib streams][zlib]
* [crypto streams][crypto]

#### Class: `stream.Transform`
<!-- YAML
added: v0.9.4
-->

<!--type=class-->

Transform streams are [`Duplex`][] streams where the output is in some way
related to the input. Like all [`Duplex`][] streams, `Transform` streams
implement both the [`Readable`][] and [`Writable`][] interfaces.

Examples of `Transform` streams include:

* [zlib streams][zlib]
* [crypto streams][crypto]

##### `transform.destroy([error])`
<!-- YAML
added: v8.0.0
-->

* `error` {Error}
* Returns: {this}

Destroy the stream, and optionally emit an `'error'` event. After this call, the
transform stream will release any internal resources.
Implementors should not override this method, but instead implement
[`readable._destroy()`][readable-_destroy].
The default implementation of `_destroy()` for `Transform` also emits `'close'`
unless `emitClose` is set to `false`.

### `stream.finished(stream[, options], callback)`
<!-- YAML
added: v10.0.0
-->

* `stream` {Stream} A readable and/or writable stream.
* `options` {Object}
  * `error` {boolean} If set to `false`, then a call to `emit('error', err)` is
    not treated as finished. **Default**: `true`.
  * `readable` {boolean} When set to `false`, the callback will be called when
    the stream ends even though the stream might still be readable.
    **Default**: `true`.
  * `writable` {boolean} When set to `false`, the callback will be called when
    the stream ends even though the stream might still be writable.
    **Default**: `true`.
* `callback` {Function} A callback function that takes an optional error
  argument.
* Returns: {Function} A cleanup function which removes all registered
  listeners.

A function to get notified when a stream is no longer readable or writable,
or has experienced an error or a premature close event.

```js
const fs = require('fs');
const { finished } = require('stream');

const rs = fs.createReadStream('archive.tar');

finished(rs, (err) => {
  if (err) {
    console.error('Stream failed.', err);
  } else {
    console.log('Stream is done reading.');
  }
});

rs.resume(); // Drain the stream.
```

Especially useful in error handling scenarios where a stream is destroyed
prematurely (like an aborted HTTP request), and will not emit `'end'`
or `'finish'`.

The `finished` API is promisify-able as well:

```js
const fs = require('fs');
const stream = require('stream');
const util = require('util');

const finished = util.promisify(stream.finished);

const rs = fs.createReadStream('archive.tar');

async function run() {
  await finished(rs);
  console.log('Stream is done reading.');
}

run().catch(console.error);
rs.resume(); // Drain the stream.
```

`stream.finished()` leaves dangling event listeners (in particular
`'error'`, `'end'`, `'finish'` and `'close'`) after `callback` has been
invoked. The reason for this is so that unexpected `'error'` events (due to
incorrect stream implementations) do not cause unexpected crashes.
If this is unwanted behavior then the returned cleanup function needs to be
invoked in the callback:

```js
const cleanup = finished(rs, (err) => {
  cleanup();
  // ...
});
```

### `stream.pipeline(...streams, callback)`
<!-- YAML
added: v10.0.0
-->

* `...streams` {Stream} Two or more streams to pipe between.
* `callback` {Function} Called when the pipeline is fully done.
  * `err` {Error}

A module method to pipe between streams, forwarding errors, properly cleaning
up, and providing a callback when the pipeline is complete.

```js
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.

// A pipeline to gzip a potentially huge tar file efficiently:

pipeline(
  fs.createReadStream('archive.tar'),
  zlib.createGzip(),
  fs.createWriteStream('archive.tar.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed.', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);
```

The `pipeline` API is promisify-able as well:

```js
const fs = require('fs');
const stream = require('stream');
const util = require('util');
const zlib = require('zlib');

const pipeline = util.promisify(stream.pipeline);

async function run() {
  await pipeline(
    fs.createReadStream('archive.tar'),
    zlib.createGzip(),
    fs.createWriteStream('archive.tar.gz')
  );
  console.log('Pipeline succeeded.');
}

run().catch(console.error);
```

`stream.pipeline()` will call `stream.destroy(err)` on all streams except:

* `Readable` streams which have emitted `'end'` or `'close'`.
* `Writable` streams which have emitted `'finish'` or `'close'`.

`stream.pipeline()` leaves dangling event listeners on the streams
after the `callback` has been invoked. In the case of reuse of streams after
failure, this can cause event listener leaks and swallowed errors.

### `stream.Readable.from(iterable, [options])`
<!-- YAML
added:
  - v12.3.0
  - v10.17.0
-->

* `iterable` {Iterable} Object implementing the `Symbol.asyncIterator` or
  `Symbol.iterator` iterable protocol. Emits an `'error'` event if a null
  value is passed.
* `options` {Object} Options provided to `new stream.Readable([options])`.
  By default, `Readable.from()` will set `options.objectMode` to `true`, unless
  this is explicitly opted out by setting `options.objectMode` to `false`.
* Returns: {stream.Readable}

A utility method for creating readable streams out of iterators.

```js
const { Readable } = require('stream');

async function * generate() {
  yield 'hello';
  yield 'streams';
}

const readable = Readable.from(generate());

readable.on('data', (chunk) => {
  console.log(chunk);
});
```

For performance reasons, calling `Readable.from(string)` or
`Readable.from(buffer)` will not iterate the strings or buffers to match the
semantics of other streams.
1689
1690## API for stream implementers
1691
1692<!--type=misc-->
1693
1694The `stream` module API has been designed to make it possible to easily
1695implement streams using JavaScript's prototypal inheritance model.
1696
1697First, a stream developer would declare a new JavaScript class that extends one
1698of the four basic stream classes (`stream.Writable`, `stream.Readable`,
1699`stream.Duplex`, or `stream.Transform`), making sure they call the appropriate
1700parent class constructor:
1701
1702<!-- eslint-disable no-useless-constructor -->
1703```js
1704const { Writable } = require('stream');
1705
1706class MyWritable extends Writable {
1707  constructor({ highWaterMark, ...options }) {
1708    super({
1709      highWaterMark,
1710      autoDestroy: true,
1711      emitClose: true
1712    });
1713    // ...
1714  }
1715}
1716```
1717
1718When extending streams, keep in mind what options the user
1719can and should provide before forwarding these to the base constructor. For
1720example, if the implementation makes assumptions in regard to the
1721`autoDestroy` and `emitClose` options, do not allow the
1722user to override these. Be explicit about what
1723options are forwarded instead of implicitly forwarding all options.
1724
1725The new stream class must then implement one or more specific methods, depending
1726on the type of stream being created, as detailed in the chart below:
1727
1728| Use-case | Class | Method(s) to implement |
1729| -------- | ----- | ---------------------- |
1730| Reading only | [`Readable`][] | [`_read()`][stream-_read] |
1731| Writing only | [`Writable`][] | [`_write()`][stream-_write], [`_writev()`][stream-_writev], [`_final()`][stream-_final] |
1732| Reading and writing | [`Duplex`][] | [`_read()`][stream-_read], [`_write()`][stream-_write], [`_writev()`][stream-_writev], [`_final()`][stream-_final] |
1733| Operate on written data, then read the result | [`Transform`][] | [`_transform()`][stream-_transform], [`_flush()`][stream-_flush], [`_final()`][stream-_final] |
1734
1735The implementation code for a stream should *never* call the "public" methods
1736of a stream that are intended for use by consumers (as described in the
1737[API for stream consumers][] section). Doing so may lead to adverse side effects
1738in application code consuming the stream.
1739
1740Avoid overriding public methods such as `write()`, `end()`, `cork()`,
1741`uncork()`, `read()` and `destroy()`, or emitting internal events such
1742as `'error'`, `'data'`, `'end'`, `'finish'` and `'close'` through `.emit()`.
1743Doing so can break current and future stream invariants leading to behavior
1744and/or compatibility issues with other streams, stream utilities, and user
1745expectations.

### Simplified construction
<!-- YAML
added: v1.2.0
-->

For many simple cases, it is possible to construct a stream without relying on
inheritance. This can be accomplished by directly creating instances of the
`stream.Writable`, `stream.Readable`, `stream.Duplex` or `stream.Transform`
objects and passing appropriate methods as constructor options.

```js
const { Writable } = require('stream');

const myWritable = new Writable({
  write(chunk, encoding, callback) {
    // ...
  }
});
```

### Implementing a writable stream

The `stream.Writable` class is extended to implement a [`Writable`][] stream.

Custom `Writable` streams *must* call the `new stream.Writable([options])`
constructor and implement the `writable._write()` and/or `writable._writev()`
method.

#### `new stream.Writable([options])`
<!-- YAML
changes:
  - version: v10.0.0
    pr-url: https://github.com/nodejs/node/pull/18438
    description: Add `emitClose` option to specify if `'close'` is emitted on
                 destroy.
  - version: v11.2.0
    pr-url: https://github.com/nodejs/node/pull/22795
    description: Add `autoDestroy` option to automatically `destroy()` the
                 stream when it emits `'finish'` or errors.
-->

* `options` {Object}
  * `highWaterMark` {number} Buffer level when
    [`stream.write()`][stream-write] starts returning `false`. **Default:**
    `16384` (16KB), or `16` for `objectMode` streams.
  * `decodeStrings` {boolean} Whether to encode `string`s passed to
    [`stream.write()`][stream-write] to `Buffer`s (with the encoding
    specified in the [`stream.write()`][stream-write] call) before passing
    them to [`stream._write()`][stream-_write]. Other types of data are not
    converted (i.e. `Buffer`s are not decoded into `string`s). Setting to
    false will prevent `string`s from being converted. **Default:** `true`.
  * `defaultEncoding` {string} The default encoding that is used when no
    encoding is specified as an argument to [`stream.write()`][stream-write].
    **Default:** `'utf8'`.
  * `objectMode` {boolean} Whether or not
    [`stream.write(anyObj)`][stream-write] is a valid operation. When set,
    it becomes possible to write JavaScript values other than string,
    `Buffer` or `Uint8Array` if supported by the stream implementation.
    **Default:** `false`.
  * `emitClose` {boolean} Whether or not the stream should emit `'close'`
    after it has been destroyed. **Default:** `true`.
  * `write` {Function} Implementation for the
    [`stream._write()`][stream-_write] method.
  * `writev` {Function} Implementation for the
    [`stream._writev()`][stream-_writev] method.
  * `destroy` {Function} Implementation for the
    [`stream._destroy()`][writable-_destroy] method.
  * `final` {Function} Implementation for the
    [`stream._final()`][stream-_final] method.
  * `autoDestroy` {boolean} Whether this stream should automatically call
    `.destroy()` on itself after ending. **Default:** `false`.

<!-- eslint-disable no-useless-constructor -->
```js
const { Writable } = require('stream');

class MyWritable extends Writable {
  constructor(options) {
    // Calls the stream.Writable() constructor.
    super(options);
    // ...
  }
}
```

Or, when using pre-ES6 style constructors:

```js
const { Writable } = require('stream');
const util = require('util');

function MyWritable(options) {
  if (!(this instanceof MyWritable))
    return new MyWritable(options);
  Writable.call(this, options);
}
util.inherits(MyWritable, Writable);
```

Or, using the simplified constructor approach:

```js
const { Writable } = require('stream');

const myWritable = new Writable({
  write(chunk, encoding, callback) {
    // ...
  },
  writev(chunks, callback) {
    // ...
  }
});
```

#### `writable._write(chunk, encoding, callback)`
<!-- YAML
changes:
  - version: v12.11.0
    pr-url: https://github.com/nodejs/node/pull/29639
    description: _write() is optional when providing _writev().
-->

* `chunk` {Buffer|string|any} The `Buffer` to be written, converted from the
  `string` passed to [`stream.write()`][stream-write]. If the stream's
  `decodeStrings` option is `false` or the stream is operating in object mode,
  the chunk will not be converted and will be whatever was passed to
  [`stream.write()`][stream-write].
* `encoding` {string} If the chunk is a string, then `encoding` is the
  character encoding of that string. If chunk is a `Buffer`, or if the
  stream is operating in object mode, `encoding` may be ignored.
* `callback` {Function} Call this function (optionally with an error
  argument) when processing is complete for the supplied chunk.

All `Writable` stream implementations must provide a
[`writable._write()`][stream-_write] and/or
[`writable._writev()`][stream-_writev] method to send data to the underlying
resource.

[`Transform`][] streams provide their own implementation of the
[`writable._write()`][stream-_write] method.

This function MUST NOT be called by application code directly. It should be
implemented by child classes, and called by the internal `Writable` class
methods only.

The `callback` method must be called to signal either that the write completed
successfully or failed with an error. The first argument passed to the
`callback` must be the `Error` object if the call failed or `null` if the
write succeeded.

All calls to `writable.write()` that occur between the time `writable._write()`
is called and the `callback` is called will cause the written data to be
buffered. When the `callback` is invoked, the stream might emit a [`'drain'`][]
event. If a stream implementation is capable of processing multiple chunks of
data at once, the `writable._writev()` method should be implemented.

If the `decodeStrings` property is explicitly set to `false` in the constructor
options, then `chunk` will remain the same object that is passed to `.write()`,
and may be a string rather than a `Buffer`. This is to support implementations
that have optimized handling for certain string data encodings. In that case,
the `encoding` argument will indicate the character encoding of the string.
Otherwise, the `encoding` argument can be safely ignored.

The `writable._write()` method is prefixed with an underscore because it is
internal to the class that defines it, and should never be called directly by
user programs.

#### `writable._writev(chunks, callback)`

* `chunks` {Object[]} The chunks to be written. Each chunk has the following
  format: `{ chunk: ..., encoding: ... }`.
* `callback` {Function} A callback function (optionally with an error
  argument) to be invoked when processing is complete for the supplied chunks.

This function MUST NOT be called by application code directly. It should be
implemented by child classes, and called by the internal `Writable` class
methods only.

The `writable._writev()` method may be implemented in addition to, or as an
alternative to, `writable._write()` in stream implementations that are capable
of processing multiple chunks of data at once. If implemented and if there is
buffered data from previous writes, `_writev()` will be called instead of
`_write()`.

The `writable._writev()` method is prefixed with an underscore because it is
internal to the class that defines it, and should never be called directly by
user programs.

#### `writable._destroy(err, callback)`
<!-- YAML
added: v8.0.0
-->

* `err` {Error} A possible error.
* `callback` {Function} A callback function that takes an optional error
  argument.

The `_destroy()` method is called by [`writable.destroy()`][writable-destroy].
It can be overridden by child classes but it **must not** be called directly.

#### `writable._final(callback)`
<!-- YAML
added: v8.0.0
-->

* `callback` {Function} Call this function (optionally with an error
  argument) when finished writing any remaining data.

The `_final()` method **must not** be called directly. It may be implemented
by child classes, and if so, will be called by the internal `Writable`
class methods only.

This optional function will be called before the stream closes, delaying the
`'finish'` event until `callback` is called. This is useful to close resources
or write buffered data before a stream ends.

#### Errors while writing

Errors occurring during the processing of the [`writable._write()`][],
[`writable._writev()`][] and [`writable._final()`][] methods must be propagated
by invoking the callback and passing the error as the first argument.
Throwing an `Error` from within these methods or manually emitting an `'error'`
event results in undefined behavior.

If a `Readable` stream pipes into a `Writable` stream when `Writable` emits an
error, the `Readable` stream will be unpiped.

```js
const { Writable } = require('stream');

const myWritable = new Writable({
  write(chunk, encoding, callback) {
    if (chunk.toString().indexOf('a') >= 0) {
      callback(new Error('chunk is invalid'));
    } else {
      callback();
    }
  }
});
```

#### An example writable stream

The following illustrates a rather simplistic (and somewhat pointless) custom
`Writable` stream implementation. While this specific `Writable` stream instance
is not particularly useful, the example illustrates each of the required
elements of a custom [`Writable`][] stream instance:

```js
const { Writable } = require('stream');

class MyWritable extends Writable {
  _write(chunk, encoding, callback) {
    if (chunk.toString().indexOf('a') >= 0) {
      callback(new Error('chunk is invalid'));
    } else {
      callback();
    }
  }
}
```

#### Decoding buffers in a writable stream

Decoding buffers is a common task, for instance, when using transformers whose
input is a string. This is not a trivial process when using multi-byte
character encodings, such as UTF-8. The following example shows how to decode
multi-byte strings using `StringDecoder` and [`Writable`][].

```js
const { Writable } = require('stream');
const { StringDecoder } = require('string_decoder');

class StringWritable extends Writable {
  constructor(options) {
    super(options);
    this._decoder = new StringDecoder(options && options.defaultEncoding);
    this.data = '';
  }
  _write(chunk, encoding, callback) {
    if (encoding === 'buffer') {
      chunk = this._decoder.write(chunk);
    }
    this.data += chunk;
    callback();
  }
  _final(callback) {
    this.data += this._decoder.end();
    callback();
  }
}

const euro = [[0xE2, 0x82], [0xAC]].map(Buffer.from);
const w = new StringWritable();

w.write('currency: ');
w.write(euro[0]);
w.end(euro[1]);

console.log(w.data); // currency: €
```

### Implementing a readable stream

The `stream.Readable` class is extended to implement a [`Readable`][] stream.

Custom `Readable` streams *must* call the `new stream.Readable([options])`
constructor and implement the [`readable._read()`][] method.

#### `new stream.Readable([options])`
<!-- YAML
changes:
  - version: v11.2.0
    pr-url: https://github.com/nodejs/node/pull/22795
    description: Add `autoDestroy` option to automatically `destroy()` the
                 stream when it emits `'end'` or errors.
-->

* `options` {Object}
  * `highWaterMark` {number} The maximum [number of bytes][hwm-gotcha] to store
    in the internal buffer before ceasing to read from the underlying resource.
    **Default:** `16384` (16KB), or `16` for `objectMode` streams.
  * `encoding` {string} If specified, then buffers will be decoded to
    strings using the specified encoding. **Default:** `null`.
  * `objectMode` {boolean} Whether this stream should behave
    as a stream of objects, meaning that [`stream.read(n)`][stream-read] returns
    a single value instead of a `Buffer` of size `n`. **Default:** `false`.
  * `emitClose` {boolean} Whether or not the stream should emit `'close'`
    after it has been destroyed. **Default:** `true`.
  * `read` {Function} Implementation for the [`stream._read()`][stream-_read]
    method.
  * `destroy` {Function} Implementation for the
    [`stream._destroy()`][readable-_destroy] method.
  * `autoDestroy` {boolean} Whether this stream should automatically call
    `.destroy()` on itself after ending. **Default:** `false`.

<!-- eslint-disable no-useless-constructor -->
```js
const { Readable } = require('stream');

class MyReadable extends Readable {
  constructor(options) {
    // Calls the stream.Readable(options) constructor.
    super(options);
    // ...
  }
}
```

Or, when using pre-ES6 style constructors:

```js
const { Readable } = require('stream');
const util = require('util');

function MyReadable(options) {
  if (!(this instanceof MyReadable))
    return new MyReadable(options);
  Readable.call(this, options);
}
util.inherits(MyReadable, Readable);
```

Or, using the simplified constructor approach:

```js
const { Readable } = require('stream');

const myReadable = new Readable({
  read(size) {
    // ...
  }
});
```

#### `readable._read(size)`
<!-- YAML
added: v0.9.4
-->

* `size` {number} Number of bytes to read asynchronously

This function MUST NOT be called by application code directly. It should be
implemented by child classes, and called by the internal `Readable` class
methods only.

All `Readable` stream implementations must provide an implementation of the
[`readable._read()`][] method to fetch data from the underlying resource.

When [`readable._read()`][] is called, if data is available from the resource,
the implementation should begin pushing that data into the read queue using the
[`this.push(dataChunk)`][stream-push] method. `_read()` should continue reading
from the resource and pushing data until `readable.push()` returns `false`. Only
when `_read()` is called again after it has stopped should it resume pushing
additional data onto the queue.

Once the [`readable._read()`][] method has been called, it will not be called
again until more data is pushed through the [`readable.push()`][stream-push]
method. Empty data such as empty buffers and strings will not cause
[`readable._read()`][] to be called.

The `size` argument is advisory. Implementations for which a "read" is a
single operation that returns data can use the `size` argument to determine
how much data to fetch. Other implementations may ignore this argument and
simply provide data whenever it becomes available. There is no need to "wait"
until `size` bytes are available before calling [`stream.push(chunk)`][stream-push].

The [`readable._read()`][] method is prefixed with an underscore because it is
internal to the class that defines it, and should never be called directly by
user programs.

#### `readable._destroy(err, callback)`
<!-- YAML
added: v8.0.0
-->

* `err` {Error} A possible error.
* `callback` {Function} A callback function that takes an optional error
  argument.

The `_destroy()` method is called by [`readable.destroy()`][readable-destroy].
It can be overridden by child classes but it **must not** be called directly.

#### `readable.push(chunk[, encoding])`
<!-- YAML
changes:
  - version: v8.0.0
    pr-url: https://github.com/nodejs/node/pull/11608
    description: The `chunk` argument can now be a `Uint8Array` instance.
-->

* `chunk` {Buffer|Uint8Array|string|null|any} Chunk of data to push into the
  read queue. For streams not operating in object mode, `chunk` must be a
  string, `Buffer` or `Uint8Array`. For object mode streams, `chunk` may be
  any JavaScript value.
* `encoding` {string} Encoding of string chunks. Must be a valid
  `Buffer` encoding, such as `'utf8'` or `'ascii'`.
* Returns: {boolean} `true` if additional chunks of data may continue to be
  pushed; `false` otherwise.

When `chunk` is a `Buffer`, `Uint8Array` or `string`, the `chunk` of data will
be added to the internal queue for users of the stream to consume.
Passing `chunk` as `null` signals the end of the stream (EOF), after which no
more data can be written.

When the `Readable` is operating in paused mode, the data added with
`readable.push()` can be read out by calling the
[`readable.read()`][stream-read] method when the [`'readable'`][] event is
emitted.

When the `Readable` is operating in flowing mode, the data added with
`readable.push()` will be delivered by emitting a `'data'` event.

The `readable.push()` method is designed to be as flexible as possible. For
example, when wrapping a lower-level source that provides some form of
pause/resume mechanism, and a data callback, the low-level source can be wrapped
by the custom `Readable` instance:

```js
// `_source` is an object with readStop() and readStart() methods,
// and an `ondata` member that gets called when it has data, and
// an `onend` member that gets called when the data is over.

class SourceWrapper extends Readable {
  constructor(options) {
    super(options);

    this._source = getLowLevelSourceObject();

    // Every time there's data, push it into the internal buffer.
    this._source.ondata = (chunk) => {
      // If push() returns false, then stop reading from source.
      if (!this.push(chunk))
        this._source.readStop();
    };

    // When the source ends, push the EOF-signaling `null` chunk.
    this._source.onend = () => {
      this.push(null);
    };
  }
  // _read() will be called when the stream wants to pull more data in.
  // The advisory size argument is ignored in this case.
  _read(size) {
    this._source.readStart();
  }
}
```

The `readable.push()` method is used to push the content
into the internal buffer. It can be driven by the [`readable._read()`][] method.

For streams not operating in object mode, if the `chunk` parameter of
`readable.push()` is `undefined`, it will be treated as an empty string or
buffer. See [`readable.push('')`][] for more information.

#### Errors while reading

Errors occurring during processing of the [`readable._read()`][] must be
propagated through the [`readable.destroy(err)`][readable-_destroy] method.
Throwing an `Error` from within [`readable._read()`][] or manually emitting an
`'error'` event results in undefined behavior.

```js
const { Readable } = require('stream');

const myReadable = new Readable({
  read(size) {
    const err = checkSomeErrorCondition();
    if (err) {
      this.destroy(err);
    } else {
      // Do some work.
    }
  }
});
```

#### An example counting stream

<!--type=example-->

The following is a basic example of a `Readable` stream that emits the numerals
from 1 to 1,000,000 in ascending order, and then ends.

```js
const { Readable } = require('stream');

class Counter extends Readable {
  constructor(opt) {
    super(opt);
    this._max = 1000000;
    this._index = 1;
  }

  _read() {
    const i = this._index++;
    if (i > this._max)
      this.push(null);
    else {
      const str = String(i);
      const buf = Buffer.from(str, 'ascii');
      this.push(buf);
    }
  }
}
```

### Implementing a duplex stream

A [`Duplex`][] stream is one that implements both [`Readable`][] and
[`Writable`][], such as a TCP socket connection.

Because JavaScript does not have support for multiple inheritance, the
`stream.Duplex` class is extended to implement a [`Duplex`][] stream (as opposed
to extending the `stream.Readable` *and* `stream.Writable` classes).

The `stream.Duplex` class prototypically inherits from `stream.Readable` and
parasitically from `stream.Writable`, but `instanceof` will work properly for
both base classes due to overriding [`Symbol.hasInstance`][] on
`stream.Writable`.

Custom `Duplex` streams *must* call the `new stream.Duplex([options])`
constructor and implement *both* the [`readable._read()`][] and
`writable._write()` methods.

#### `new stream.Duplex(options)`
<!-- YAML
changes:
  - version: v8.4.0
    pr-url: https://github.com/nodejs/node/pull/14636
    description: The `readableHighWaterMark` and `writableHighWaterMark` options
                 are supported now.
-->

* `options` {Object} Passed to both `Writable` and `Readable`
  constructors. Also has the following fields:
  * `allowHalfOpen` {boolean} If set to `false`, then the stream will
    automatically end the writable side when the readable side ends.
    **Default:** `true`.
  * `readable` {boolean} Sets whether the `Duplex` should be readable.
    **Default:** `true`.
  * `writable` {boolean} Sets whether the `Duplex` should be writable.
    **Default:** `true`.
  * `readableObjectMode` {boolean} Sets `objectMode` for the readable side of
    the stream. Has no effect if `objectMode` is `true`. **Default:** `false`.
  * `writableObjectMode` {boolean} Sets `objectMode` for the writable side of
    the stream. Has no effect if `objectMode` is `true`. **Default:** `false`.
  * `readableHighWaterMark` {number} Sets `highWaterMark` for the readable side
    of the stream. Has no effect if `highWaterMark` is provided.
  * `writableHighWaterMark` {number} Sets `highWaterMark` for the writable side
    of the stream. Has no effect if `highWaterMark` is provided.

<!-- eslint-disable no-useless-constructor -->
```js
const { Duplex } = require('stream');

class MyDuplex extends Duplex {
  constructor(options) {
    super(options);
    // ...
  }
}
```

Or, when using pre-ES6 style constructors:

```js
const { Duplex } = require('stream');
const util = require('util');

function MyDuplex(options) {
  if (!(this instanceof MyDuplex))
    return new MyDuplex(options);
  Duplex.call(this, options);
}
util.inherits(MyDuplex, Duplex);
```

Or, using the simplified constructor approach:

```js
const { Duplex } = require('stream');

const myDuplex = new Duplex({
  read(size) {
    // ...
  },
  write(chunk, encoding, callback) {
    // ...
  }
});
```

#### An example duplex stream

The following illustrates a simple example of a `Duplex` stream that wraps a
hypothetical lower-level source object to which data can be written, and
from which data can be read, albeit using an API that is not compatible with
Node.js streams.

```js
const { Duplex } = require('stream');
const kSource = Symbol('source');

class MyDuplex extends Duplex {
  constructor(source, options) {
    super(options);
    this[kSource] = source;
  }

  _write(chunk, encoding, callback) {
    // The underlying source only deals with strings.
    if (Buffer.isBuffer(chunk))
      chunk = chunk.toString();
    this[kSource].writeSomeData(chunk);
    callback();
  }

  _read(size) {
    this[kSource].fetchSomeData(size, (data, encoding) => {
      this.push(Buffer.from(data, encoding));
    });
  }
}
```

The most important aspect of a `Duplex` stream is that the `Readable` and
`Writable` sides operate independently of one another despite co-existing within
a single object instance.

#### Object mode duplex streams

For `Duplex` streams, `objectMode` can be set exclusively for either the
`Readable` or `Writable` side using the `readableObjectMode` and
`writableObjectMode` options respectively.

In the following example, for instance, a new `Transform` stream (which is a
type of [`Duplex`][] stream) is created that has an object mode `Writable` side
that accepts JavaScript numbers that are converted to hexadecimal strings on
the `Readable` side.

```js
const { Transform } = require('stream');

// All Transform streams are also Duplex Streams.
const myTransform = new Transform({
  writableObjectMode: true,

  transform(chunk, encoding, callback) {
    // Coerce the chunk to a number if necessary.
    chunk |= 0;

    // Transform the chunk into something else.
    const data = chunk.toString(16);

    // Push the data onto the readable queue.
    callback(null, '0'.repeat(data.length % 2) + data);
  }
});

myTransform.setEncoding('ascii');
myTransform.on('data', (chunk) => console.log(chunk));

myTransform.write(1);
// Prints: 01
myTransform.write(10);
// Prints: 0a
myTransform.write(100);
// Prints: 64
```

### Implementing a transform stream

A [`Transform`][] stream is a [`Duplex`][] stream where the output is computed
in some way from the input. Examples include [zlib][] streams or [crypto][]
streams that compress, encrypt, or decrypt data.

There is no requirement that the output be the same size as the input, the same
number of chunks, or arrive at the same time. For example, a `Hash` stream will
only ever have a single chunk of output which is provided when the input is
ended. A `zlib` stream will produce output that is either much smaller or much
larger than its input.

The `stream.Transform` class is extended to implement a [`Transform`][] stream.

The `stream.Transform` class prototypically inherits from `stream.Duplex` and
implements its own versions of the `writable._write()` and
[`readable._read()`][] methods. Custom `Transform` implementations *must*
implement the [`transform._transform()`][stream-_transform] method and *may*
also implement the [`transform._flush()`][stream-_flush] method.

Care must be taken when using `Transform` streams in that data written to the
stream can cause the `Writable` side of the stream to become paused if the
output on the `Readable` side is not consumed.

#### `new stream.Transform([options])`

* `options` {Object} Passed to both `Writable` and `Readable`
  constructors. Also has the following fields:
  * `transform` {Function} Implementation for the
    [`stream._transform()`][stream-_transform] method.
  * `flush` {Function} Implementation for the [`stream._flush()`][stream-_flush]
    method.

<!-- eslint-disable no-useless-constructor -->
```js
const { Transform } = require('stream');

class MyTransform extends Transform {
  constructor(options) {
    super(options);
    // ...
  }
}
```

Or, when using pre-ES6 style constructors:

```js
const { Transform } = require('stream');
const util = require('util');

function MyTransform(options) {
  if (!(this instanceof MyTransform))
    return new MyTransform(options);
  Transform.call(this, options);
}
util.inherits(MyTransform, Transform);
```

Or, using the simplified constructor approach:

```js
const { Transform } = require('stream');

const myTransform = new Transform({
  transform(chunk, encoding, callback) {
    // ...
  }
});
```

#### Events: `'finish'` and `'end'`

The [`'finish'`][] and [`'end'`][] events are from the `stream.Writable`
and `stream.Readable` classes, respectively. The `'finish'` event is emitted
after [`stream.end()`][stream-end] is called and all chunks have been processed
by [`stream._transform()`][stream-_transform]. The `'end'` event is emitted
after all data has been output, which occurs after the callback in
[`transform._flush()`][stream-_flush] has been called.

#### `transform._flush(callback)`

* `callback` {Function} A callback function (optionally with an error
  argument and data) to be called when remaining data has been flushed.

This function MUST NOT be called by application code directly. It should be
implemented by child classes, and called by the internal `Readable` class
methods only.

In some cases, a transform operation may need to emit an additional bit of
data at the end of the stream. For example, a `zlib` compression stream will
store an amount of internal state used to optimally compress the output. When
the stream ends, however, that additional data needs to be flushed so that the
compressed data will be complete.

Custom [`Transform`][] implementations *may* implement the `transform._flush()`
method. This will be called when there is no more written data to be consumed,
but before the [`'end'`][] event is emitted signaling the end of the
[`Readable`][] stream.

Within the `transform._flush()` implementation, the `transform.push()` method
may be called zero or more times, as appropriate. The `callback` function must
be called when the flush operation is complete.
2564
2565The `transform._flush()` method is prefixed with an underscore because it is
2566internal to the class that defines it, and should never be called directly by
2567user programs.
2568
2569#### `transform._transform(chunk, encoding, callback)`
2570
2571* `chunk` {Buffer|string|any} The `Buffer` to be transformed, converted from
2572  the `string` passed to [`stream.write()`][stream-write]. If the stream's
2573  `decodeStrings` option is `false` or the stream is operating in object mode,
2574  the chunk will not be converted & will be whatever was passed to
2575  [`stream.write()`][stream-write].
2576* `encoding` {string} If the chunk is a string, then this is the
2577  encoding type. If chunk is a buffer, then this is the special
2578  value `'buffer'`. Ignore it in that case.
2579* `callback` {Function} A callback function (optionally with an error
2580  argument and data) to be called after the supplied `chunk` has been
2581  processed.
2582
2583This function MUST NOT be called by application code directly. It should be
2584implemented by child classes, and called by the internal `Readable` class
2585methods only.
2586
2587All `Transform` stream implementations must provide a `_transform()`
2588method to accept input and produce output. The `transform._transform()`
2589implementation handles the bytes being written, computes an output, then passes
2590that output off to the readable portion using the `transform.push()` method.
2591
2592The `transform.push()` method may be called zero or more times to generate
2593output from a single input chunk, depending on how much is to be output
2594as a result of the chunk.
2595
2596It is possible that no output is generated from any given chunk of input data.
2597
The `callback` function must be called only when the current chunk is completely
consumed. The first argument passed to the `callback` must be an `Error` object
if an error occurred while processing the input or `null` otherwise. If a second
argument is passed to the `callback`, it will be forwarded on to the
`transform.push()` method. In other words, the following are equivalent:

```js
transform.prototype._transform = function(data, encoding, callback) {
  this.push(data);
  callback();
};

transform.prototype._transform = function(data, encoding, callback) {
  callback(null, data);
};
```

The `transform._transform()` method is prefixed with an underscore because it
is internal to the class that defines it, and should never be called directly by
user programs.

`transform._transform()` is never called in parallel; streams implement a
queue mechanism, and to receive the next chunk, `callback` must be
called, either synchronously or asynchronously.

#### Class: `stream.PassThrough`

The `stream.PassThrough` class is a trivial implementation of a [`Transform`][]
stream that simply passes the input bytes across to the output. Its purpose is
primarily for examples and testing, but there are some use cases where
`stream.PassThrough` is useful as a building block for novel sorts of streams.

## Additional notes

<!--type=misc-->

### Streams compatibility with async generators and async iterators

With the support of async generators and async iterators in JavaScript, async
generators are effectively a first-class, language-level stream construct.

Some common interop cases of using Node.js streams with async generators
and async iterators are provided below.

#### Consuming readable streams with async iterators

```js
// 'readable' is assumed to be an existing stream.Readable instance.
(async function() {
  for await (const chunk of readable) {
    console.log(chunk);
  }
})();
```

Async iterators register a permanent error handler on the stream to prevent any
unhandled post-destroy errors.

#### Creating readable streams with async generators

We can construct a Node.js readable stream from an asynchronous generator
using the `Readable.from()` utility method:

```js
const { Readable } = require('stream');

async function * generate() {
  yield 'a';
  yield 'b';
  yield 'c';
}

const readable = Readable.from(generate());

readable.on('data', (chunk) => {
  console.log(chunk);
});
```

#### Piping to writable streams from async iterators

When writing to a writable stream from an async iterator, ensure correct
handling of backpressure and errors.

```js
const { once } = require('events');
const fs = require('fs');
const stream = require('stream');
const util = require('util');
const finished = util.promisify(stream.finished);

const writable = fs.createWriteStream('./file');

function drain(writable) {
  if (writable.destroyed) {
    return Promise.reject(new Error('premature close'));
  }
  return Promise.race([
    once(writable, 'drain'),
    once(writable, 'close')
      .then(() => Promise.reject(new Error('premature close')))
  ]);
}

async function pump(iterable, writable) {
  for await (const chunk of iterable) {
    // Handle backpressure on write().
    if (!writable.write(chunk)) {
      await drain(writable);
    }
  }
  writable.end();
}

(async function() {
  // 'iterable' is assumed to be defined elsewhere as an async iterable.
  // Ensure completion without errors.
  await Promise.all([
    pump(iterable, writable),
    finished(writable)
  ]);
})();
```

In the above, errors on `write()` are caught and thrown by the `once()`
listener for the `'drain'` event, since `once()` also handles the `'error'`
event. To ensure that the write stream completes without errors, it is safer
to use the `finished()` method as above instead of a `once()` listener for the
`'finish'` event. In certain cases, an `'error'` event could be emitted by the
writable stream after `'finish'`, and because `once()` releases its `'error'`
handler when it handles the `'finish'` event, this could result in an
unhandled error.

Alternatively, the readable stream could be wrapped with `Readable.from()` and
then piped via `.pipe()`:

```js
const fs = require('fs');
const util = require('util');
const stream = require('stream');
const { Readable } = stream;
const finished = util.promisify(stream.finished);

const writable = fs.createWriteStream('./file');

(async function() {
  // 'iterable' is assumed to be defined elsewhere as an async iterable.
  const readable = Readable.from(iterable);
  readable.pipe(writable);
  // Ensure completion without errors.
  await finished(writable);
})();
```

Or, using `stream.pipeline()` to pipe streams:

```js
const fs = require('fs');
const util = require('util');
const stream = require('stream');
const { Readable } = stream;
const pipeline = util.promisify(stream.pipeline);

const writable = fs.createWriteStream('./file');

(async function() {
  // 'iterable' is assumed to be defined elsewhere as an async iterable.
  const readable = Readable.from(iterable);
  await pipeline(readable, writable);
})();
```

<!--type=misc-->

### Compatibility with older Node.js versions

<!--type=misc-->

Prior to Node.js 0.10, the `Readable` stream interface was simpler, but also
less powerful and less useful.

* Rather than waiting for calls to the [`stream.read()`][stream-read] method,
  [`'data'`][] events would begin emitting immediately. Applications that
  would need to perform some amount of work to decide how to handle data
  were required to store read data into buffers so the data would not be lost.
* The [`stream.pause()`][stream-pause] method was advisory, rather than
  guaranteed. This meant that it was still necessary to be prepared to receive
  [`'data'`][] events *even when the stream was in a paused state*.

In Node.js 0.10, the [`Readable`][] class was added. For backward
compatibility with older Node.js programs, `Readable` streams switch into
"flowing mode" when a [`'data'`][] event handler is added, or when the
[`stream.resume()`][stream-resume] method is called. The effect is that, even
when not using the new [`stream.read()`][stream-read] method and
[`'readable'`][] event, it is no longer necessary to worry about losing
[`'data'`][] chunks.

While most applications will continue to function normally, this introduces an
edge case in the following conditions:

* No [`'data'`][] event listener is added.
* The [`stream.resume()`][stream-resume] method is never called.
* The stream is not piped to any writable destination.

For example, consider the following code:

```js
// WARNING!  BROKEN!
const net = require('net');

net.createServer((socket) => {

  // We add an 'end' listener, but never consume the data.
  socket.on('end', () => {
    // It will never get here.
    socket.end('The message was received but was not processed.\n');
  });

}).listen(1337);
```

Prior to Node.js 0.10, the incoming message data would simply be discarded.
In Node.js 0.10 and later, however, the socket remains paused forever.

The workaround in this situation is to call the
[`stream.resume()`][stream-resume] method to begin the flow of data:

```js
// Workaround.
const net = require('net');

net.createServer((socket) => {
  socket.on('end', () => {
    socket.end('The message was received but was not processed.\n');
  });

  // Start the flow of data, discarding it.
  socket.resume();
}).listen(1337);
```

In addition to new `Readable` streams switching into flowing mode,
pre-0.10 style streams can be wrapped in a `Readable` class using the
[`readable.wrap()`][`stream.wrap()`] method.

### `readable.read(0)`

There are some cases where it is necessary to trigger a refresh of the
underlying readable stream mechanisms, without actually consuming any
data. In such cases, it is possible to call `readable.read(0)`, which will
always return `null`.

If the internal read buffer is below the `highWaterMark`, and the
stream is not currently reading, then calling `stream.read(0)` will trigger
a low-level [`stream._read()`][stream-_read] call.

While most applications will almost never need to do this, there are
situations within Node.js where this is done, particularly in the
`Readable` stream class internals.

### `readable.push('')`

Use of `readable.push('')` is not recommended.

Pushing a zero-byte string, `Buffer` or `Uint8Array` to a stream that is not in
object mode has an interesting side effect. Because it *is* a call to
[`readable.push()`][stream-push], the call will end the reading process.
However, because the argument is an empty string, no data is added to the
readable buffer so there is nothing for a user to consume.

### `highWaterMark` discrepancy after calling `readable.setEncoding()`

Using `readable.setEncoding()` changes how the `highWaterMark` operates in
non-object mode.

Typically, the size of the current buffer is measured against the
`highWaterMark` in _bytes_. However, after `setEncoding()` is called, the
comparison function will begin to measure the buffer's size in _characters_.

This is not a problem in common cases with `latin1` or `ascii` encodings. It
is advisable to be mindful of this behavior when working with strings that
could contain multi-byte characters.

[`'data'`]: #stream_event_data
[`'drain'`]: #stream_event_drain
[`'end'`]: #stream_event_end
[`'finish'`]: #stream_event_finish
[`'readable'`]: #stream_event_readable
[`Duplex`]: #stream_class_stream_duplex
[`EventEmitter`]: events.html#events_class_eventemitter
[`Readable`]: #stream_class_stream_readable
[`Symbol.hasInstance`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Symbol/hasInstance
[`Transform`]: #stream_class_stream_transform
[`Writable`]: #stream_class_stream_writable
[`fs.createReadStream()`]: fs.html#fs_fs_createreadstream_path_options
[`fs.createWriteStream()`]: fs.html#fs_fs_createwritestream_path_options
[`net.Socket`]: net.html#net_class_net_socket
[`process.stderr`]: process.html#process_process_stderr
[`process.stdin`]: process.html#process_process_stdin
[`process.stdout`]: process.html#process_process_stdout
[`readable._read()`]: #stream_readable_read_size_1
[`readable.push('')`]: #stream_readable_push
[`readable.setEncoding()`]: #stream_readable_setencoding_encoding
[`stream.Readable.from()`]: #stream_stream_readable_from_iterable_options
[`stream.cork()`]: #stream_writable_cork
[`stream.finished()`]: #stream_stream_finished_stream_options_callback
[`stream.pipe()`]: #stream_readable_pipe_destination_options
[`stream.pipeline()`]: #stream_stream_pipeline_streams_callback
[`stream.uncork()`]: #stream_writable_uncork
[`stream.unpipe()`]: #stream_readable_unpipe_destination
[`stream.wrap()`]: #stream_readable_wrap_stream
[`writable._final()`]: #stream_writable_final_callback
[`writable._write()`]: #stream_writable_write_chunk_encoding_callback_1
[`writable._writev()`]: #stream_writable_writev_chunks_callback
[`writable.cork()`]: #stream_writable_cork
[`writable.end()`]: #stream_writable_end_chunk_encoding_callback
[`writable.uncork()`]: #stream_writable_uncork
[`writable.writableFinished`]: #stream_writable_writablefinished
[`zlib.createDeflate()`]: zlib.html#zlib_zlib_createdeflate_options
[API for stream consumers]: #stream_api_for_stream_consumers
[API for stream implementers]: #stream_api_for_stream_implementers
[Compatibility]: #stream_compatibility_with_older_node_js_versions
[HTTP requests, on the client]: http.html#http_class_http_clientrequest
[HTTP responses, on the server]: http.html#http_class_http_serverresponse
[TCP sockets]: net.html#net_class_net_socket
[child process stdin]: child_process.html#child_process_subprocess_stdin
[child process stdout and stderr]: child_process.html#child_process_subprocess_stdout
[crypto]: crypto.html
[fs read streams]: fs.html#fs_class_fs_readstream
[fs write streams]: fs.html#fs_class_fs_writestream
[http-incoming-message]: http.html#http_class_http_incomingmessage
[hwm-gotcha]: #stream_highwatermark_discrepancy_after_calling_readable_setencoding
[object-mode]: #stream_object_mode
[readable-_destroy]: #stream_readable_destroy_err_callback
[readable-destroy]: #stream_readable_destroy_error
[stream-_final]: #stream_writable_final_callback
[stream-_flush]: #stream_transform_flush_callback
[stream-_read]: #stream_readable_read_size_1
[stream-_transform]: #stream_transform_transform_chunk_encoding_callback
[stream-_write]: #stream_writable_write_chunk_encoding_callback_1
[stream-_writev]: #stream_writable_writev_chunks_callback
[stream-end]: #stream_writable_end_chunk_encoding_callback
[stream-pause]: #stream_readable_pause
[stream-push]: #stream_readable_push_chunk_encoding
[stream-read]: #stream_readable_read_size
[stream-resume]: #stream_readable_resume
[stream-uncork]: #stream_writable_uncork
[stream-write]: #stream_writable_write_chunk_encoding_callback
[Three states]: #stream_three_states
[writable-_destroy]: #stream_writable_destroy_err_callback
[writable-destroy]: #stream_writable_destroy_error
[writable-new]: #stream_new_stream_writable_options
[zlib]: zlib.html