• Home
  • Line#
  • Scopes#
  • Navigate#
  • Raw
  • Download
1# Stream
2
3<!--introduced_in=v0.10.0-->
4
5> Stability: 2 - Stable
6
7<!-- source_link=lib/stream.js -->
8
9A stream is an abstract interface for working with streaming data in Node.js.
10The `stream` module provides an API for implementing the stream interface.
11
12There are many stream objects provided by Node.js. For instance, a
13[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
14are both stream instances.
15
16Streams can be readable, writable, or both. All streams are instances of
17[`EventEmitter`][].
18
19To access the `stream` module:
20
21```js
22const stream = require('stream');
23```
24
25The `stream` module is useful for creating new types of stream instances. It is
26usually not necessary to use the `stream` module to consume streams.
27
28## Organization of this document
29
30This document contains two primary sections and a third section for notes. The
31first section explains how to use existing streams within an application. The
32second section explains how to create new types of streams.
33
34## Types of streams
35
36There are four fundamental stream types within Node.js:
37
38* [`Writable`][]: streams to which data can be written (for example,
39  [`fs.createWriteStream()`][]).
40* [`Readable`][]: streams from which data can be read (for example,
41  [`fs.createReadStream()`][]).
42* [`Duplex`][]: streams that are both `Readable` and `Writable` (for example,
43  [`net.Socket`][]).
44* [`Transform`][]: `Duplex` streams that can modify or transform the data as it
45  is written and read (for example, [`zlib.createDeflate()`][]).
46
47Additionally, this module includes the utility functions
48[`stream.pipeline()`][], [`stream.finished()`][] and
49[`stream.Readable.from()`][].
50
51### Object mode
52
53All streams created by Node.js APIs operate exclusively on strings and `Buffer`
54(or `Uint8Array`) objects. It is possible, however, for stream implementations
55to work with other types of JavaScript values (with the exception of `null`,
56which serves a special purpose within streams). Such streams are considered to
57operate in "object mode".
58
59Stream instances are switched into object mode using the `objectMode` option
60when the stream is created. Attempting to switch an existing stream into
61object mode is not safe.
62
63### Buffering
64
65<!--type=misc-->
66
67Both [`Writable`][] and [`Readable`][] streams will store data in an internal
68buffer.
69
70The amount of data potentially buffered depends on the `highWaterMark` option
71passed into the stream's constructor. For normal streams, the `highWaterMark`
72option specifies a [total number of bytes][hwm-gotcha]. For streams operating
73in object mode, the `highWaterMark` specifies a total number of objects.
74
75Data is buffered in `Readable` streams when the implementation calls
76[`stream.push(chunk)`][stream-push]. If the consumer of the Stream does not
77call [`stream.read()`][stream-read], the data will sit in the internal
78queue until it is consumed.
79
80Once the total size of the internal read buffer reaches the threshold specified
81by `highWaterMark`, the stream will temporarily stop reading data from the
82underlying resource until the data currently buffered can be consumed (that is,
83the stream will stop calling the internal [`readable._read()`][] method that is
84used to fill the read buffer).
85
86Data is buffered in `Writable` streams when the
87[`writable.write(chunk)`][stream-write] method is called repeatedly. While the
88total size of the internal write buffer is below the threshold set by
89`highWaterMark`, calls to `writable.write()` will return `true`. Once
90the size of the internal buffer reaches or exceeds the `highWaterMark`, `false`
91will be returned.
92
93A key goal of the `stream` API, particularly the [`stream.pipe()`][] method,
94is to limit the buffering of data to acceptable levels such that sources and
95destinations of differing speeds will not overwhelm the available memory.
96
97The `highWaterMark` option is a threshold, not a limit: it dictates the amount
98of data that a stream buffers before it stops asking for more data. It does not
99enforce a strict memory limitation in general. Specific stream implementations
100may choose to enforce stricter limits but doing so is optional.
101
102Because [`Duplex`][] and [`Transform`][] streams are both `Readable` and
103`Writable`, each maintains *two* separate internal buffers used for reading and
104writing, allowing each side to operate independently of the other while
105maintaining an appropriate and efficient flow of data. For example,
106[`net.Socket`][] instances are [`Duplex`][] streams whose `Readable` side allows
107consumption of data received *from* the socket and whose `Writable` side allows
108writing data *to* the socket. Because data may be written to the socket at a
109faster or slower rate than data is received, each side should
110operate (and buffer) independently of the other.
111
112The mechanics of the internal buffering are an internal implementation detail
113and may be changed at any time. However, for certain advanced implementations,
114the internal buffers can be retrieved using `writable.writableBuffer` or
115`readable.readableBuffer`. Use of these undocumented properties is discouraged.
116
117## API for stream consumers
118
119<!--type=misc-->
120
121Almost all Node.js applications, no matter how simple, use streams in some
122manner. The following is an example of using streams in a Node.js application
123that implements an HTTP server:
124
125```js
126const http = require('http');
127
128const server = http.createServer((req, res) => {
129  // `req` is an http.IncomingMessage, which is a readable stream.
130  // `res` is an http.ServerResponse, which is a writable stream.
131
132  let body = '';
133  // Get the data as utf8 strings.
134  // If an encoding is not set, Buffer objects will be received.
135  req.setEncoding('utf8');
136
137  // Readable streams emit 'data' events once a listener is added.
138  req.on('data', (chunk) => {
139    body += chunk;
140  });
141
142  // The 'end' event indicates that the entire body has been received.
143  req.on('end', () => {
144    try {
145      const data = JSON.parse(body);
146      // Write back something interesting to the user:
147      res.write(typeof data);
148      res.end();
149    } catch (er) {
150      // uh oh! bad json!
151      res.statusCode = 400;
152      return res.end(`error: ${er.message}`);
153    }
154  });
155});
156
157server.listen(1337);
158
159// $ curl localhost:1337 -d "{}"
160// object
161// $ curl localhost:1337 -d "\"foo\""
162// string
163// $ curl localhost:1337 -d "not json"
164// error: Unexpected token o in JSON at position 1
165```
166
167[`Writable`][] streams (such as `res` in the example) expose methods such as
168`write()` and `end()` that are used to write data onto the stream.
169
170[`Readable`][] streams use the [`EventEmitter`][] API for notifying application
171code when data is available to be read off the stream. That available data can
172be read from the stream in multiple ways.
173
174Both [`Writable`][] and [`Readable`][] streams use the [`EventEmitter`][] API in
175various ways to communicate the current state of the stream.
176
177[`Duplex`][] and [`Transform`][] streams are both [`Writable`][] and
178[`Readable`][].
179
180Applications that are either writing data to or consuming data from a stream
181are not required to implement the stream interfaces directly and will generally
182have no reason to call `require('stream')`.
183
184Developers wishing to implement new types of streams should refer to the
185section [API for stream implementers][].
186
187### Writable streams
188
189Writable streams are an abstraction for a *destination* to which data is
190written.
191
192Examples of [`Writable`][] streams include:
193
194* [HTTP requests, on the client][]
195* [HTTP responses, on the server][]
196* [fs write streams][]
197* [zlib streams][zlib]
198* [crypto streams][crypto]
199* [TCP sockets][]
200* [child process stdin][]
201* [`process.stdout`][], [`process.stderr`][]
202
203Some of these examples are actually [`Duplex`][] streams that implement the
204[`Writable`][] interface.
205
206All [`Writable`][] streams implement the interface defined by the
207`stream.Writable` class.
208
209While specific instances of [`Writable`][] streams may differ in various ways,
210all `Writable` streams follow the same fundamental usage pattern as illustrated
211in the example below:
212
213```js
214const myStream = getWritableStreamSomehow();
215myStream.write('some data');
216myStream.write('some more data');
217myStream.end('done writing data');
218```
219
220#### Class: `stream.Writable`
221<!-- YAML
222added: v0.9.4
223-->
224
225<!--type=class-->
226
227##### Event: `'close'`
228<!-- YAML
229added: v0.9.4
230changes:
231  - version: v10.0.0
232    pr-url: https://github.com/nodejs/node/pull/18438
233    description: Add `emitClose` option to specify if `'close'` is emitted on
234                 destroy.
235-->
236
237The `'close'` event is emitted when the stream and any of its underlying
238resources (a file descriptor, for example) have been closed. The event indicates
239that no more events will be emitted, and no further computation will occur.
240
241A [`Writable`][] stream will always emit the `'close'` event if it is
242created with the `emitClose` option.
243
244##### Event: `'drain'`
245<!-- YAML
246added: v0.9.4
247-->
248
249If a call to [`stream.write(chunk)`][stream-write] returns `false`, the
250`'drain'` event will be emitted when it is appropriate to resume writing data
251to the stream.
252
253```js
254// Write the data to the supplied writable stream one million times.
255// Be attentive to back-pressure.
256function writeOneMillionTimes(writer, data, encoding, callback) {
257  let i = 1000000;
258  write();
259  function write() {
260    let ok = true;
261    do {
262      i--;
263      if (i === 0) {
264        // Last time!
265        writer.write(data, encoding, callback);
266      } else {
267        // See if we should continue, or wait.
268        // Don't pass the callback, because we're not done yet.
269        ok = writer.write(data, encoding);
270      }
271    } while (i > 0 && ok);
272    if (i > 0) {
273      // Had to stop early!
274      // Write some more once it drains.
275      writer.once('drain', write);
276    }
277  }
278}
279```
280
281##### Event: `'error'`
282<!-- YAML
283added: v0.9.4
284-->
285
286* {Error}
287
288The `'error'` event is emitted if an error occurred while writing or piping
289data. The listener callback is passed a single `Error` argument when called.
290
291The stream is closed when the `'error'` event is emitted unless the
292[`autoDestroy`][writable-new] option was set to `false` when creating the
293stream.
294
295After `'error'`, no further events other than `'close'` *should* be emitted
296(including `'error'` events).
297
298##### Event: `'finish'`
299<!-- YAML
300added: v0.9.4
301-->
302
303The `'finish'` event is emitted after the [`stream.end()`][stream-end] method
304has been called, and all data has been flushed to the underlying system.
305
306```js
307const writer = getWritableStreamSomehow();
308for (let i = 0; i < 100; i++) {
309  writer.write(`hello, #${i}!\n`);
310}
311writer.on('finish', () => {
312  console.log('All writes are now complete.');
313});
314writer.end('This is the end\n');
315```
316
317##### Event: `'pipe'`
318<!-- YAML
319added: v0.9.4
320-->
321
322* `src` {stream.Readable} source stream that is piping to this writable
323
324The `'pipe'` event is emitted when the [`stream.pipe()`][] method is called on
325a readable stream, adding this writable to its set of destinations.
326
327```js
328const writer = getWritableStreamSomehow();
329const reader = getReadableStreamSomehow();
330writer.on('pipe', (src) => {
331  console.log('Something is piping into the writer.');
332  assert.equal(src, reader);
333});
334reader.pipe(writer);
335```
336
337##### Event: `'unpipe'`
338<!-- YAML
339added: v0.9.4
340-->
341
342* `src` {stream.Readable} The source stream that
343  [unpiped][`stream.unpipe()`] this writable
344
345The `'unpipe'` event is emitted when the [`stream.unpipe()`][] method is called
346on a [`Readable`][] stream, removing this [`Writable`][] from its set of
347destinations.
348
349This is also emitted in case this [`Writable`][] stream emits an error when a
350[`Readable`][] stream pipes into it.
351
352```js
353const writer = getWritableStreamSomehow();
354const reader = getReadableStreamSomehow();
355writer.on('unpipe', (src) => {
356  console.log('Something has stopped piping into the writer.');
357  assert.equal(src, reader);
358});
359reader.pipe(writer);
360reader.unpipe(writer);
361```
362
363##### `writable.cork()`
364<!-- YAML
365added: v0.11.2
366-->
367
368The `writable.cork()` method forces all written data to be buffered in memory.
369The buffered data will be flushed when either the [`stream.uncork()`][] or
370[`stream.end()`][stream-end] methods are called.
371
372The primary intent of `writable.cork()` is to accommodate a situation in which
373several small chunks are written to the stream in rapid succession. Instead of
374immediately forwarding them to the underlying destination, `writable.cork()`
375buffers all the chunks until `writable.uncork()` is called, which will pass them
376all to `writable._writev()`, if present. This prevents a head-of-line blocking
377situation where data is being buffered while waiting for the first small chunk
378to be processed. However, use of `writable.cork()` without implementing
379`writable._writev()` may have an adverse effect on throughput.
380
381See also: [`writable.uncork()`][], [`writable._writev()`][stream-_writev].
382
383##### `writable.destroy([error])`
384<!-- YAML
385added: v8.0.0
386changes:
387  - version: v14.0.0
388    pr-url: https://github.com/nodejs/node/pull/29197
389    description: Work as a no-op on a stream that has already been destroyed.
390-->
391
392* `error` {Error} Optional, an error to emit with `'error'` event.
393* Returns: {this}
394
395Destroy the stream. Optionally emit an `'error'` event, and emit a `'close'`
396event (unless `emitClose` is set to `false`). After this call, the writable
397stream has ended and subsequent calls to `write()` or `end()` will result in
398an `ERR_STREAM_DESTROYED` error.
399This is a destructive and immediate way to destroy a stream. Previous calls to
400`write()` may not have drained, and may trigger an `ERR_STREAM_DESTROYED` error.
401Use `end()` instead of destroy if data should flush before close, or wait for
402the `'drain'` event before destroying the stream.
403
404```cjs
405const { Writable } = require('stream');
406
407const myStream = new Writable();
408
409const fooErr = new Error('foo error');
410myStream.destroy(fooErr);
411myStream.on('error', (fooErr) => console.error(fooErr.message)); // foo error
412```
413
414```cjs
415const { Writable } = require('stream');
416
417const myStream = new Writable();
418
419myStream.destroy();
420myStream.on('error', function wontHappen() {});
421```
422
423```cjs
424const { Writable } = require('stream');
425
426const myStream = new Writable();
427myStream.destroy();
428
429myStream.write('foo', (error) => console.error(error.code));
430// ERR_STREAM_DESTROYED
431```
432
433Once `destroy()` has been called any further calls will be a no-op and no
434further errors except from `_destroy()` may be emitted as `'error'`.
435
436Implementors should not override this method,
437but instead implement [`writable._destroy()`][writable-_destroy].
438
439##### `writable.destroyed`
440<!-- YAML
441added: v8.0.0
442-->
443
444* {boolean}
445
446Is `true` after [`writable.destroy()`][writable-destroy] has been called.
447
448```cjs
449const { Writable } = require('stream');
450
451const myStream = new Writable();
452
453console.log(myStream.destroyed); // false
454myStream.destroy();
455console.log(myStream.destroyed); // true
456```
457
458##### `writable.end([chunk[, encoding]][, callback])`
459<!-- YAML
460added: v0.9.4
461changes:
462  - version: v14.0.0
463    pr-url: https://github.com/nodejs/node/pull/29747
464    description: The `callback` is invoked if 'finish' or 'error' is emitted.
465  - version: v10.0.0
466    pr-url: https://github.com/nodejs/node/pull/18780
467    description: This method now returns a reference to `writable`.
468  - version: v8.0.0
469    pr-url: https://github.com/nodejs/node/pull/11608
470    description: The `chunk` argument can now be a `Uint8Array` instance.
471-->
472
473* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
474  not operating in object mode, `chunk` must be a string, `Buffer` or
475  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
476  other than `null`.
477* `encoding` {string} The encoding if `chunk` is a string
478* `callback` {Function} Optional callback for when the stream finishes
479  or errors
480* Returns: {this}
481
482Calling the `writable.end()` method signals that no more data will be written
483to the [`Writable`][]. The optional `chunk` and `encoding` arguments allow one
484final additional chunk of data to be written immediately before closing the
485stream. If provided, the optional `callback` function is attached as a listener
486for the [`'finish'`][] and the `'error'` event.
487
488Calling the [`stream.write()`][stream-write] method after calling
489[`stream.end()`][stream-end] will raise an error.
490
491```js
492// Write 'hello, ' and then end with 'world!'.
493const fs = require('fs');
494const file = fs.createWriteStream('example.txt');
495file.write('hello, ');
496file.end('world!');
497// Writing more now is not allowed!
498```
499
500##### `writable.setDefaultEncoding(encoding)`
501<!-- YAML
502added: v0.11.15
503changes:
504  - version: v6.1.0
505    pr-url: https://github.com/nodejs/node/pull/5040
506    description: This method now returns a reference to `writable`.
507-->
508
509* `encoding` {string} The new default encoding
510* Returns: {this}
511
512The `writable.setDefaultEncoding()` method sets the default `encoding` for a
513[`Writable`][] stream.
514
515##### `writable.uncork()`
516<!-- YAML
517added: v0.11.2
518-->
519
520The `writable.uncork()` method flushes all data buffered since
521[`stream.cork()`][] was called.
522
523When using [`writable.cork()`][] and `writable.uncork()` to manage the buffering
524of writes to a stream, it is recommended that calls to `writable.uncork()` be
525deferred using `process.nextTick()`. Doing so allows batching of all
526`writable.write()` calls that occur within a given Node.js event loop phase.
527
528```js
529stream.cork();
530stream.write('some ');
531stream.write('data ');
532process.nextTick(() => stream.uncork());
533```
534
535If the [`writable.cork()`][] method is called multiple times on a stream, the
536same number of calls to `writable.uncork()` must be called to flush the buffered
537data.
538
539```js
540stream.cork();
541stream.write('some ');
542stream.cork();
543stream.write('data ');
544process.nextTick(() => {
545  stream.uncork();
546  // The data will not be flushed until uncork() is called a second time.
547  stream.uncork();
548});
549```
550
551See also: [`writable.cork()`][].
552
553##### `writable.writable`
554<!-- YAML
555added: v11.4.0
556-->
557
558* {boolean}
559
560Is `true` if it is safe to call [`writable.write()`][stream-write], which means
561the stream has not been destroyed, errored or ended.
562
563##### `writable.writableEnded`
564<!-- YAML
565added: v12.9.0
566-->
567
568* {boolean}
569
570Is `true` after [`writable.end()`][] has been called. This property
571does not indicate whether the data has been flushed, for this use
572[`writable.writableFinished`][] instead.
573
574##### `writable.writableCorked`
575<!-- YAML
576added:
577 - v13.2.0
578 - v12.16.0
579-->
580
581* {integer}
582
583Number of times [`writable.uncork()`][stream-uncork] needs to be
584called in order to fully uncork the stream.
585
586##### `writable.writableFinished`
587<!-- YAML
588added: v12.6.0
589-->
590
591* {boolean}
592
593Is set to `true` immediately before the [`'finish'`][] event is emitted.
594
595##### `writable.writableHighWaterMark`
596<!-- YAML
597added: v9.3.0
598-->
599
600* {number}
601
602Return the value of `highWaterMark` passed when constructing this
603`Writable`.
604
605##### `writable.writableLength`
606<!-- YAML
607added: v9.4.0
608-->
609
610* {number}
611
612This property contains the number of bytes (or objects) in the queue
613ready to be written. The value provides introspection data regarding
614the status of the `highWaterMark`.
615
616##### `writable.writableNeedDrain`
617<!-- YAML
618added: v14.17.0
619-->
620
621* {boolean}
622
623Is `true` if the stream's buffer has been full and stream will emit `'drain'`.
624
625##### `writable.writableObjectMode`
626<!-- YAML
627added: v12.3.0
628-->
629
630* {boolean}
631
632Getter for the property `objectMode` of a given `Writable` stream.
633
634##### `writable.write(chunk[, encoding][, callback])`
635<!-- YAML
636added: v0.9.4
637changes:
638  - version: v8.0.0
639    pr-url: https://github.com/nodejs/node/pull/11608
640    description: The `chunk` argument can now be a `Uint8Array` instance.
641  - version: v6.0.0
642    pr-url: https://github.com/nodejs/node/pull/6170
643    description: Passing `null` as the `chunk` parameter will always be
644                 considered invalid now, even in object mode.
645-->
646
647* `chunk` {string|Buffer|Uint8Array|any} Optional data to write. For streams
648  not operating in object mode, `chunk` must be a string, `Buffer` or
649  `Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
650  other than `null`.
651* `encoding` {string|null} The encoding, if `chunk` is a string. **Default:** `'utf8'`
652* `callback` {Function} Callback for when this chunk of data is flushed.
653* Returns: {boolean} `false` if the stream wishes for the calling code to
654  wait for the `'drain'` event to be emitted before continuing to write
655  additional data; otherwise `true`.
656
657The `writable.write()` method writes some data to the stream, and calls the
658supplied `callback` once the data has been fully handled. If an error
659occurs, the `callback` *may or may not* be called with the error as its
660first argument. To reliably detect write errors, add a listener for the
661`'error'` event. The `callback` is called asynchronously and before `'error'` is
662emitted.
663
664The return value is `true` if the internal buffer is less than the
665`highWaterMark` configured when the stream was created after admitting `chunk`.
666If `false` is returned, further attempts to write data to the stream should
667stop until the [`'drain'`][] event is emitted.
668
669While a stream is not draining, calls to `write()` will buffer `chunk`, and
670return false. Once all currently buffered chunks are drained (accepted for
671delivery by the operating system), the `'drain'` event will be emitted.
672It is recommended that once `write()` returns false, no more chunks be written
673until the `'drain'` event is emitted. While calling `write()` on a stream that
674is not draining is allowed, Node.js will buffer all written chunks until
675maximum memory usage occurs, at which point it will abort unconditionally.
676Even before it aborts, high memory usage will cause poor garbage collector
677performance and high RSS (which is not typically released back to the system,
678even after the memory is no longer required). Since TCP sockets may never
679drain if the remote peer does not read the data, writing a socket that is
680not draining may lead to a remotely exploitable vulnerability.
681
682Writing data while the stream is not draining is particularly
683problematic for a [`Transform`][], because the `Transform` streams are paused
684by default until they are piped or a `'data'` or `'readable'` event handler
685is added.
686
687If the data to be written can be generated or fetched on demand, it is
688recommended to encapsulate the logic into a [`Readable`][] and use
689[`stream.pipe()`][]. However, if calling `write()` is preferred, it is
690possible to respect backpressure and avoid memory issues using the
691[`'drain'`][] event:
692
693```js
694function write(data, cb) {
695  if (!stream.write(data)) {
696    stream.once('drain', cb);
697  } else {
698    process.nextTick(cb);
699  }
700}
701
702// Wait for cb to be called before doing any other write.
703write('hello', () => {
704  console.log('Write completed, do more writes now.');
705});
706```
707
708A `Writable` stream in object mode will always ignore the `encoding` argument.
709
710### Readable streams
711
712Readable streams are an abstraction for a *source* from which data is
713consumed.
714
715Examples of `Readable` streams include:
716
717* [HTTP responses, on the client][http-incoming-message]
718* [HTTP requests, on the server][http-incoming-message]
719* [fs read streams][]
720* [zlib streams][zlib]
721* [crypto streams][crypto]
722* [TCP sockets][]
723* [child process stdout and stderr][]
724* [`process.stdin`][]
725
726All [`Readable`][] streams implement the interface defined by the
727`stream.Readable` class.
728
729#### Two reading modes
730
731`Readable` streams effectively operate in one of two modes: flowing and
732paused. These modes are separate from [object mode][object-mode].
733A [`Readable`][] stream can be in object mode or not, regardless of whether
734it is in flowing mode or paused mode.
735
736* In flowing mode, data is read from the underlying system automatically
737  and provided to an application as quickly as possible using events via the
738  [`EventEmitter`][] interface.
739
740* In paused mode, the [`stream.read()`][stream-read] method must be called
741  explicitly to read chunks of data from the stream.
742
743All [`Readable`][] streams begin in paused mode but can be switched to flowing
744mode in one of the following ways:
745
746* Adding a [`'data'`][] event handler.
747* Calling the [`stream.resume()`][stream-resume] method.
748* Calling the [`stream.pipe()`][] method to send the data to a [`Writable`][].
749
750The `Readable` can switch back to paused mode using one of the following:
751
752* If there are no pipe destinations, by calling the
753  [`stream.pause()`][stream-pause] method.
754* If there are pipe destinations, by removing all pipe destinations.
755  Multiple pipe destinations may be removed by calling the
756  [`stream.unpipe()`][] method.
757
758The important concept to remember is that a `Readable` will not generate data
759until a mechanism for either consuming or ignoring that data is provided. If
760the consuming mechanism is disabled or taken away, the `Readable` will *attempt*
761to stop generating the data.
762
763For backward compatibility reasons, removing [`'data'`][] event handlers will
764**not** automatically pause the stream. Also, if there are piped destinations,
765then calling [`stream.pause()`][stream-pause] will not guarantee that the
766stream will *remain* paused once those destinations drain and ask for more data.
767
768If a [`Readable`][] is switched into flowing mode and there are no consumers
769available to handle the data, that data will be lost. This can occur, for
770instance, when the `readable.resume()` method is called without a listener
771attached to the `'data'` event, or when a `'data'` event handler is removed
772from the stream.
773
774Adding a [`'readable'`][] event handler automatically makes the stream
775stop flowing, and the data has to be consumed via
776[`readable.read()`][stream-read]. If the [`'readable'`][] event handler is
777removed, then the stream will start flowing again if there is a
778[`'data'`][] event handler.
779
780#### Three states
781
782The "two modes" of operation for a `Readable` stream are a simplified
783abstraction for the more complicated internal state management that is happening
784within the `Readable` stream implementation.
785
786Specifically, at any given point in time, every `Readable` is in one of three
787possible states:
788
789* `readable.readableFlowing === null`
790* `readable.readableFlowing === false`
791* `readable.readableFlowing === true`
792
793When `readable.readableFlowing` is `null`, no mechanism for consuming the
794stream's data is provided. Therefore, the stream will not generate data.
795While in this state, attaching a listener for the `'data'` event, calling the
796`readable.pipe()` method, or calling the `readable.resume()` method will switch
797`readable.readableFlowing` to `true`, causing the `Readable` to begin actively
798emitting events as data is generated.
799
800Calling `readable.pause()`, `readable.unpipe()`, or receiving backpressure
801will cause the `readable.readableFlowing` to be set as `false`,
802temporarily halting the flowing of events but *not* halting the generation of
803data. While in this state, attaching a listener for the `'data'` event
804will not switch `readable.readableFlowing` to `true`.
805
806```js
807const { PassThrough, Writable } = require('stream');
808const pass = new PassThrough();
809const writable = new Writable();
810
811pass.pipe(writable);
812pass.unpipe(writable);
813// readableFlowing is now false.
814
815pass.on('data', (chunk) => { console.log(chunk.toString()); });
816pass.write('ok');  // Will not emit 'data'.
817pass.resume();     // Must be called to make stream emit 'data'.
818```
819
820While `readable.readableFlowing` is `false`, data may be accumulating
821within the stream's internal buffer.
822
823#### Choose one API style
824
825The `Readable` stream API evolved across multiple Node.js versions and provides
826multiple methods of consuming stream data. In general, developers should choose
827*one* of the methods of consuming data and *should never* use multiple methods
828to consume data from a single stream. Specifically, using a combination
829of `on('data')`, `on('readable')`, `pipe()`, or async iterators could
830lead to unintuitive behavior.
831
832Use of the `readable.pipe()` method is recommended for most users as it has been
833implemented to provide the easiest way of consuming stream data. Developers that
834require more fine-grained control over the transfer and generation of data can
835use the [`EventEmitter`][] and `readable.on('readable')`/`readable.read()`
836or the `readable.pause()`/`readable.resume()` APIs.
837
838#### Class: `stream.Readable`
839<!-- YAML
840added: v0.9.4
841-->
842
843<!--type=class-->
844
845##### Event: `'close'`
846<!-- YAML
847added: v0.9.4
848changes:
849  - version: v10.0.0
850    pr-url: https://github.com/nodejs/node/pull/18438
851    description: Add `emitClose` option to specify if `'close'` is emitted on
852                 destroy.
853-->
854
855The `'close'` event is emitted when the stream and any of its underlying
856resources (a file descriptor, for example) have been closed. The event indicates
857that no more events will be emitted, and no further computation will occur.
858
859A [`Readable`][] stream will always emit the `'close'` event if it is
860created with the `emitClose` option.
861
862##### Event: `'data'`
863<!-- YAML
864added: v0.9.4
865-->
866
867* `chunk` {Buffer|string|any} The chunk of data. For streams that are not
868  operating in object mode, the chunk will be either a string or `Buffer`.
869  For streams that are in object mode, the chunk can be any JavaScript value
870  other than `null`.
871
872The `'data'` event is emitted whenever the stream is relinquishing ownership of
873a chunk of data to a consumer. This may occur whenever the stream is switched
874in flowing mode by calling `readable.pipe()`, `readable.resume()`, or by
875attaching a listener callback to the `'data'` event. The `'data'` event will
876also be emitted whenever the `readable.read()` method is called and a chunk of
877data is available to be returned.
878
879Attaching a `'data'` event listener to a stream that has not been explicitly
880paused will switch the stream into flowing mode. Data will then be passed as
881soon as it is available.
882
883The listener callback will be passed the chunk of data as a string if a default
884encoding has been specified for the stream using the
885`readable.setEncoding()` method; otherwise the data will be passed as a
886`Buffer`.
887
888```js
889const readable = getReadableStreamSomehow();
890readable.on('data', (chunk) => {
891  console.log(`Received ${chunk.length} bytes of data.`);
892});
893```
894
895##### Event: `'end'`
896<!-- YAML
897added: v0.9.4
898-->
899
900The `'end'` event is emitted when there is no more data to be consumed from
901the stream.
902
903The `'end'` event **will not be emitted** unless the data is completely
904consumed. This can be accomplished by switching the stream into flowing mode,
905or by calling [`stream.read()`][stream-read] repeatedly until all data has been
906consumed.
907
908```js
909const readable = getReadableStreamSomehow();
910readable.on('data', (chunk) => {
911  console.log(`Received ${chunk.length} bytes of data.`);
912});
913readable.on('end', () => {
914  console.log('There will be no more data.');
915});
916```
917
918##### Event: `'error'`
919<!-- YAML
920added: v0.9.4
921-->
922
923* {Error}
924
925The `'error'` event may be emitted by a `Readable` implementation at any time.
926Typically, this may occur if the underlying stream is unable to generate data
927due to an underlying internal failure, or when a stream implementation attempts
928to push an invalid chunk of data.
929
930The listener callback will be passed a single `Error` object.
931
932##### Event: `'pause'`
933<!-- YAML
934added: v0.9.4
935-->
936
937The `'pause'` event is emitted when [`stream.pause()`][stream-pause] is called
938and `readableFlowing` is not `false`.
939
940##### Event: `'readable'`
941<!-- YAML
942added: v0.9.4
943changes:
944  - version: v10.0.0
945    pr-url: https://github.com/nodejs/node/pull/17979
946    description: The `'readable'` is always emitted in the next tick after
947                 `.push()` is called.
948  - version: v10.0.0
949    pr-url: https://github.com/nodejs/node/pull/18994
950    description: Using `'readable'` requires calling `.read()`.
951-->
952
953The `'readable'` event is emitted when there is data available to be read from
954the stream. In some cases, attaching a listener for the `'readable'` event will
955cause some amount of data to be read into an internal buffer.
956
957```js
958const readable = getReadableStreamSomehow();
959readable.on('readable', function() {
960  // There is some data to read now.
961  let data;
962
963  while (data = this.read()) {
964    console.log(data);
965  }
966});
967```
968
969The `'readable'` event will also be emitted once the end of the stream data
970has been reached but before the `'end'` event is emitted.
971
972Effectively, the `'readable'` event indicates that the stream has new
973information: either new data is available or the end of the stream has been
974reached. In the former case, [`stream.read()`][stream-read] will return the
975available data. In the latter case, [`stream.read()`][stream-read] will return
976`null`. For instance, in the following example, `foo.txt` is an empty file:
977
978```js
979const fs = require('fs');
980const rr = fs.createReadStream('foo.txt');
981rr.on('readable', () => {
982  console.log(`readable: ${rr.read()}`);
983});
984rr.on('end', () => {
985  console.log('end');
986});
987```
988
989The output of running this script is:
990
991```console
992$ node test.js
993readable: null
994end
995```
996
997In general, the `readable.pipe()` and `'data'` event mechanisms are easier to
998understand than the `'readable'` event. However, handling `'readable'` might
999result in increased throughput.
1000
1001If both `'readable'` and [`'data'`][] are used at the same time, `'readable'`
1002takes precedence in controlling the flow, i.e. `'data'` will be emitted
1003only when [`stream.read()`][stream-read] is called. The
1004`readableFlowing` property would become `false`.
1005If there are `'data'` listeners when `'readable'` is removed, the stream
1006will start flowing, i.e. `'data'` events will be emitted without calling
1007`.resume()`.
1008
1009##### Event: `'resume'`
1010<!-- YAML
1011added: v0.9.4
1012-->
1013
1014The `'resume'` event is emitted when [`stream.resume()`][stream-resume] is
1015called and `readableFlowing` is not `true`.
1016
1017##### `readable.destroy([error])`
1018<!-- YAML
1019added: v8.0.0
1020changes:
1021  - version: v14.0.0
1022    pr-url: https://github.com/nodejs/node/pull/29197
1023    description: Work as a no-op on a stream that has already been destroyed.
1024-->
1025
1026* `error` {Error} Error which will be passed as payload in `'error'` event
1027* Returns: {this}
1028
1029Destroy the stream. Optionally emit an `'error'` event, and emit a `'close'`
1030event (unless `emitClose` is set to `false`). After this call, the readable
1031stream will release any internal resources and subsequent calls to `push()`
1032will be ignored.
1033
1034Once `destroy()` has been called any further calls will be a no-op and no
1035further errors except from `_destroy()` may be emitted as `'error'`.
1036
1037Implementors should not override this method, but instead implement
1038[`readable._destroy()`][readable-_destroy].
1039
1040##### `readable.destroyed`
1041<!-- YAML
1042added: v8.0.0
1043-->
1044
1045* {boolean}
1046
1047Is `true` after [`readable.destroy()`][readable-destroy] has been called.
1048
1049##### `readable.isPaused()`
1050<!-- YAML
1051added: v0.11.14
1052-->
1053
1054* Returns: {boolean}
1055
1056The `readable.isPaused()` method returns the current operating state of the
1057`Readable`. This is used primarily by the mechanism that underlies the
1058`readable.pipe()` method. In most typical cases, there will be no reason to
1059use this method directly.
1060
1061```js
1062const readable = new stream.Readable();
1063
1064readable.isPaused(); // === false
1065readable.pause();
1066readable.isPaused(); // === true
1067readable.resume();
1068readable.isPaused(); // === false
1069```
1070
1071##### `readable.pause()`
1072<!-- YAML
1073added: v0.9.4
1074-->
1075
1076* Returns: {this}
1077
1078The `readable.pause()` method will cause a stream in flowing mode to stop
1079emitting [`'data'`][] events, switching out of flowing mode. Any data that
1080becomes available will remain in the internal buffer.
1081
1082```js
1083const readable = getReadableStreamSomehow();
1084readable.on('data', (chunk) => {
1085  console.log(`Received ${chunk.length} bytes of data.`);
1086  readable.pause();
1087  console.log('There will be no additional data for 1 second.');
1088  setTimeout(() => {
1089    console.log('Now data will start flowing again.');
1090    readable.resume();
1091  }, 1000);
1092});
1093```
1094
1095The `readable.pause()` method has no effect if there is a `'readable'`
1096event listener.
1097
1098##### `readable.pipe(destination[, options])`
1099<!-- YAML
1100added: v0.9.4
1101-->
1102
1103* `destination` {stream.Writable} The destination for writing data
1104* `options` {Object} Pipe options
1105  * `end` {boolean} End the writer when the reader ends. **Default:** `true`.
1106* Returns: {stream.Writable} The *destination*, allowing for a chain of pipes if
1107  it is a [`Duplex`][] or a [`Transform`][] stream
1108
1109The `readable.pipe()` method attaches a [`Writable`][] stream to the `readable`,
1110causing it to switch automatically into flowing mode and push all of its data
1111to the attached [`Writable`][]. The flow of data will be automatically managed
1112so that the destination `Writable` stream is not overwhelmed by a faster
1113`Readable` stream.
1114
1115The following example pipes all of the data from the `readable` into a file
1116named `file.txt`:
1117
1118```js
1119const fs = require('fs');
1120const readable = getReadableStreamSomehow();
1121const writable = fs.createWriteStream('file.txt');
1122// All the data from readable goes into 'file.txt'.
1123readable.pipe(writable);
1124```
1125
1126It is possible to attach multiple `Writable` streams to a single `Readable`
1127stream.
1128
1129The `readable.pipe()` method returns a reference to the *destination* stream
1130making it possible to set up chains of piped streams:
1131
1132```js
1133const fs = require('fs');
1134const r = fs.createReadStream('file.txt');
1135const z = zlib.createGzip();
1136const w = fs.createWriteStream('file.txt.gz');
1137r.pipe(z).pipe(w);
1138```
1139
1140By default, [`stream.end()`][stream-end] is called on the destination `Writable`
1141stream when the source `Readable` stream emits [`'end'`][], so that the
1142destination is no longer writable. To disable this default behavior, the `end`
1143option can be passed as `false`, causing the destination stream to remain open:
1144
1145```js
1146reader.pipe(writer, { end: false });
1147reader.on('end', () => {
1148  writer.end('Goodbye\n');
1149});
1150```
1151
1152One important caveat is that if the `Readable` stream emits an error during
1153processing, the `Writable` destination *is not closed* automatically. If an
1154error occurs, it will be necessary to *manually* close each stream in order
1155to prevent memory leaks.
1156
1157The [`process.stderr`][] and [`process.stdout`][] `Writable` streams are never
1158closed until the Node.js process exits, regardless of the specified options.
1159
1160##### `readable.read([size])`
1161<!-- YAML
1162added: v0.9.4
1163-->
1164
1165* `size` {number} Optional argument to specify how much data to read.
1166* Returns: {string|Buffer|null|any}
1167
1168The `readable.read()` method pulls some data out of the internal buffer and
1169returns it. If no data available to be read, `null` is returned. By default,
1170the data will be returned as a `Buffer` object unless an encoding has been
1171specified using the `readable.setEncoding()` method or the stream is operating
1172in object mode.
1173
1174The optional `size` argument specifies a specific number of bytes to read. If
1175`size` bytes are not available to be read, `null` will be returned *unless*
1176the stream has ended, in which case all of the data remaining in the internal
1177buffer will be returned.
1178
1179If the `size` argument is not specified, all of the data contained in the
1180internal buffer will be returned.
1181
1182The `size` argument must be less than or equal to 1 GiB.
1183
1184The `readable.read()` method should only be called on `Readable` streams
1185operating in paused mode. In flowing mode, `readable.read()` is called
1186automatically until the internal buffer is fully drained.
1187
1188```js
1189const readable = getReadableStreamSomehow();
1190
1191// 'readable' may be triggered multiple times as data is buffered in
1192readable.on('readable', () => {
1193  let chunk;
1194  console.log('Stream is readable (new data received in buffer)');
1195  // Use a loop to make sure we read all currently available data
1196  while (null !== (chunk = readable.read())) {
1197    console.log(`Read ${chunk.length} bytes of data...`);
1198  }
1199});
1200
1201// 'end' will be triggered once when there is no more data available
1202readable.on('end', () => {
1203  console.log('Reached end of stream.');
1204});
1205```
1206
1207Each call to `readable.read()` returns a chunk of data, or `null`. The chunks
1208are not concatenated. A `while` loop is necessary to consume all data
1209currently in the buffer. When reading a large file `.read()` may return `null`,
1210having consumed all buffered content so far, but there is still more data to
1211come not yet buffered. In this case a new `'readable'` event will be emitted
1212when there is more data in the buffer. Finally the `'end'` event will be
1213emitted when there is no more data to come.
1214
1215Therefore to read a file's whole contents from a `readable`, it is necessary
1216to collect chunks across multiple `'readable'` events:
1217
1218```js
1219const chunks = [];
1220
1221readable.on('readable', () => {
1222  let chunk;
1223  while (null !== (chunk = readable.read())) {
1224    chunks.push(chunk);
1225  }
1226});
1227
1228readable.on('end', () => {
1229  const content = chunks.join('');
1230});
1231```
1232
1233A `Readable` stream in object mode will always return a single item from
1234a call to [`readable.read(size)`][stream-read], regardless of the value of the
1235`size` argument.
1236
1237If the `readable.read()` method returns a chunk of data, a `'data'` event will
1238also be emitted.
1239
1240Calling [`stream.read([size])`][stream-read] after the [`'end'`][] event has
1241been emitted will return `null`. No runtime error will be raised.
1242
1243##### `readable.readable`
1244<!-- YAML
1245added: v11.4.0
1246-->
1247
1248* {boolean}
1249
1250Is `true` if it is safe to call [`readable.read()`][stream-read], which means
1251the stream has not been destroyed or emitted `'error'` or `'end'`.
1252
1253##### `readable.readableDidRead`
1254<!-- YAML
1255added: v14.18.0
1256-->
1257
1258* {boolean}
1259
1260Allows determining if the stream has been or is about to be read.
1261Returns true if `'data'`, `'end'`, `'error'` or `'close'` has been
1262emitted.
1263
1264##### `readable.readableEncoding`
1265<!-- YAML
1266added: v12.7.0
1267-->
1268
1269* {null|string}
1270
1271Getter for the property `encoding` of a given `Readable` stream. The `encoding`
1272property can be set using the [`readable.setEncoding()`][] method.
1273
1274##### `readable.readableEnded`
1275<!-- YAML
1276added: v12.9.0
1277-->
1278
1279* {boolean}
1280
1281Becomes `true` when [`'end'`][] event is emitted.
1282
1283##### `readable.readableFlowing`
1284<!-- YAML
1285added: v9.4.0
1286-->
1287
1288* {boolean}
1289
1290This property reflects the current state of a `Readable` stream as described
1291in the [Three states][] section.
1292
1293##### `readable.readableHighWaterMark`
1294<!-- YAML
1295added: v9.3.0
1296-->
1297
1298* {number}
1299
1300Returns the value of `highWaterMark` passed when constructing this
1301`Readable`.
1302
1303##### `readable.readableLength`
1304<!-- YAML
1305added: v9.4.0
1306-->
1307
1308* {number}
1309
1310This property contains the number of bytes (or objects) in the queue
1311ready to be read. The value provides introspection data regarding
1312the status of the `highWaterMark`.
1313
1314##### `readable.readableObjectMode`
1315<!-- YAML
1316added: v12.3.0
1317-->
1318
1319* {boolean}
1320
1321Getter for the property `objectMode` of a given `Readable` stream.
1322
1323##### `readable.resume()`
1324<!-- YAML
1325added: v0.9.4
1326changes:
1327  - version: v10.0.0
1328    pr-url: https://github.com/nodejs/node/pull/18994
1329    description: The `resume()` has no effect if there is a `'readable'` event
1330                 listening.
1331-->
1332
1333* Returns: {this}
1334
1335The `readable.resume()` method causes an explicitly paused `Readable` stream to
1336resume emitting [`'data'`][] events, switching the stream into flowing mode.
1337
1338The `readable.resume()` method can be used to fully consume the data from a
1339stream without actually processing any of that data:
1340
1341```js
1342getReadableStreamSomehow()
1343  .resume()
1344  .on('end', () => {
1345    console.log('Reached the end, but did not read anything.');
1346  });
1347```
1348
1349The `readable.resume()` method has no effect if there is a `'readable'`
1350event listener.
1351
1352##### `readable.setEncoding(encoding)`
1353<!-- YAML
1354added: v0.9.4
1355-->
1356
1357* `encoding` {string} The encoding to use.
1358* Returns: {this}
1359
1360The `readable.setEncoding()` method sets the character encoding for
1361data read from the `Readable` stream.
1362
1363By default, no encoding is assigned and stream data will be returned as
1364`Buffer` objects. Setting an encoding causes the stream data
1365to be returned as strings of the specified encoding rather than as `Buffer`
1366objects. For instance, calling `readable.setEncoding('utf8')` will cause the
1367output data to be interpreted as UTF-8 data, and passed as strings. Calling
1368`readable.setEncoding('hex')` will cause the data to be encoded in hexadecimal
1369string format.
1370
1371The `Readable` stream will properly handle multi-byte characters delivered
1372through the stream that would otherwise become improperly decoded if simply
1373pulled from the stream as `Buffer` objects.
1374
1375```js
1376const readable = getReadableStreamSomehow();
1377readable.setEncoding('utf8');
1378readable.on('data', (chunk) => {
1379  assert.equal(typeof chunk, 'string');
1380  console.log('Got %d characters of string data:', chunk.length);
1381});
1382```
1383
1384##### `readable.unpipe([destination])`
1385<!-- YAML
1386added: v0.9.4
1387-->
1388
1389* `destination` {stream.Writable} Optional specific stream to unpipe
1390* Returns: {this}
1391
1392The `readable.unpipe()` method detaches a `Writable` stream previously attached
1393using the [`stream.pipe()`][] method.
1394
1395If the `destination` is not specified, then *all* pipes are detached.
1396
1397If the `destination` is specified, but no pipe is set up for it, then
1398the method does nothing.
1399
1400```js
1401const fs = require('fs');
1402const readable = getReadableStreamSomehow();
1403const writable = fs.createWriteStream('file.txt');
1404// All the data from readable goes into 'file.txt',
1405// but only for the first second.
1406readable.pipe(writable);
1407setTimeout(() => {
1408  console.log('Stop writing to file.txt.');
1409  readable.unpipe(writable);
1410  console.log('Manually close the file stream.');
1411  writable.end();
1412}, 1000);
1413```
1414
1415##### `readable.unshift(chunk[, encoding])`
1416<!-- YAML
1417added: v0.9.11
1418changes:
1419  - version: v8.0.0
1420    pr-url: https://github.com/nodejs/node/pull/11608
1421    description: The `chunk` argument can now be a `Uint8Array` instance.
1422-->
1423
1424* `chunk` {Buffer|Uint8Array|string|null|any} Chunk of data to unshift onto the
1425  read queue. For streams not operating in object mode, `chunk` must be a
1426  string, `Buffer`, `Uint8Array` or `null`. For object mode streams, `chunk`
1427  may be any JavaScript value.
1428* `encoding` {string} Encoding of string chunks. Must be a valid
1429  `Buffer` encoding, such as `'utf8'` or `'ascii'`.
1430
1431Passing `chunk` as `null` signals the end of the stream (EOF) and behaves the
1432same as `readable.push(null)`, after which no more data can be written. The EOF
1433signal is put at the end of the buffer and any buffered data will still be
1434flushed.
1435
1436The `readable.unshift()` method pushes a chunk of data back into the internal
1437buffer. This is useful in certain situations where a stream is being consumed by
1438code that needs to "un-consume" some amount of data that it has optimistically
1439pulled out of the source, so that the data can be passed on to some other party.
1440
1441The `stream.unshift(chunk)` method cannot be called after the [`'end'`][] event
1442has been emitted or a runtime error will be thrown.
1443
1444Developers using `stream.unshift()` often should consider switching to
1445use of a [`Transform`][] stream instead. See the [API for stream implementers][]
1446section for more information.
1447
1448```js
1449// Pull off a header delimited by \n\n.
1450// Use unshift() if we get too much.
1451// Call the callback with (error, header, stream).
1452const { StringDecoder } = require('string_decoder');
1453function parseHeader(stream, callback) {
1454  stream.on('error', callback);
1455  stream.on('readable', onReadable);
1456  const decoder = new StringDecoder('utf8');
1457  let header = '';
1458  function onReadable() {
1459    let chunk;
1460    while (null !== (chunk = stream.read())) {
1461      const str = decoder.write(chunk);
1462      if (str.match(/\n\n/)) {
1463        // Found the header boundary.
1464        const split = str.split(/\n\n/);
1465        header += split.shift();
1466        const remaining = split.join('\n\n');
1467        const buf = Buffer.from(remaining, 'utf8');
1468        stream.removeListener('error', callback);
1469        // Remove the 'readable' listener before unshifting.
1470        stream.removeListener('readable', onReadable);
1471        if (buf.length)
1472          stream.unshift(buf);
1473        // Now the body of the message can be read from the stream.
1474        callback(null, header, stream);
1475      } else {
1476        // Still reading the header.
1477        header += str;
1478      }
1479    }
1480  }
1481}
1482```
1483
1484Unlike [`stream.push(chunk)`][stream-push], `stream.unshift(chunk)` will not
1485end the reading process by resetting the internal reading state of the stream.
1486This can cause unexpected results if `readable.unshift()` is called during a
1487read (i.e. from within a [`stream._read()`][stream-_read] implementation on a
1488custom stream). Following the call to `readable.unshift()` with an immediate
1489[`stream.push('')`][stream-push] will reset the reading state appropriately,
1490however it is best to simply avoid calling `readable.unshift()` while in the
1491process of performing a read.
1492
1493##### `readable.wrap(stream)`
1494<!-- YAML
1495added: v0.9.4
1496-->
1497
1498* `stream` {Stream} An "old style" readable stream
1499* Returns: {this}
1500
1501Prior to Node.js 0.10, streams did not implement the entire `stream` module API
1502as it is currently defined. (See [Compatibility][] for more information.)
1503
1504When using an older Node.js library that emits [`'data'`][] events and has a
1505[`stream.pause()`][stream-pause] method that is advisory only, the
1506`readable.wrap()` method can be used to create a [`Readable`][] stream that uses
1507the old stream as its data source.
1508
1509It will rarely be necessary to use `readable.wrap()` but the method has been
1510provided as a convenience for interacting with older Node.js applications and
1511libraries.
1512
1513```js
1514const { OldReader } = require('./old-api-module.js');
1515const { Readable } = require('stream');
1516const oreader = new OldReader();
1517const myReader = new Readable().wrap(oreader);
1518
1519myReader.on('readable', () => {
1520  myReader.read(); // etc.
1521});
1522```
1523
1524##### `readable[Symbol.asyncIterator]()`
1525<!-- YAML
1526added: v10.0.0
1527changes:
1528  - version: v11.14.0
1529    pr-url: https://github.com/nodejs/node/pull/26989
1530    description: Symbol.asyncIterator support is no longer experimental.
1531-->
1532
1533* Returns: {AsyncIterator} to fully consume the stream.
1534
1535```js
1536const fs = require('fs');
1537
1538async function print(readable) {
1539  readable.setEncoding('utf8');
1540  let data = '';
1541  for await (const chunk of readable) {
1542    data += chunk;
1543  }
1544  console.log(data);
1545}
1546
1547print(fs.createReadStream('file')).catch(console.error);
1548```
1549
1550If the loop terminates with a `break` or a `throw`, the stream will be
1551destroyed. In other terms, iterating over a stream will consume the stream
1552fully. The stream will be read in chunks of size equal to the `highWaterMark`
1553option. In the code example above, data will be in a single chunk if the file
1554has less then 64KB of data because no `highWaterMark` option is provided to
1555[`fs.createReadStream()`][].
1556
1557### Duplex and transform streams
1558
1559#### Class: `stream.Duplex`
1560<!-- YAML
1561added: v0.9.4
1562changes:
1563  - version: v6.8.0
1564    pr-url: https://github.com/nodejs/node/pull/8834
1565    description: Instances of `Duplex` now return `true` when
1566                 checking `instanceof stream.Writable`.
1567-->
1568
1569<!--type=class-->
1570
1571Duplex streams are streams that implement both the [`Readable`][] and
1572[`Writable`][] interfaces.
1573
1574Examples of `Duplex` streams include:
1575
1576* [TCP sockets][]
1577* [zlib streams][zlib]
1578* [crypto streams][crypto]
1579
1580#### Class: `stream.Transform`
1581<!-- YAML
1582added: v0.9.4
1583-->
1584
1585<!--type=class-->
1586
1587Transform streams are [`Duplex`][] streams where the output is in some way
1588related to the input. Like all [`Duplex`][] streams, `Transform` streams
1589implement both the [`Readable`][] and [`Writable`][] interfaces.
1590
1591Examples of `Transform` streams include:
1592
1593* [zlib streams][zlib]
1594* [crypto streams][crypto]
1595
1596##### `transform.destroy([error])`
1597<!-- YAML
1598added: v8.0.0
1599changes:
1600  - version: v14.0.0
1601    pr-url: https://github.com/nodejs/node/pull/29197
1602    description: Work as a no-op on a stream that has already been destroyed.
1603-->
1604
1605* `error` {Error}
1606* Returns: {this}
1607
1608Destroy the stream, and optionally emit an `'error'` event. After this call, the
1609transform stream would release any internal resources.
1610Implementors should not override this method, but instead implement
1611[`readable._destroy()`][readable-_destroy].
1612The default implementation of `_destroy()` for `Transform` also emit `'close'`
1613unless `emitClose` is set in false.
1614
1615Once `destroy()` has been called, any further calls will be a no-op and no
1616further errors except from `_destroy()` may be emitted as `'error'`.
1617
1618### `stream.finished(stream[, options], callback)`
1619<!-- YAML
1620added: v10.0.0
1621changes:
1622  - version: v14.0.0
1623    pr-url: https://github.com/nodejs/node/pull/32158
1624    description: The `finished(stream, cb)` will wait for the `'close'` event
1625                 before invoking the callback. The implementation tries to
1626                 detect legacy streams and only apply this behavior to streams
1627                 which are expected to emit `'close'`.
1628  - version: v14.0.0
1629    pr-url: https://github.com/nodejs/node/pull/31545
1630    description: Emitting `'close'` before `'end'` on a `Readable` stream
1631                 will cause an `ERR_STREAM_PREMATURE_CLOSE` error.
1632  - version: v14.0.0
1633    pr-url: https://github.com/nodejs/node/pull/31509
1634    description: Callback will be invoked on streams which have already
1635                 finished before the call to `finished(stream, cb)`.
1636-->
1637
1638* `stream` {Stream} A readable and/or writable stream.
1639* `options` {Object}
1640  * `error` {boolean} If set to `false`, then a call to `emit('error', err)` is
1641    not treated as finished. **Default**: `true`.
1642  * `readable` {boolean} When set to `false`, the callback will be called when
1643    the stream ends even though the stream might still be readable.
1644    **Default**: `true`.
1645  * `writable` {boolean} When set to `false`, the callback will be called when
1646    the stream ends even though the stream might still be writable.
1647    **Default**: `true`.
1648* `callback` {Function} A callback function that takes an optional error
1649  argument.
1650* Returns: {Function} A cleanup function which removes all registered
1651  listeners.
1652
1653A function to get notified when a stream is no longer readable, writable
1654or has experienced an error or a premature close event.
1655
1656```js
1657const { finished } = require('stream');
1658
1659const rs = fs.createReadStream('archive.tar');
1660
1661finished(rs, (err) => {
1662  if (err) {
1663    console.error('Stream failed.', err);
1664  } else {
1665    console.log('Stream is done reading.');
1666  }
1667});
1668
1669rs.resume(); // Drain the stream.
1670```
1671
1672Especially useful in error handling scenarios where a stream is destroyed
1673prematurely (like an aborted HTTP request), and will not emit `'end'`
1674or `'finish'`.
1675
1676The `finished` API is promisify-able as well;
1677
1678```js
1679const finished = util.promisify(stream.finished);
1680
1681const rs = fs.createReadStream('archive.tar');
1682
1683async function run() {
1684  await finished(rs);
1685  console.log('Stream is done reading.');
1686}
1687
1688run().catch(console.error);
1689rs.resume(); // Drain the stream.
1690```
1691
1692`stream.finished()` leaves dangling event listeners (in particular
1693`'error'`, `'end'`, `'finish'` and `'close'`) after `callback` has been
1694invoked. The reason for this is so that unexpected `'error'` events (due to
1695incorrect stream implementations) do not cause unexpected crashes.
1696If this is unwanted behavior then the returned cleanup function needs to be
1697invoked in the callback:
1698
1699```js
1700const cleanup = finished(rs, (err) => {
1701  cleanup();
1702  // ...
1703});
1704```
1705
1706### `stream.pipeline(source[, ...transforms], destination, callback)`
1707### `stream.pipeline(streams, callback)`
1708<!-- YAML
1709added: v10.0.0
1710changes:
1711  - version: v14.0.0
1712    pr-url: https://github.com/nodejs/node/pull/32158
1713    description: The `pipeline(..., cb)` will wait for the `'close'` event
1714                 before invoking the callback. The implementation tries to
1715                 detect legacy streams and only apply this behavior to streams
1716                 which are expected to emit `'close'`.
1717  - version: v13.10.0
1718    pr-url: https://github.com/nodejs/node/pull/31223
1719    description: Add support for async generators.
1720-->
1721
1722* `streams` {Stream[]|Iterable[]|AsyncIterable[]|Function[]}
1723* `source` {Stream|Iterable|AsyncIterable|Function}
1724  * Returns: {Iterable|AsyncIterable}
1725* `...transforms` {Stream|Function}
1726  * `source` {AsyncIterable}
1727  * Returns: {AsyncIterable}
1728* `destination` {Stream|Function}
1729  * `source` {AsyncIterable}
1730  * Returns: {AsyncIterable|Promise}
1731* `callback` {Function} Called when the pipeline is fully done.
1732  * `err` {Error}
1733  * `val` Resolved value of `Promise` returned by `destination`.
1734* Returns: {Stream}
1735
1736A module method to pipe between streams and generators forwarding errors and
1737properly cleaning up and provide a callback when the pipeline is complete.
1738
1739```js
1740const { pipeline } = require('stream');
1741const fs = require('fs');
1742const zlib = require('zlib');
1743
1744// Use the pipeline API to easily pipe a series of streams
1745// together and get notified when the pipeline is fully done.
1746
1747// A pipeline to gzip a potentially huge tar file efficiently:
1748
1749pipeline(
1750  fs.createReadStream('archive.tar'),
1751  zlib.createGzip(),
1752  fs.createWriteStream('archive.tar.gz'),
1753  (err) => {
1754    if (err) {
1755      console.error('Pipeline failed.', err);
1756    } else {
1757      console.log('Pipeline succeeded.');
1758    }
1759  }
1760);
1761```
1762
1763The `pipeline` API is promisify-able as well:
1764
1765```js
1766const pipeline = util.promisify(stream.pipeline);
1767
1768async function run() {
1769  await pipeline(
1770    fs.createReadStream('archive.tar'),
1771    zlib.createGzip(),
1772    fs.createWriteStream('archive.tar.gz')
1773  );
1774  console.log('Pipeline succeeded.');
1775}
1776
1777run().catch(console.error);
1778```
1779
1780The `pipeline` API also supports async generators:
1781
1782```js
1783const pipeline = util.promisify(stream.pipeline);
1784const fs = require('fs');
1785
1786async function run() {
1787  await pipeline(
1788    fs.createReadStream('lowercase.txt'),
1789    async function* (source) {
1790      source.setEncoding('utf8');  // Work with strings rather than `Buffer`s.
1791      for await (const chunk of source) {
1792        yield chunk.toUpperCase();
1793      }
1794    },
1795    fs.createWriteStream('uppercase.txt')
1796  );
1797  console.log('Pipeline succeeded.');
1798}
1799
1800run().catch(console.error);
1801```
1802
1803`stream.pipeline()` will call `stream.destroy(err)` on all streams except:
1804* `Readable` streams which have emitted `'end'` or `'close'`.
1805* `Writable` streams which have emitted `'finish'` or `'close'`.
1806
1807`stream.pipeline()` leaves dangling event listeners on the streams
1808after the `callback` has been invoked. In the case of reuse of streams after
1809failure, this can cause event listener leaks and swallowed errors.
1810
1811### `stream.Readable.from(iterable, [options])`
1812<!-- YAML
1813added:
1814  - v12.3.0
1815  - v10.17.0
1816-->
1817
1818* `iterable` {Iterable} Object implementing the `Symbol.asyncIterator` or
1819  `Symbol.iterator` iterable protocol. Emits an 'error' event if a null
1820   value is passed.
1821* `options` {Object} Options provided to `new stream.Readable([options])`.
1822  By default, `Readable.from()` will set `options.objectMode` to `true`, unless
1823  this is explicitly opted out by setting `options.objectMode` to `false`.
1824* Returns: {stream.Readable}
1825
1826A utility method for creating readable streams out of iterators.
1827
1828```js
1829const { Readable } = require('stream');
1830
1831async function * generate() {
1832  yield 'hello';
1833  yield 'streams';
1834}
1835
1836const readable = Readable.from(generate());
1837
1838readable.on('data', (chunk) => {
1839  console.log(chunk);
1840});
1841```
1842
1843Calling `Readable.from(string)` or `Readable.from(buffer)` will not have
1844the strings or buffers be iterated to match the other streams semantics
1845for performance reasons.
1846
1847## API for stream implementers
1848
1849<!--type=misc-->
1850
1851The `stream` module API has been designed to make it possible to easily
1852implement streams using JavaScript's prototypal inheritance model.
1853
1854First, a stream developer would declare a new JavaScript class that extends one
1855of the four basic stream classes (`stream.Writable`, `stream.Readable`,
1856`stream.Duplex`, or `stream.Transform`), making sure they call the appropriate
1857parent class constructor:
1858
1859<!-- eslint-disable no-useless-constructor -->
1860```js
1861const { Writable } = require('stream');
1862
1863class MyWritable extends Writable {
1864  constructor({ highWaterMark, ...options }) {
1865    super({ highWaterMark });
1866    // ...
1867  }
1868}
1869```
1870
1871When extending streams, keep in mind what options the user
1872can and should provide before forwarding these to the base constructor. For
1873example, if the implementation makes assumptions in regard to the
1874`autoDestroy` and `emitClose` options, do not allow the
1875user to override these. Be explicit about what
1876options are forwarded instead of implicitly forwarding all options.
1877
1878The new stream class must then implement one or more specific methods, depending
1879on the type of stream being created, as detailed in the chart below:
1880
1881| Use-case | Class | Method(s) to implement |
1882| -------- | ----- | ---------------------- |
1883| Reading only | [`Readable`][] | [`_read()`][stream-_read] |
1884| Writing only | [`Writable`][] | [`_write()`][stream-_write], [`_writev()`][stream-_writev], [`_final()`][stream-_final] |
1885| Reading and writing | [`Duplex`][] | [`_read()`][stream-_read], [`_write()`][stream-_write], [`_writev()`][stream-_writev], [`_final()`][stream-_final] |
1886| Operate on written data, then read the result | [`Transform`][] | [`_transform()`][stream-_transform], [`_flush()`][stream-_flush], [`_final()`][stream-_final] |
1887
1888The implementation code for a stream should *never* call the "public" methods
1889of a stream that are intended for use by consumers (as described in the
1890[API for stream consumers][] section). Doing so may lead to adverse side effects
1891in application code consuming the stream.
1892
1893Avoid overriding public methods such as `write()`, `end()`, `cork()`,
1894`uncork()`, `read()` and `destroy()`, or emitting internal events such
1895as `'error'`, `'data'`, `'end'`, `'finish'` and `'close'` through `.emit()`.
1896Doing so can break current and future stream invariants leading to behavior
1897and/or compatibility issues with other streams, stream utilities, and user
1898expectations.
1899
1900### Simplified construction
1901<!-- YAML
1902added: v1.2.0
1903-->
1904
1905For many simple cases, it is possible to construct a stream without relying on
1906inheritance. This can be accomplished by directly creating instances of the
1907`stream.Writable`, `stream.Readable`, `stream.Duplex` or `stream.Transform`
1908objects and passing appropriate methods as constructor options.
1909
1910```js
1911const { Writable } = require('stream');
1912
1913const myWritable = new Writable({
1914  write(chunk, encoding, callback) {
1915    // ...
1916  }
1917});
1918```
1919
1920### Implementing a writable stream
1921
1922The `stream.Writable` class is extended to implement a [`Writable`][] stream.
1923
1924Custom `Writable` streams *must* call the `new stream.Writable([options])`
1925constructor and implement the `writable._write()` and/or `writable._writev()`
1926method.
1927
1928#### `new stream.Writable([options])`
1929<!-- YAML
1930changes:
1931  - version: v14.0.0
1932    pr-url: https://github.com/nodejs/node/pull/30623
1933    description: Change `autoDestroy` option default to `true`.
1934  - version:
1935     - v11.2.0
1936     - v10.16.0
1937    pr-url: https://github.com/nodejs/node/pull/22795
1938    description: Add `autoDestroy` option to automatically `destroy()` the
1939                 stream when it emits `'finish'` or errors.
1940  - version: v10.0.0
1941    pr-url: https://github.com/nodejs/node/pull/18438
1942    description: Add `emitClose` option to specify if `'close'` is emitted on
1943                 destroy.
1944-->
1945
1946* `options` {Object}
1947  * `highWaterMark` {number} Buffer level when
1948    [`stream.write()`][stream-write] starts returning `false`. **Default:**
1949    `16384` (16KB), or `16` for `objectMode` streams.
1950  * `decodeStrings` {boolean} Whether to encode `string`s passed to
1951    [`stream.write()`][stream-write] to `Buffer`s (with the encoding
1952    specified in the [`stream.write()`][stream-write] call) before passing
1953    them to [`stream._write()`][stream-_write]. Other types of data are not
1954    converted (i.e. `Buffer`s are not decoded into `string`s). Setting to
1955    false will prevent `string`s from being converted. **Default:** `true`.
1956  * `defaultEncoding` {string} The default encoding that is used when no
1957    encoding is specified as an argument to [`stream.write()`][stream-write].
1958    **Default:** `'utf8'`.
1959  * `objectMode` {boolean} Whether or not the
1960    [`stream.write(anyObj)`][stream-write] is a valid operation. When set,
1961    it becomes possible to write JavaScript values other than string,
1962    `Buffer` or `Uint8Array` if supported by the stream implementation.
1963    **Default:** `false`.
1964  * `emitClose` {boolean} Whether or not the stream should emit `'close'`
1965    after it has been destroyed. **Default:** `true`.
1966  * `write` {Function} Implementation for the
1967    [`stream._write()`][stream-_write] method.
1968  * `writev` {Function} Implementation for the
1969    [`stream._writev()`][stream-_writev] method.
1970  * `destroy` {Function} Implementation for the
1971    [`stream._destroy()`][writable-_destroy] method.
1972  * `final` {Function} Implementation for the
1973    [`stream._final()`][stream-_final] method.
1974  * `autoDestroy` {boolean} Whether this stream should automatically call
1975    `.destroy()` on itself after ending. **Default:** `true`.
1976
1977<!-- eslint-disable no-useless-constructor -->
1978```js
1979const { Writable } = require('stream');
1980
1981class MyWritable extends Writable {
1982  constructor(options) {
1983    // Calls the stream.Writable() constructor.
1984    super(options);
1985    // ...
1986  }
1987}
1988```
1989
1990Or, when using pre-ES6 style constructors:
1991
1992```js
1993const { Writable } = require('stream');
1994const util = require('util');
1995
1996function MyWritable(options) {
1997  if (!(this instanceof MyWritable))
1998    return new MyWritable(options);
1999  Writable.call(this, options);
2000}
2001util.inherits(MyWritable, Writable);
2002```
2003
2004Or, using the simplified constructor approach:
2005
2006```js
2007const { Writable } = require('stream');
2008
2009const myWritable = new Writable({
2010  write(chunk, encoding, callback) {
2011    // ...
2012  },
2013  writev(chunks, callback) {
2014    // ...
2015  }
2016});
2017```
2018
2019#### `writable._write(chunk, encoding, callback)`
2020<!-- YAML
2021changes:
2022  - version: v12.11.0
2023    pr-url: https://github.com/nodejs/node/pull/29639
2024    description: _write() is optional when providing _writev().
2025-->
2026
2027* `chunk` {Buffer|string|any} The `Buffer` to be written, converted from the
2028  `string` passed to [`stream.write()`][stream-write]. If the stream's
2029  `decodeStrings` option is `false` or the stream is operating in object mode,
2030  the chunk will not be converted & will be whatever was passed to
2031  [`stream.write()`][stream-write].
2032* `encoding` {string} If the chunk is a string, then `encoding` is the
2033  character encoding of that string. If chunk is a `Buffer`, or if the
2034  stream is operating in object mode, `encoding` may be ignored.
2035* `callback` {Function} Call this function (optionally with an error
2036  argument) when processing is complete for the supplied chunk.
2037
2038All `Writable` stream implementations must provide a
2039[`writable._write()`][stream-_write] and/or
2040[`writable._writev()`][stream-_writev] method to send data to the underlying
2041resource.
2042
2043[`Transform`][] streams provide their own implementation of the
2044[`writable._write()`][stream-_write].
2045
2046This function MUST NOT be called by application code directly. It should be
2047implemented by child classes, and called by the internal `Writable` class
2048methods only.
2049
2050The `callback` function must be called synchronously inside of
2051`writable._write()` or asynchronously (i.e. different tick) to signal either
2052that the write completed successfully or failed with an error.
2053The first argument passed to the `callback` must be the `Error` object if the
2054call failed or `null` if the write succeeded.
2055
2056All calls to `writable.write()` that occur between the time `writable._write()`
2057is called and the `callback` is called will cause the written data to be
2058buffered. When the `callback` is invoked, the stream might emit a [`'drain'`][]
2059event. If a stream implementation is capable of processing multiple chunks of
2060data at once, the `writable._writev()` method should be implemented.
2061
2062If the `decodeStrings` property is explicitly set to `false` in the constructor
2063options, then `chunk` will remain the same object that is passed to `.write()`,
2064and may be a string rather than a `Buffer`. This is to support implementations
2065that have an optimized handling for certain string data encodings. In that case,
2066the `encoding` argument will indicate the character encoding of the string.
2067Otherwise, the `encoding` argument can be safely ignored.
2068
2069The `writable._write()` method is prefixed with an underscore because it is
2070internal to the class that defines it, and should never be called directly by
2071user programs.
2072
2073#### `writable._writev(chunks, callback)`
2074
2075* `chunks` {Object[]} The data to be written. The value is an array of {Object}
2076  that each represent a discrete chunk of data to write. The properties of
2077  these objects are:
2078  * `chunk` {Buffer|string} A buffer instance or string containing the data to
2079    be written. The `chunk` will be a string if the `Writable` was created with
2080    the `decodeStrings` option set to `false` and a string was passed to `write()`.
2081  * `encoding` {string} The character encoding of the `chunk`. If `chunk` is
2082    a `Buffer`, the `encoding` will be `'buffer'`.
2083* `callback` {Function} A callback function (optionally with an error
2084  argument) to be invoked when processing is complete for the supplied chunks.
2085
2086This function MUST NOT be called by application code directly. It should be
2087implemented by child classes, and called by the internal `Writable` class
2088methods only.
2089
2090The `writable._writev()` method may be implemented in addition or alternatively
2091to `writable._write()` in stream implementations that are capable of processing
2092multiple chunks of data at once. If implemented and if there is buffered data
2093from previous writes, `_writev()` will be called instead of `_write()`.
2094
2095The `writable._writev()` method is prefixed with an underscore because it is
2096internal to the class that defines it, and should never be called directly by
2097user programs.
2098
2099#### `writable._destroy(err, callback)`
2100<!-- YAML
2101added: v8.0.0
2102-->
2103
2104* `err` {Error} A possible error.
2105* `callback` {Function} A callback function that takes an optional error
2106  argument.
2107
2108The `_destroy()` method is called by [`writable.destroy()`][writable-destroy].
2109It can be overridden by child classes but it **must not** be called directly.
2110
2111#### `writable._final(callback)`
2112<!-- YAML
2113added: v8.0.0
2114-->
2115
2116* `callback` {Function} Call this function (optionally with an error
2117  argument) when finished writing any remaining data.
2118
2119The `_final()` method **must not** be called directly. It may be implemented
2120by child classes, and if so, will be called by the internal `Writable`
2121class methods only.
2122
2123This optional function will be called before the stream closes, delaying the
2124`'finish'` event until `callback` is called. This is useful to close resources
2125or write buffered data before a stream ends.
2126
2127#### Errors while writing
2128
2129Errors occurring during the processing of the [`writable._write()`][],
2130[`writable._writev()`][] and [`writable._final()`][] methods must be propagated
2131by invoking the callback and passing the error as the first argument.
2132Throwing an `Error` from within these methods or manually emitting an `'error'`
2133event results in undefined behavior.
2134
2135If a `Readable` stream pipes into a `Writable` stream when `Writable` emits an
2136error, the `Readable` stream will be unpiped.
2137
2138```js
2139const { Writable } = require('stream');
2140
2141const myWritable = new Writable({
2142  write(chunk, encoding, callback) {
2143    if (chunk.toString().indexOf('a') >= 0) {
2144      callback(new Error('chunk is invalid'));
2145    } else {
2146      callback();
2147    }
2148  }
2149});
2150```
2151
2152#### An example writable stream
2153
2154The following illustrates a rather simplistic (and somewhat pointless) custom
2155`Writable` stream implementation. While this specific `Writable` stream instance
2156is not of any real particular usefulness, the example illustrates each of the
2157required elements of a custom [`Writable`][] stream instance:
2158
2159```js
2160const { Writable } = require('stream');
2161
2162class MyWritable extends Writable {
2163  _write(chunk, encoding, callback) {
2164    if (chunk.toString().indexOf('a') >= 0) {
2165      callback(new Error('chunk is invalid'));
2166    } else {
2167      callback();
2168    }
2169  }
2170}
2171```
2172
2173#### Decoding buffers in a writable stream
2174
2175Decoding buffers is a common task, for instance, when using transformers whose
2176input is a string. This is not a trivial process when using multi-byte
2177characters encoding, such as UTF-8. The following example shows how to decode
2178multi-byte strings using `StringDecoder` and [`Writable`][].
2179
2180```js
2181const { Writable } = require('stream');
2182const { StringDecoder } = require('string_decoder');
2183
2184class StringWritable extends Writable {
2185  constructor(options) {
2186    super(options);
2187    this._decoder = new StringDecoder(options && options.defaultEncoding);
2188    this.data = '';
2189  }
2190  _write(chunk, encoding, callback) {
2191    if (encoding === 'buffer') {
2192      chunk = this._decoder.write(chunk);
2193    }
2194    this.data += chunk;
2195    callback();
2196  }
2197  _final(callback) {
2198    this.data += this._decoder.end();
2199    callback();
2200  }
2201}
2202
2203const euro = [[0xE2, 0x82], [0xAC]].map(Buffer.from);
2204const w = new StringWritable();
2205
2206w.write('currency: ');
2207w.write(euro[0]);
2208w.end(euro[1]);
2209
2210console.log(w.data); // currency: €
2211```
2212
2213### Implementing a readable stream
2214
2215The `stream.Readable` class is extended to implement a [`Readable`][] stream.
2216
2217Custom `Readable` streams *must* call the `new stream.Readable([options])`
2218constructor and implement the [`readable._read()`][] method.
2219
2220#### `new stream.Readable([options])`
2221<!-- YAML
2222changes:
2223  - version: v14.0.0
2224    pr-url: https://github.com/nodejs/node/pull/30623
2225    description: Change `autoDestroy` option default to `true`.
2226  - version:
2227     - v11.2.0
2228     - v10.16.0
2229    pr-url: https://github.com/nodejs/node/pull/22795
2230    description: Add `autoDestroy` option to automatically `destroy()` the
2231                 stream when it emits `'end'` or errors.
2232-->
2233
2234* `options` {Object}
2235  * `highWaterMark` {number} The maximum [number of bytes][hwm-gotcha] to store
2236    in the internal buffer before ceasing to read from the underlying resource.
2237    **Default:** `16384` (16KB), or `16` for `objectMode` streams.
2238  * `encoding` {string} If specified, then buffers will be decoded to
2239    strings using the specified encoding. **Default:** `null`.
2240  * `objectMode` {boolean} Whether this stream should behave
2241    as a stream of objects. Meaning that [`stream.read(n)`][stream-read] returns
2242    a single value instead of a `Buffer` of size `n`. **Default:** `false`.
2243  * `emitClose` {boolean} Whether or not the stream should emit `'close'`
2244    after it has been destroyed. **Default:** `true`.
2245  * `read` {Function} Implementation for the [`stream._read()`][stream-_read]
2246    method.
2247  * `destroy` {Function} Implementation for the
2248    [`stream._destroy()`][readable-_destroy] method.
2249  * `autoDestroy` {boolean} Whether this stream should automatically call
2250    `.destroy()` on itself after ending. **Default:** `true`.
2251
2252<!-- eslint-disable no-useless-constructor -->
2253```js
2254const { Readable } = require('stream');
2255
2256class MyReadable extends Readable {
2257  constructor(options) {
2258    // Calls the stream.Readable(options) constructor.
2259    super(options);
2260    // ...
2261  }
2262}
2263```
2264
2265Or, when using pre-ES6 style constructors:
2266
2267```js
2268const { Readable } = require('stream');
2269const util = require('util');
2270
2271function MyReadable(options) {
2272  if (!(this instanceof MyReadable))
2273    return new MyReadable(options);
2274  Readable.call(this, options);
2275}
2276util.inherits(MyReadable, Readable);
2277```
2278
2279Or, using the simplified constructor approach:
2280
2281```js
2282const { Readable } = require('stream');
2283
2284const myReadable = new Readable({
2285  read(size) {
2286    // ...
2287  }
2288});
2289```
2290
2291#### `readable._read(size)`
2292<!-- YAML
2293added: v0.9.4
2294-->
2295
2296* `size` {number} Number of bytes to read asynchronously
2297
2298This function MUST NOT be called by application code directly. It should be
2299implemented by child classes, and called by the internal `Readable` class
2300methods only.
2301
2302All `Readable` stream implementations must provide an implementation of the
2303[`readable._read()`][] method to fetch data from the underlying resource.
2304
2305When [`readable._read()`][] is called, if data is available from the resource,
2306the implementation should begin pushing that data into the read queue using the
2307[`this.push(dataChunk)`][stream-push] method. `_read()` will be called again
2308after each call to [`this.push(dataChunk)`][stream-push] once the stream is
2309ready to accept more data. `_read()` may continue reading from the resource and
2310pushing data until `readable.push()` returns `false`. Only when `_read()` is
2311called again after it has stopped should it resume pushing additional data into
2312the queue.
2313
2314Once the [`readable._read()`][] method has been called, it will not be called
2315again until more data is pushed through the [`readable.push()`][stream-push]
2316method. Empty data such as empty buffers and strings will not cause
2317[`readable._read()`][] to be called.
2318
2319The `size` argument is advisory. For implementations where a "read" is a
2320single operation that returns data can use the `size` argument to determine how
2321much data to fetch. Other implementations may ignore this argument and simply
2322provide data whenever it becomes available. There is no need to "wait" until
2323`size` bytes are available before calling [`stream.push(chunk)`][stream-push].
2324
2325The [`readable._read()`][] method is prefixed with an underscore because it is
2326internal to the class that defines it, and should never be called directly by
2327user programs.
2328
2329#### `readable._destroy(err, callback)`
2330<!-- YAML
2331added: v8.0.0
2332-->
2333
2334* `err` {Error} A possible error.
2335* `callback` {Function} A callback function that takes an optional error
2336  argument.
2337
2338The `_destroy()` method is called by [`readable.destroy()`][readable-destroy].
2339It can be overridden by child classes but it **must not** be called directly.
2340
2341#### `readable.push(chunk[, encoding])`
2342<!-- YAML
2343changes:
2344  - version: v8.0.0
2345    pr-url: https://github.com/nodejs/node/pull/11608
2346    description: The `chunk` argument can now be a `Uint8Array` instance.
2347-->
2348
2349* `chunk` {Buffer|Uint8Array|string|null|any} Chunk of data to push into the
2350  read queue. For streams not operating in object mode, `chunk` must be a
2351  string, `Buffer` or `Uint8Array`. For object mode streams, `chunk` may be
2352  any JavaScript value.
2353* `encoding` {string} Encoding of string chunks. Must be a valid
2354  `Buffer` encoding, such as `'utf8'` or `'ascii'`.
2355* Returns: {boolean} `true` if additional chunks of data may continue to be
2356  pushed; `false` otherwise.
2357
2358When `chunk` is a `Buffer`, `Uint8Array` or `string`, the `chunk` of data will
2359be added to the internal queue for users of the stream to consume.
2360Passing `chunk` as `null` signals the end of the stream (EOF), after which no
2361more data can be written.
2362
2363When the `Readable` is operating in paused mode, the data added with
2364`readable.push()` can be read out by calling the
2365[`readable.read()`][stream-read] method when the [`'readable'`][] event is
2366emitted.
2367
2368When the `Readable` is operating in flowing mode, the data added with
2369`readable.push()` will be delivered by emitting a `'data'` event.
2370
2371The `readable.push()` method is designed to be as flexible as possible. For
2372example, when wrapping a lower-level source that provides some form of
2373pause/resume mechanism, and a data callback, the low-level source can be wrapped
2374by the custom `Readable` instance:
2375
2376```js
2377// `_source` is an object with readStop() and readStart() methods,
2378// and an `ondata` member that gets called when it has data, and
2379// an `onend` member that gets called when the data is over.
2380
2381class SourceWrapper extends Readable {
2382  constructor(options) {
2383    super(options);
2384
2385    this._source = getLowLevelSourceObject();
2386
2387    // Every time there's data, push it into the internal buffer.
2388    this._source.ondata = (chunk) => {
2389      // If push() returns false, then stop reading from source.
2390      if (!this.push(chunk))
2391        this._source.readStop();
2392    };
2393
2394    // When the source ends, push the EOF-signaling `null` chunk.
2395    this._source.onend = () => {
2396      this.push(null);
2397    };
2398  }
2399  // _read() will be called when the stream wants to pull more data in.
2400  // The advisory size argument is ignored in this case.
2401  _read(size) {
2402    this._source.readStart();
2403  }
2404}
2405```
2406
2407The `readable.push()` method is used to push the content
2408into the internal buffer. It can be driven by the [`readable._read()`][] method.
2409
2410For streams not operating in object mode, if the `chunk` parameter of
2411`readable.push()` is `undefined`, it will be treated as empty string or
2412buffer. See [`readable.push('')`][] for more information.
2413
2414#### Errors while reading
2415
2416Errors occurring during processing of the [`readable._read()`][] must be
2417propagated through the [`readable.destroy(err)`][readable-_destroy] method.
2418Throwing an `Error` from within [`readable._read()`][] or manually emitting an
2419`'error'` event results in undefined behavior.
2420
2421```js
2422const { Readable } = require('stream');
2423
2424const myReadable = new Readable({
2425  read(size) {
2426    const err = checkSomeErrorCondition();
2427    if (err) {
2428      this.destroy(err);
2429    } else {
2430      // Do some work.
2431    }
2432  }
2433});
2434```
2435
2436#### An example counting stream
2437
2438<!--type=example-->
2439
2440The following is a basic example of a `Readable` stream that emits the numerals
2441from 1 to 1,000,000 in ascending order, and then ends.
2442
2443```js
2444const { Readable } = require('stream');
2445
2446class Counter extends Readable {
2447  constructor(opt) {
2448    super(opt);
2449    this._max = 1000000;
2450    this._index = 1;
2451  }
2452
2453  _read() {
2454    const i = this._index++;
2455    if (i > this._max)
2456      this.push(null);
2457    else {
2458      const str = String(i);
2459      const buf = Buffer.from(str, 'ascii');
2460      this.push(buf);
2461    }
2462  }
2463}
2464```
2465
2466### Implementing a duplex stream
2467
2468A [`Duplex`][] stream is one that implements both [`Readable`][] and
2469[`Writable`][], such as a TCP socket connection.
2470
2471Because JavaScript does not have support for multiple inheritance, the
2472`stream.Duplex` class is extended to implement a [`Duplex`][] stream (as opposed
2473to extending the `stream.Readable` *and* `stream.Writable` classes).
2474
2475The `stream.Duplex` class prototypically inherits from `stream.Readable` and
2476parasitically from `stream.Writable`, but `instanceof` will work properly for
2477both base classes due to overriding [`Symbol.hasInstance`][] on
2478`stream.Writable`.
2479
2480Custom `Duplex` streams *must* call the `new stream.Duplex([options])`
2481constructor and implement *both* the [`readable._read()`][] and
2482`writable._write()` methods.
2483
2484#### `new stream.Duplex(options)`
2485<!-- YAML
2486changes:
2487  - version: v8.4.0
2488    pr-url: https://github.com/nodejs/node/pull/14636
2489    description: The `readableHighWaterMark` and `writableHighWaterMark` options
2490                 are supported now.
2491-->
2492
2493* `options` {Object} Passed to both `Writable` and `Readable`
2494  constructors. Also has the following fields:
2495  * `allowHalfOpen` {boolean} If set to `false`, then the stream will
2496    automatically end the writable side when the readable side ends.
2497    **Default:** `true`.
2498  * `readable` {boolean} Sets whether the `Duplex` should be readable.
2499    **Default:** `true`.
2500  * `writable` {boolean} Sets whether the `Duplex` should be writable.
2501    **Default:** `true`.
2502  * `readableObjectMode` {boolean} Sets `objectMode` for readable side of the
2503    stream. Has no effect if `objectMode` is `true`. **Default:** `false`.
2504  * `writableObjectMode` {boolean} Sets `objectMode` for writable side of the
2505    stream. Has no effect if `objectMode` is `true`. **Default:** `false`.
2506  * `readableHighWaterMark` {number} Sets `highWaterMark` for the readable side
2507    of the stream. Has no effect if `highWaterMark` is provided.
2508  * `writableHighWaterMark` {number} Sets `highWaterMark` for the writable side
2509    of the stream. Has no effect if `highWaterMark` is provided.
2510
2511<!-- eslint-disable no-useless-constructor -->
2512```js
2513const { Duplex } = require('stream');
2514
2515class MyDuplex extends Duplex {
2516  constructor(options) {
2517    super(options);
2518    // ...
2519  }
2520}
2521```
2522
2523Or, when using pre-ES6 style constructors:
2524
2525```js
2526const { Duplex } = require('stream');
2527const util = require('util');
2528
2529function MyDuplex(options) {
2530  if (!(this instanceof MyDuplex))
2531    return new MyDuplex(options);
2532  Duplex.call(this, options);
2533}
2534util.inherits(MyDuplex, Duplex);
2535```
2536
2537Or, using the simplified constructor approach:
2538
2539```js
2540const { Duplex } = require('stream');
2541
2542const myDuplex = new Duplex({
2543  read(size) {
2544    // ...
2545  },
2546  write(chunk, encoding, callback) {
2547    // ...
2548  }
2549});
2550```
2551
2552#### An example duplex stream
2553
2554The following illustrates a simple example of a `Duplex` stream that wraps a
2555hypothetical lower-level source object to which data can be written, and
2556from which data can be read, albeit using an API that is not compatible with
2557Node.js streams.
2558The following illustrates a simple example of a `Duplex` stream that buffers
2559incoming written data via the [`Writable`][] interface that is read back out
2560via the [`Readable`][] interface.
2561
2562```js
2563const { Duplex } = require('stream');
2564const kSource = Symbol('source');
2565
2566class MyDuplex extends Duplex {
2567  constructor(source, options) {
2568    super(options);
2569    this[kSource] = source;
2570  }
2571
2572  _write(chunk, encoding, callback) {
2573    // The underlying source only deals with strings.
2574    if (Buffer.isBuffer(chunk))
2575      chunk = chunk.toString();
2576    this[kSource].writeSomeData(chunk);
2577    callback();
2578  }
2579
2580  _read(size) {
2581    this[kSource].fetchSomeData(size, (data, encoding) => {
2582      this.push(Buffer.from(data, encoding));
2583    });
2584  }
2585}
2586```
2587
2588The most important aspect of a `Duplex` stream is that the `Readable` and
2589`Writable` sides operate independently of one another despite co-existing within
2590a single object instance.
2591
2592#### Object mode duplex streams
2593
2594For `Duplex` streams, `objectMode` can be set exclusively for either the
2595`Readable` or `Writable` side using the `readableObjectMode` and
2596`writableObjectMode` options respectively.
2597
2598In the following example, for instance, a new `Transform` stream (which is a
2599type of [`Duplex`][] stream) is created that has an object mode `Writable` side
2600that accepts JavaScript numbers that are converted to hexadecimal strings on
2601the `Readable` side.
2602
2603```js
2604const { Transform } = require('stream');
2605
2606// All Transform streams are also Duplex Streams.
2607const myTransform = new Transform({
2608  writableObjectMode: true,
2609
2610  transform(chunk, encoding, callback) {
2611    // Coerce the chunk to a number if necessary.
2612    chunk |= 0;
2613
2614    // Transform the chunk into something else.
2615    const data = chunk.toString(16);
2616
2617    // Push the data onto the readable queue.
2618    callback(null, '0'.repeat(data.length % 2) + data);
2619  }
2620});
2621
2622myTransform.setEncoding('ascii');
2623myTransform.on('data', (chunk) => console.log(chunk));
2624
2625myTransform.write(1);
2626// Prints: 01
2627myTransform.write(10);
2628// Prints: 0a
2629myTransform.write(100);
2630// Prints: 64
2631```
2632
2633### Implementing a transform stream
2634
2635A [`Transform`][] stream is a [`Duplex`][] stream where the output is computed
2636in some way from the input. Examples include [zlib][] streams or [crypto][]
2637streams that compress, encrypt, or decrypt data.
2638
2639There is no requirement that the output be the same size as the input, the same
2640number of chunks, or arrive at the same time. For example, a `Hash` stream will
2641only ever have a single chunk of output which is provided when the input is
2642ended. A `zlib` stream will produce output that is either much smaller or much
2643larger than its input.
2644
2645The `stream.Transform` class is extended to implement a [`Transform`][] stream.
2646
2647The `stream.Transform` class prototypically inherits from `stream.Duplex` and
2648implements its own versions of the `writable._write()` and
2649[`readable._read()`][] methods. Custom `Transform` implementations *must*
2650implement the [`transform._transform()`][stream-_transform] method and *may*
2651also implement the [`transform._flush()`][stream-_flush] method.
2652
2653Care must be taken when using `Transform` streams in that data written to the
2654stream can cause the `Writable` side of the stream to become paused if the
2655output on the `Readable` side is not consumed.
2656
2657#### `new stream.Transform([options])`
2658
2659* `options` {Object} Passed to both `Writable` and `Readable`
2660  constructors. Also has the following fields:
2661  * `transform` {Function} Implementation for the
2662    [`stream._transform()`][stream-_transform] method.
2663  * `flush` {Function} Implementation for the [`stream._flush()`][stream-_flush]
2664    method.
2665
2666<!-- eslint-disable no-useless-constructor -->
2667```js
2668const { Transform } = require('stream');
2669
2670class MyTransform extends Transform {
2671  constructor(options) {
2672    super(options);
2673    // ...
2674  }
2675}
2676```
2677
2678Or, when using pre-ES6 style constructors:
2679
2680```js
2681const { Transform } = require('stream');
2682const util = require('util');
2683
2684function MyTransform(options) {
2685  if (!(this instanceof MyTransform))
2686    return new MyTransform(options);
2687  Transform.call(this, options);
2688}
2689util.inherits(MyTransform, Transform);
2690```
2691
2692Or, using the simplified constructor approach:
2693
2694```js
2695const { Transform } = require('stream');
2696
2697const myTransform = new Transform({
2698  transform(chunk, encoding, callback) {
2699    // ...
2700  }
2701});
2702```
2703
2704#### Event: `'end'`
2705
2706The [`'end'`][] event is from the `stream.Readable` class. The `'end'` event is
2707emitted after all data has been output, which occurs after the callback in
2708[`transform._flush()`][stream-_flush] has been called. In the case of an error,
2709`'end'` should not be emitted.
2710
2711#### Event: `'finish'`
2712
2713The [`'finish'`][] event is from the `stream.Writable` class. The `'finish'`
2714event is emitted after [`stream.end()`][stream-end] is called and all chunks
2715have been processed by [`stream._transform()`][stream-_transform]. In the case
2716of an error, `'finish'` should not be emitted.
2717
2718#### `transform._flush(callback)`
2719
2720* `callback` {Function} A callback function (optionally with an error
2721  argument and data) to be called when remaining data has been flushed.
2722
2723This function MUST NOT be called by application code directly. It should be
2724implemented by child classes, and called by the internal `Readable` class
2725methods only.
2726
2727In some cases, a transform operation may need to emit an additional bit of
2728data at the end of the stream. For example, a `zlib` compression stream will
2729store an amount of internal state used to optimally compress the output. When
2730the stream ends, however, that additional data needs to be flushed so that the
2731compressed data will be complete.
2732
2733Custom [`Transform`][] implementations *may* implement the `transform._flush()`
2734method. This will be called when there is no more written data to be consumed,
2735but before the [`'end'`][] event is emitted signaling the end of the
2736[`Readable`][] stream.
2737
2738Within the `transform._flush()` implementation, the `transform.push()` method
2739may be called zero or more times, as appropriate. The `callback` function must
2740be called when the flush operation is complete.
2741
2742The `transform._flush()` method is prefixed with an underscore because it is
2743internal to the class that defines it, and should never be called directly by
2744user programs.
2745
2746#### `transform._transform(chunk, encoding, callback)`
2747
2748* `chunk` {Buffer|string|any} The `Buffer` to be transformed, converted from
2749  the `string` passed to [`stream.write()`][stream-write]. If the stream's
2750  `decodeStrings` option is `false` or the stream is operating in object mode,
2751  the chunk will not be converted & will be whatever was passed to
2752  [`stream.write()`][stream-write].
2753* `encoding` {string} If the chunk is a string, then this is the
2754  encoding type. If chunk is a buffer, then this is the special
2755  value `'buffer'`. Ignore it in that case.
2756* `callback` {Function} A callback function (optionally with an error
2757  argument and data) to be called after the supplied `chunk` has been
2758  processed.
2759
2760This function MUST NOT be called by application code directly. It should be
2761implemented by child classes, and called by the internal `Readable` class
2762methods only.
2763
2764All `Transform` stream implementations must provide a `_transform()`
2765method to accept input and produce output. The `transform._transform()`
2766implementation handles the bytes being written, computes an output, then passes
2767that output off to the readable portion using the `transform.push()` method.
2768
2769The `transform.push()` method may be called zero or more times to generate
2770output from a single input chunk, depending on how much is to be output
2771as a result of the chunk.
2772
2773It is possible that no output is generated from any given chunk of input data.
2774
2775The `callback` function must be called only when the current chunk is completely
2776consumed. The first argument passed to the `callback` must be an `Error` object
2777if an error occurred while processing the input or `null` otherwise. If a second
2778argument is passed to the `callback`, it will be forwarded on to the
2779`transform.push()` method. In other words, the following are equivalent:
2780
2781```js
2782transform.prototype._transform = function(data, encoding, callback) {
2783  this.push(data);
2784  callback();
2785};
2786
2787transform.prototype._transform = function(data, encoding, callback) {
2788  callback(null, data);
2789};
2790```
2791
2792The `transform._transform()` method is prefixed with an underscore because it
2793is internal to the class that defines it, and should never be called directly by
2794user programs.
2795
2796`transform._transform()` is never called in parallel; streams implement a
2797queue mechanism, and to receive the next chunk, `callback` must be
2798called, either synchronously or asynchronously.
2799
2800#### Class: `stream.PassThrough`
2801
2802The `stream.PassThrough` class is a trivial implementation of a [`Transform`][]
2803stream that simply passes the input bytes across to the output. Its purpose is
2804primarily for examples and testing, but there are some use cases where
2805`stream.PassThrough` is useful as a building block for novel sorts of streams.
2806
2807## Additional notes
2808
2809<!--type=misc-->
2810
2811### Streams compatibility with async generators and async iterators
2812
2813With the support of async generators and iterators in JavaScript, async
2814generators are effectively a first-class language-level stream construct at
2815this point.
2816
2817Some common interop cases of using Node.js streams with async generators
2818and async iterators are provided below.
2819
2820#### Consuming readable streams with async iterators
2821
2822```js
2823(async function() {
2824  for await (const chunk of readable) {
2825    console.log(chunk);
2826  }
2827})();
2828```
2829
2830Async iterators register a permanent error handler on the stream to prevent any
2831unhandled post-destroy errors.
2832
2833#### Creating readable streams with async generators
2834
2835We can construct a Node.js readable stream from an asynchronous generator
2836using the `Readable.from()` utility method:
2837
2838```js
2839const { Readable } = require('stream');
2840
2841async function * generate() {
2842  yield 'a';
2843  yield 'b';
2844  yield 'c';
2845}
2846
2847const readable = Readable.from(generate());
2848
2849readable.on('data', (chunk) => {
2850  console.log(chunk);
2851});
2852```
2853
2854#### Piping to writable streams from async iterators
2855
2856When writing to a writable stream from an async iterator, ensure correct
2857handling of backpressure and errors. [`stream.pipeline()`][] abstracts away
2858the handling of backpressure and backpressure-related errors:
2859
2860```js
2861const { pipeline } = require('stream');
2862const util = require('util');
2863const fs = require('fs');
2864
2865const writable = fs.createWriteStream('./file');
2866
2867// Callback Pattern
2868pipeline(iterator, writable, (err, value) => {
2869  if (err) {
2870    console.error(err);
2871  } else {
2872    console.log(value, 'value returned');
2873  }
2874});
2875
2876// Promise Pattern
2877const pipelinePromise = util.promisify(pipeline);
2878pipelinePromise(iterator, writable)
2879  .then((value) => {
2880    console.log(value, 'value returned');
2881  })
2882  .catch(console.error);
2883```
2884
2885<!--type=misc-->
2886
2887### Compatibility with older Node.js versions
2888
2889<!--type=misc-->
2890
2891Prior to Node.js 0.10, the `Readable` stream interface was simpler, but also
2892less powerful and less useful.
2893
2894* Rather than waiting for calls to the [`stream.read()`][stream-read] method,
2895  [`'data'`][] events would begin emitting immediately. Applications that
2896  would need to perform some amount of work to decide how to handle data
2897  were required to store read data into buffers so the data would not be lost.
2898* The [`stream.pause()`][stream-pause] method was advisory, rather than
2899  guaranteed. This meant that it was still necessary to be prepared to receive
2900  [`'data'`][] events *even when the stream was in a paused state*.
2901
2902In Node.js 0.10, the [`Readable`][] class was added. For backward
2903compatibility with older Node.js programs, `Readable` streams switch into
2904"flowing mode" when a [`'data'`][] event handler is added, or when the
2905[`stream.resume()`][stream-resume] method is called. The effect is that, even
2906when not using the new [`stream.read()`][stream-read] method and
2907[`'readable'`][] event, it is no longer necessary to worry about losing
2908[`'data'`][] chunks.
2909
2910While most applications will continue to function normally, this introduces an
2911edge case in the following conditions:
2912
2913* No [`'data'`][] event listener is added.
2914* The [`stream.resume()`][stream-resume] method is never called.
2915* The stream is not piped to any writable destination.
2916
2917For example, consider the following code:
2918
2919```js
2920// WARNING!  BROKEN!
2921net.createServer((socket) => {
2922
2923  // We add an 'end' listener, but never consume the data.
2924  socket.on('end', () => {
2925    // It will never get here.
2926    socket.end('The message was received but was not processed.\n');
2927  });
2928
2929}).listen(1337);
2930```
2931
2932Prior to Node.js 0.10, the incoming message data would be simply discarded.
2933However, in Node.js 0.10 and beyond, the socket remains paused forever.
2934
2935The workaround in this situation is to call the
2936[`stream.resume()`][stream-resume] method to begin the flow of data:
2937
2938```js
2939// Workaround.
2940net.createServer((socket) => {
2941  socket.on('end', () => {
2942    socket.end('The message was received but was not processed.\n');
2943  });
2944
2945  // Start the flow of data, discarding it.
2946  socket.resume();
2947}).listen(1337);
2948```
2949
2950In addition to new `Readable` streams switching into flowing mode,
2951pre-0.10 style streams can be wrapped in a `Readable` class using the
2952[`readable.wrap()`][`stream.wrap()`] method.
2953
2954### `readable.read(0)`
2955
2956There are some cases where it is necessary to trigger a refresh of the
2957underlying readable stream mechanisms, without actually consuming any
2958data. In such cases, it is possible to call `readable.read(0)`, which will
2959always return `null`.
2960
2961If the internal read buffer is below the `highWaterMark`, and the
2962stream is not currently reading, then calling `stream.read(0)` will trigger
2963a low-level [`stream._read()`][stream-_read] call.
2964
2965While most applications will almost never need to do this, there are
2966situations within Node.js where this is done, particularly in the
2967`Readable` stream class internals.
2968
2969### `readable.push('')`
2970
2971Use of `readable.push('')` is not recommended.
2972
2973Pushing a zero-byte string, `Buffer` or `Uint8Array` to a stream that is not in
2974object mode has an interesting side effect. Because it *is* a call to
2975[`readable.push()`][stream-push], the call will end the reading process.
2976However, because the argument is an empty string, no data is added to the
2977readable buffer so there is nothing for a user to consume.
2978
2979### `highWaterMark` discrepancy after calling `readable.setEncoding()`
2980
2981The use of `readable.setEncoding()` will change the behavior of how the
2982`highWaterMark` operates in non-object mode.
2983
2984Typically, the size of the current buffer is measured against the
2985`highWaterMark` in _bytes_. However, after `setEncoding()` is called, the
2986comparison function will begin to measure the buffer's size in _characters_.
2987
2988This is not a problem in common cases with `latin1` or `ascii`. But it is
2989advised to be mindful about this behavior when working with strings that could
2990contain multi-byte characters.
2991
2992[API for stream consumers]: #stream_api_for_stream_consumers
2993[API for stream implementers]: #stream_api_for_stream_implementers
2994[Compatibility]: #stream_compatibility_with_older_node_js_versions
2995[HTTP requests, on the client]: http.md#http_class_http_clientrequest
2996[HTTP responses, on the server]: http.md#http_class_http_serverresponse
2997[TCP sockets]: net.md#net_class_net_socket
2998[Three states]: #stream_three_states
2999[`'data'`]: #stream_event_data
3000[`'drain'`]: #stream_event_drain
3001[`'end'`]: #stream_event_end
3002[`'finish'`]: #stream_event_finish
3003[`'readable'`]: #stream_event_readable
3004[`Duplex`]: #stream_class_stream_duplex
3005[`EventEmitter`]: events.md#events_class_eventemitter
3006[`Readable`]: #stream_class_stream_readable
3007[`Symbol.hasInstance`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Symbol/hasInstance
3008[`Transform`]: #stream_class_stream_transform
3009[`Writable`]: #stream_class_stream_writable
3010[`fs.createReadStream()`]: fs.md#fs_fs_createreadstream_path_options
3011[`fs.createWriteStream()`]: fs.md#fs_fs_createwritestream_path_options
3012[`net.Socket`]: net.md#net_class_net_socket
3013[`process.stderr`]: process.md#process_process_stderr
3014[`process.stdin`]: process.md#process_process_stdin
3015[`process.stdout`]: process.md#process_process_stdout
3016[`readable._read()`]: #stream_readable_read_size_1
3017[`readable.push('')`]: #stream_readable_push
3018[`readable.setEncoding()`]: #stream_readable_setencoding_encoding
3019[`stream.Readable.from()`]: #stream_stream_readable_from_iterable_options
3020[`stream.cork()`]: #stream_writable_cork
3021[`stream.finished()`]: #stream_stream_finished_stream_options_callback
3022[`stream.pipe()`]: #stream_readable_pipe_destination_options
3023[`stream.pipeline()`]: #stream_stream_pipeline_source_transforms_destination_callback
3024[`stream.uncork()`]: #stream_writable_uncork
3025[`stream.unpipe()`]: #stream_readable_unpipe_destination
3026[`stream.wrap()`]: #stream_readable_wrap_stream
3027[`writable._final()`]: #stream_writable_final_callback
3028[`writable._write()`]: #stream_writable_write_chunk_encoding_callback_1
3029[`writable._writev()`]: #stream_writable_writev_chunks_callback
3030[`writable.cork()`]: #stream_writable_cork
3031[`writable.end()`]: #stream_writable_end_chunk_encoding_callback
3032[`writable.uncork()`]: #stream_writable_uncork
3033[`writable.writableFinished`]: #stream_writable_writablefinished
3034[`zlib.createDeflate()`]: zlib.md#zlib_zlib_createdeflate_options
3035[child process stdin]: child_process.md#child_process_subprocess_stdin
3036[child process stdout and stderr]: child_process.md#child_process_subprocess_stdout
3037[crypto]: crypto.md
3038[fs read streams]: fs.md#fs_class_fs_readstream
3039[fs write streams]: fs.md#fs_class_fs_writestream
3040[http-incoming-message]: http.md#http_class_http_incomingmessage
3041[hwm-gotcha]: #stream_highwatermark_discrepancy_after_calling_readable_setencoding
3042[object-mode]: #stream_object_mode
3043[readable-_destroy]: #stream_readable_destroy_err_callback
3044[readable-destroy]: #stream_readable_destroy_error
3045[stream-_final]: #stream_writable_final_callback
3046[stream-_flush]: #stream_transform_flush_callback
3047[stream-_read]: #stream_readable_read_size_1
3048[stream-_transform]: #stream_transform_transform_chunk_encoding_callback
3049[stream-_write]: #stream_writable_write_chunk_encoding_callback_1
3050[stream-_writev]: #stream_writable_writev_chunks_callback
3051[stream-end]: #stream_writable_end_chunk_encoding_callback
3052[stream-pause]: #stream_readable_pause
3053[stream-push]: #stream_readable_push_chunk_encoding
3054[stream-read]: #stream_readable_read_size
3055[stream-resume]: #stream_readable_resume
3056[stream-uncork]: #stream_writable_uncork
3057[stream-write]: #stream_writable_write_chunk_encoding_callback
3058[writable-_destroy]: #stream_writable_destroy_err_callback
3059[writable-destroy]: #stream_writable_destroy_error
3060[writable-new]: #stream_new_stream_writable_options
3061[zlib]: zlib.md
3062