
Mastering Node.js Streams: From Readable to Duplex and Beyond

Explore the inner workings of Node.js streams—including readable, writable, duplex, transform, and pipe—through detailed explanations, code demos, and visual diagrams, revealing how backpressure, flow modes, and internal mechanisms shape data handling in backend development.

Taobao Frontend Technology

Many developers are already familiar with Node.js stream objects, whether they are request streams, response streams, file streams, or socket streams. All of these are built on the core stream module, and even console.log uses it internally. You can see this by opening the Node.js runtime source file lib/console.js:

<code>function write(ignoreErrors, stream, string, errorhandler) {
  // ...
  stream.once('error', noop);
  stream.write(string, errorhandler);
}
Console.prototype.log = function log(...args) {
  write(this._ignoreErrors, this._stdout, util.format.apply(null, args));
};
</code>

Understanding the stream module makes many other Node.js modules easier to grasp.

Stream Module

If you know the classic producer‑consumer problem, the concept of a stream becomes straightforward. A stream is essentially a state‑management unit that moves data from a source to a destination while controlling flow. The most relevant source files are:

lib/stream.js
lib/_stream_readable.js
lib/_stream_writable.js
lib/_stream_transform.js
lib/_stream_duplex.js

Once you understand Readable and Writable, the other stream types become easy to follow.

Readable Stream

A Readable stream has two operating modes. The first is Flowing Mode, which automatically emits data events once a listener is attached:

<code>const readable = getReadableStreamSomehow();
readable.on('data', chunk => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
</code>

Data is first pushed into an internal buffer governed by a highWaterMark threshold. Once the buffered data reaches that threshold, stream.push() returns false, signaling back-pressure. Back-pressure builds up when the consumer pauses the stream (stream.pause()) or simply consumes more slowly than the producer produces.

Below is a simple demo that implements a custom Readable subclass:

<code>const { Readable } = require('stream');
class MyReadable extends Readable {
  constructor(dataSource, options) {
    super(options);
    this.dataSource = dataSource;
  }
  // Called whenever the stream wants more data; push(null) signals EOF.
  _read() {
    const data = this.dataSource.makeData();
    this.push(data);
  }
}
const dataSource = {
  data: new Array(10).fill('-'),
  makeData() {
    if (!this.data.length) return null; // exhausted: ends the stream
    return this.data.pop();
  }
};
const myReadable = new MyReadable(dataSource);
myReadable.setEncoding('utf8');
myReadable.on('data', chunk => console.log(chunk));
</code>

The second mode is Non-Flowing (paused) Mode, which is the default. Internally, _readableState.flowing can be null (initial), false (paused), or true (flowing). When you attach a readable listener, you must call myReadable.read() yourself to pull data out of the buffer:

<code>myReadable.on('readable', () => {
  let chunk;
  while (null !== (chunk = myReadable.read())) {
    console.log(`Received ${chunk.length} bytes of data.`);
  }
});
</code>

The internal buffer's highWaterMark defaults to 16 KB (16384 bytes), and Node.js caps the value it will grow to at 8 MB (MAX_HWM in lib/_stream_readable.js).

Writable Stream

A Writable stream receives data and writes it to a destination. When the producer writes faster than the consumer can handle, data queues in an internal buffer. Once the buffer fills, back-pressure is signaled and the producer should pause until the drain event is emitted.

<code>const { Writable } = require('stream');
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 10000;
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // Last chunk: pass the final callback.
        writer.write(data, encoding, callback);
      } else {
        // ok becomes false when the internal buffer is full.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // Back-pressure: wait for the buffer to drain, then continue.
      console.log('drain', i);
      writer.once('drain', write);
    }
  }
  write();
}
const writer = new Writable({
  write(chunk, encoding, callback) {
    // Simulate an asynchronous consumer.
    setTimeout(() => callback(), 0);
  }
});
writeOneMillionTimes(writer, 'simple', 'utf8', () => console.log('end'));
</code>

The demo prints several drain messages, showing that back-pressure kicked in repeatedly before the final 'end'.

Pipe

The pipe method connects a readable stream to a writable stream and handles back-pressure automatically. A simplified implementation looks like this:

<code>// Simplified: the real implementation also handles 'end',
// 'error', and unpipe cleanup.
Readable.prototype.pipe = function (writable, options) {
  this.on('data', chunk => {
    const ok = writable.write(chunk);
    if (!ok) this.pause(); // destination is full: pause the source
  });
  writable.on('drain', () => this.resume()); // destination drained: resume
  writable.emit('pipe', this);
  return writable;
};
</code>

This logic emits a pipe event on the destination, writes each chunk, pauses the source when the destination signals back-pressure, and resumes once drain fires.

Duplex Stream

A Duplex stream combines readable and writable capabilities. Internally it inherits from Readable and copies over all the Writable prototype methods:

<code>const util = require('util');
// The internal modules export the constructors directly.
const Readable = require('_stream_readable');
const Writable = require('_stream_writable');
function Duplex(opts) {
  Readable.call(this, opts);
  Writable.call(this, opts);
}
util.inherits(Duplex, Readable);
// Graft the writable methods onto Duplex's prototype.
Object.keys(Writable.prototype).forEach(method => {
  if (!Duplex.prototype[method]) {
    Duplex.prototype[method] = Writable.prototype[method];
  }
});
</code>

A simple demo shows independent read and write flows:

<code>const { Duplex } = require('stream');
const duplex = new Duplex();
let i = 2;
duplex._read = function () {
  this.push(i-- ? 'read ' + i : null);
};
duplex.on('data', data => console.log(data.toString()));
duplex._write = function (chunk, enc, cb) {
  console.log(chunk.toString());
  cb();
};
duplex.write('write');
</code>

Running this code prints:

<code>write
read 1
read 0
</code>

Transform Stream

A transform stream is a duplex stream whose output is derived from its input via a transformation function. It is commonly used for tasks such as compression or data format conversion.

<code>const { Transform } = require('stream');
const MAP = { Barret: '靖', Lee: '李' };
class Translate extends Transform {
  constructor(options) { super(options); }
  _transform(buf, enc, next) {
    const key = buf.toString();
    const data = MAP[key];
    this.push(data);
    next();
  }
}
const transform = new Translate();
transform.on('data', d => console.log(d.toString()));
transform.write('Lee');
transform.write('Barret');
transform.end();
</code>

The output is:

<code>李
靖
</code>

Conclusion

This article mainly references the official Node.js documentation and source code. It covers the fundamentals of streams, including back‑pressure, flow modes, and the core implementations of readable, writable, duplex, and transform streams. For a deeper mastery, readers are encouraged to read the official docs and experiment with code.

Understanding these internal mechanisms greatly aids future work with higher‑level Node.js APIs, especially for developers new to the platform.

Image credit: https://unsplash.com/photos/pOWBHdgy1Lo by @Neven Krcmarek