rvagg / through2

Tiny wrapper around Node.js streams.Transform (Streams2/3) to avoid explicit subclassing noise

Quick Overview

Through2 is a small wrapper around Node.js transform streams (stream.Transform) that lets you create them from a single function instead of subclassing. It stays a thin layer over the core stream module, so the streams it produces remain fully compatible with the Node.js stream APIs.

Pros

  • Simplifies the creation of transform streams
  • Maintains full compatibility with Node.js stream APIs
  • Lightweight and has minimal dependencies
  • Supports both object mode and buffer mode streams

Cons

  • May add a slight overhead compared to using raw Node.js streams
  • Limited to transform streams only (not for readable or writable streams)
  • Requires understanding of Node.js streams concepts
  • Some users might find the API less intuitive compared to other stream libraries

Code Examples

  1. Basic usage:
const through2 = require('through2');

const upperCaseStream = through2(function(chunk, enc, callback) {
  this.push(chunk.toString().toUpperCase());
  callback();
});

process.stdin.pipe(upperCaseStream).pipe(process.stdout);

This example creates a transform stream that converts input text to uppercase.

  2. Object mode stream:
const through2 = require('through2');

const multiplyByTwoStream = through2.obj(function(chunk, enc, callback) {
  this.push({ value: chunk.value * 2 });
  callback();
});

[{ value: 1 }, { value: 2 }, { value: 3 }]
  .forEach(obj => multiplyByTwoStream.write(obj));

multiplyByTwoStream.on('data', (data) => console.log(data));

This example demonstrates an object mode stream that multiplies input values by two.

  3. Async transform:
const through2 = require('through2');

const delayedUpperCaseStream = through2(function(chunk, enc, callback) {
  setTimeout(() => {
    this.push(chunk.toString().toUpperCase());
    callback();
  }, 1000);
});

process.stdin.pipe(delayedUpperCaseStream).pipe(process.stdout);

This example shows how to create an asynchronous transform stream with a delay.

Getting Started

To use Through2 in your project, follow these steps:

  1. Install the package:

    npm install through2
    
  2. Import and use in your code:

    const through2 = require('through2');
    
    const transformStream = through2(function(chunk, enc, callback) {
      // Transform the chunk here
      this.push(transformedChunk);
      callback();
    });
    
    sourceStream.pipe(transformStream).pipe(destinationStream);
    

Replace sourceStream and destinationStream with your actual input and output streams, and implement the transformation logic inside the function passed to through2.
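For instance, a minimal concrete version of that template might read a file, uppercase its contents, and write the result back out (a sketch; input.txt and output.txt are placeholder file names):

    const fs = require('fs');
    const through2 = require('through2');

    // Placeholder source and destination streams for illustration
    const sourceStream = fs.createReadStream('input.txt');
    const destinationStream = fs.createWriteStream('output.txt');

    const transformStream = through2(function(chunk, enc, callback) {
      // Example transformation: uppercase the incoming text
      this.push(chunk.toString().toUpperCase());
      callback();
    });

    sourceStream.pipe(transformStream).pipe(destinationStream);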

README

through2

A tiny wrapper around Node.js streams.Transform (Streams2/3) to avoid explicit subclassing noise

Inspired by Dominic Tarr's through in that it's so much easier to make a stream out of a function than it is to set up the prototype chain properly: through(function (chunk) { ... }).

const fs = require('fs')
const through2 = require('through2')

fs.createReadStream('ex.txt')
  .pipe(through2(function (chunk, enc, callback) {
    for (let i = 0; i < chunk.length; i++)
      if (chunk[i] == 97)
        chunk[i] = 122 // swap 'a' for 'z'

    this.push(chunk)

    callback()
  }))
  .pipe(fs.createWriteStream('out.txt'))
  .on('finish', () => doSomethingSpecial())

Or object streams:

const fs = require('fs')
const csv2 = require('csv2') // a streaming CSV parser that emits row arrays
const through2 = require('through2')

const all = []

fs.createReadStream('data.csv')
  .pipe(csv2())
  .pipe(through2.obj(function (chunk, enc, callback) {
    const data = {
        name    : chunk[0]
      , address : chunk[3]
      , phone   : chunk[10]
    }
    this.push(data)

    callback()
  }))
  .on('data', (data) => {
    all.push(data)
  })
  .on('end', () => {
    doSomethingSpecial(all)
  })

Note that through2.obj(fn) is a convenience wrapper around through2({ objectMode: true }, fn).
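For illustration, a sketch of that equivalence using a trivial pass-through transform (the variable names are illustrative):

const through2 = require('through2')

// These two object-mode streams behave identically
const viaObj = through2.obj(function (chunk, enc, callback) {
  callback(null, chunk)
})

const viaOptions = through2({ objectMode: true }, function (chunk, enc, callback) {
  callback(null, chunk)
})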

Do you need this?

Since Node.js introduced Simplified Stream Construction, many uses of through2 have become redundant. Consider whether you really need to use through2 or just want to use the 'readable-stream' package, or the core 'stream' package (which is derived from 'readable-stream'):

const { Transform } = require('readable-stream')

const transformer = new Transform({
  transform(chunk, enc, callback) {
    // ...
  }
})
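For example, the uppercase transform used earlier on this page can be written against the built-in Transform directly, with no through2 at all (a sketch using the 'readable-stream' import shown above; the core 'stream' module works the same way):

const { Transform } = require('readable-stream')

const upperCase = new Transform({
  transform (chunk, enc, callback) {
    // callback(err, data) pushes data, much like through2's shorthand
    callback(null, chunk.toString().toUpperCase())
  }
})

process.stdin.pipe(upperCase).pipe(process.stdout)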

API

through2([ options, ] [ transformFunction ] [, flushFunction ])

Consult the stream.Transform documentation for the exact rules of the transformFunction (i.e. this._transform) and the optional flushFunction (i.e. this._flush).

options

The options argument is optional and is passed straight through to stream.Transform. So you can use objectMode:true if you are processing non-binary streams (or just use through2.obj()).

The options argument is first, unlike standard convention, because if I'm passing in an anonymous function then I'd prefer for the options argument to not get lost at the end of the call:

fs.createReadStream('/tmp/important.dat')
  .pipe(through2({ objectMode: true, allowHalfOpen: false },
    (chunk, enc, cb) => {
      cb(null, 'wut?') // note we can use the second argument on the callback
                       // to provide data as an alternative to this.push('wut?')
    }
  ))
  .pipe(fs.createWriteStream('/tmp/wut.txt'))

transformFunction

The transformFunction must have the following signature: function (chunk, encoding, callback) {}. A minimal implementation should call the callback function to indicate that the transformation is done, even if that transformation means discarding the chunk.

To queue a new chunk, call this.push(chunk)—this can be called as many times as required before the callback() if you have multiple pieces to send on.

Alternatively, you may use callback(err, chunk) as shorthand for emitting a single chunk or an error.

If you do not provide a transformFunction then you will get a simple pass-through stream.
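A sketch pulling those rules together, with illustrative stream names: pushing several chunks before the callback, the callback(err, chunk) shorthand, and the no-argument pass-through form:

const through2 = require('through2')

// Push several chunks per incoming chunk, then signal completion
const splitter = through2(function (chunk, enc, callback) {
  for (const word of chunk.toString().split(/\s+/)) {
    if (word) this.push(word + '\n')
  }
  callback()
})

// callback(err, chunk) shorthand emits a single chunk (or an error)
const shouty = through2((chunk, enc, callback) => {
  callback(null, chunk.toString().toUpperCase())
})

// No transformFunction at all: a simple pass-through stream
const passThrough = through2()

process.stdin.pipe(splitter).pipe(shouty).pipe(passThrough).pipe(process.stdout)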

flushFunction

The optional flushFunction, provided as the last argument (2nd or 3rd, depending on whether you've supplied options), is called just prior to the stream ending. It can be used to finish up any processing that may be in progress.

fs.createReadStream('/tmp/important.dat')
  .pipe(through2(
    (chunk, enc, cb) => cb(null, chunk), // transform is a noop
    function (cb) { // flush function
      this.push('tacking on an extra buffer to the end');
      cb();
    }
  ))
  .pipe(fs.createWriteStream('/tmp/wut.txt'));

through2.ctor([ options, ] transformFunction[, flushFunction ])

Instead of returning a stream.Transform instance, through2.ctor() returns a constructor for a custom Transform. This is useful when you want to use the same transform logic in multiple instances.

const FToC = through2.ctor({objectMode: true}, function (record, encoding, callback) {
  if (record.temp != null && record.unit == "F") {
    record.temp = ( ( record.temp - 32 ) * 5 ) / 9
    record.unit = "C"
  }
  this.push(record)
  callback()
})

// Create instances of FToC like so:
const converter = new FToC()
// Or:
const converter = FToC()
// Or specify/override options when you instantiate, if you prefer:
const converter = FToC({objectMode: true})

License

through2 is Copyright © Rod Vagg and additional contributors and licensed under the MIT license. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE file for more details.
