JSPM

  • Downloads 3110946
  • License BSD-3-Clause

Chain functions as transform streams.

Package Exports

  • stream-chain

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (stream-chain) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

stream-chain


stream-chain creates a chain of object mode transform streams out of regular functions, asynchronous functions, generator functions, and existing Transform and Duplex object mode streams, while properly handling backpressure. The resulting chain is represented as a Duplex stream, which can be combined with other streams the usual way. It eliminates boilerplate, helping you concentrate on functionality without losing performance.

Originally stream-chain was used internally with stream-fork and stream-json to create flexible data processing pipelines.

stream-chain is a lightweight micro-package with no dependencies. It is distributed under the New BSD license.

Intro

const Chain = require('stream-chain');

const fs = require('fs');
const zlib = require('zlib');
const {Transform} = require('stream');

// the chain will work on a stream of number objects
const chain = new Chain([
  // transforms a value
  x => x * x,
  // returns several values
  x => [x - 1, x, x + 1],
  // waits for an asynchronous operation
  async x => await getTotalFromDatabaseByKey(x),
  // returns multiple values with a generator
  function* (x) {
    for (let i = x; i > 0; --i) {
      yield i;
    }
    return 0;
  },
  // filters out even values
  x => x % 2 ? x : null,
  // uses an arbitrary transform stream
  new Transform({
    writableObjectMode: true,
    transform(x, _, callback) {
      // transform to text
      callback(null, x.toString());
    }
  }),
  // compress
  zlib.createGzip()
]);
// log errors
chain.on('error', error => console.log(error));
// use the chain, and save the result to a file
dataSource.pipe(chain).pipe(fs.createWriteStream('output.txt.gz'));

Making processing pipelines appears to be easy: just chain functions one after another, and we are done. Real-life pipelines filter objects out and/or produce more objects than they consume. On top of that, we have to deal with asynchronous operations while processing or producing data: networking, databases, files, user responses, and so on. Unequal numbers of values per stage and unequal throughput of stages introduce problems like backpressure, which require the algorithms implemented by streams.

While a lot of API improvements were made to make streams easy to use, in reality a lot of boilerplate is still required when creating a pipeline. stream-chain eliminates most of it.

Installation

npm i --save stream-chain

Documentation

Chain, which is returned by require('stream-chain'), is based on Duplex. It chains its constituent streams into a single pipeline, optionally binding error events.

Many details about this package can be discovered by looking at test files located in tests/ and in the source code (main.js).

Constructor: new Chain(fns[, options])

The constructor accepts the following arguments:

  • fns is an array of functions or stream instances.
    • If a value is a function, a Transform stream is created, which calls this function with two parameters: chunk (an object), and an optional encoding. See Node's documentation for more details on those parameters. The function will be called in the context of the created stream.
      • If it is a regular function, it can return:
        • Regular value:
          • Array of values to pass several or zero values to the next stream as they are.
            // produces no values:
            x => []
            // produces two values:
            x => [x, x + 1]
            // produces one array value:
            x => [[x, x + 1]]
          • Single value.
            • If it is undefined or null, no value will be passed.
            • Otherwise, the value will be passed to the next stream.
            // produces no values:
            x => null
            x => undefined
            // produces one value:
            x => x
        • Special value:
          • If it is an instance of Promise or "thenable" (an object with a method called then()), it will be waited for. Its result should be a regular value.
            // delays by 0.5s:
            x => new Promise(resolve => setTimeout(() => resolve(x), 500))
          • If it is a generator object or "nextable" (an object with a method called next()), it will be iterated according to the generator protocol. The results should be regular values.
            // produces multiple values:
            class Nextable {
              constructor(x) { this.x = x; this.i = -1; }
              next() { return this.i <= 1 ? {done: false, value: this.x + this.i++} : {done: true}; }
            }
            x => new Nextable(x)
        • Any thrown exception will be caught and passed to a callback function, effectively generating an error event.
          // fails
          x => { throw new Error('Bad!'); }
    • If it is an asynchronous function, it can return a regular value.
      • In essence, it is covered under "special values" as a function that returns a promise.
      // delays by 0.5s:
      async x => {
        await new Promise(resolve => setTimeout(() => resolve(), 500));
        return x;
      }
    • If it is a generator function, each yield or return should produce a regular value.
      • In essence, it is covered under "special values" as a function that returns a generator object.
      // produces multiple values:
      function* (x) {
        for (let i = -1; i <= 1; ++i) {
          if (i) yield x + i;
        }
        return x;
      }
    • If a value is a valid stream, it is included as is in the pipeline.
      • Transform.
      • Duplex.
      • The very first stream can be Readable.
        • In this case a Chain instance ignores all possible writes to the front, and ends when the first stream ends.
      • The very last stream can be Writable.
        • In this case a Chain instance does not produce any output, and finishes when the last stream finishes.
        • Because the 'data' event is not used in this case, the instance resumes itself automatically. Read about it in Node's documentation.
  • options is an optional object detailed in the Node's documentation.
    • If options is not specified, or falsy, it is assumed to be:
      {writableObjectMode: true, readableObjectMode: true}
    • Always make sure that writableObjectMode is the same as the corresponding object mode of the first stream, and readableObjectMode is the same as the corresponding object mode of the last stream.
      • Eventually both modes could be deduced automatically, but Node does not define a standard way to determine them, so currently it cannot be done reliably.
    • Additionally following custom properties are recognized:
      • skipEvents is an optional flag. If it is falsy (the default), 'error' events from all streams are forwarded to the created instance. If it is truthy, no event forwarding is made. A user can always do so externally or in a constructor of derived classes.
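The rules above for interpreting a regular return value can be condensed into a small sketch. interpretValue is a hypothetical helper for illustration only, not part of the package's API:

```javascript
// Hypothetical helper mirroring how a plain function's return value
// maps onto the values emitted to the next stream.
const interpretValue = result => {
  if (result === null || result === undefined) return []; // nothing is emitted
  if (Array.isArray(result)) return result;               // each element is emitted as is
  return [result];                                        // a single value is emitted
};

console.log(interpretValue(null));     // []
console.log(interpretValue(5));        // [ 5 ]
console.log(interpretValue([1, 2]));   // [ 1, 2 ]
console.log(interpretValue([[1, 2]])); // [ [ 1, 2 ] ]
```

Note that only undefined and null suppress output; falsy values like 0 or '' are passed through, which is why the intro example uses `x % 2 ? x : null` rather than `x % 2 && x` to filter.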

An instance can be used to attach handlers for stream events.

const chain = new Chain([x => x * x, x => [x - 1, x, x + 1]]);
chain.on('error', error => console.error(error));
dataSource.pipe(chain);

Properties

The following public properties are available:

  • streams is an array of streams created by the constructor. Its values are either Transform streams that use corresponding functions from a constructor parameter, or user-provided streams. All streams are piped sequentially starting from the beginning.
  • input is the beginning of the pipeline. Effectively it is the first item of streams.
  • output is the end of the pipeline. Effectively it is the last item of streams.

Generally, a Chain instance should be used to represent a chain:

const chain = new Chain([
  x => x * x,
  x => [x - 1, x, x + 1],
  new Transform({
    writableObjectMode: true,
    transform(chunk, _, callback) {
      callback(null, chunk.toString());
    }
  })
]);
dataSource
  .pipe(chain)
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));

But in some cases input and output provide a better control over how a data processing pipeline should be organized:

chain.output
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));
dataSource.pipe(chain.input);

Please select one style, and never mix them on the same object.

Static methods

The following static methods are available:

  • chain(fns[, options]) is a factory function, which has the same arguments as the constructor and returns a Chain instance.
    const {chain} = require('stream-chain');
    
    // simple
    dataSource
      .pipe(chain([x => x * x, x => [x - 1, x, x + 1]]));
    
    // all inclusive
    chain([
      dataSource,
      x => x * x,
      x => [x - 1, x, x + 1],
      zlib.createGzip(),
      fs.createWriteStream('output.txt.gz')
    ])

Release History

  • 2.0.0 Upgraded to use Duplex instead of EventEmitter as the base.
  • 1.0.3 Improved documentation.
  • 1.0.2 Better README.
  • 1.0.1 Fixed the README.
  • 1.0.0 The initial release.