NodeJS chunking streams

Package Exports

  • chunking-streams

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (chunking-streams) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
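
For reference, a package opts into explicit subpath resolution by declaring an exports field in its package.json. A minimal sketch might look like the following (the entry file name here is an assumption for illustration, not taken from the package):

{
    "name": "chunking-streams",
    "exports": {
        ".": "./index.js"
    }
}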

Readme

node-chunking-streams

A set of Node.js streams to process data in chunks

  1. LineCounter
  2. SeparatorChunker
  3. SizeChunker
  4. GzipChunker
  5. S3MultipartUploader

LineCounter

A simple transform stream which counts lines (\n is the separator) and emits data chunks containing exactly the specified number of them.

Configuration

new LineCounter({
    numLines: 1,        // number of lines in a single output chunk. 1 is the default
    flushTail: false    // whether to flush the remaining buffer on stream end. false is the default
});
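
A minimal usage sketch, assuming the module exposes LineCounter as a named export, that the grouped lines arrive as Buffer chunks, and that ./input.log is a placeholder file name:

var fs = require('fs'),
    LineCounter = require('chunking-streams').LineCounter;

var counter = new LineCounter({
    numLines: 2,
    flushTail: true
});

fs.createReadStream('./input.log')
    .pipe(counter)
    .on('data', function(chunk) {
        // each emitted chunk is assumed to contain exactly 2 input lines
        console.log(chunk.toString());
    });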

SeparatorChunker

Splits incoming data into chunks based on a specified separator. A data chunk is emitted after each separator is found. By default the separator sequence is \n, so it is equivalent to LineCounter with numLines: 1.

Configuration

new SeparatorChunker({
    separator: '\n', // separator sequence. '\n' is the default
    flushTail: false // whether to flush the remaining buffer on stream end. false is the default
});
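
A sketch of splitting on a custom separator, under the same assumptions (named export, Buffer chunks, placeholder file name):

var fs = require('fs'),
    SeparatorChunker = require('chunking-streams').SeparatorChunker;

var chunker = new SeparatorChunker({
    separator: ';',
    flushTail: true
});

fs.createReadStream('./records.txt')
    .pipe(chunker)
    .on('data', function(chunk) {
        // one chunk per ';'-terminated record
        console.log('record:', chunk.toString());
    });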

SizeChunker

Splits a stream into chunks of at least the specified size in bytes (possibly more). It is an object mode stream! Each data chunk is an object with the following fields:

  • id: number of the chunk (starting from 1)
  • data: a Buffer with the chunk data

SizeChunker has 2 additional events:

  • chunkStart: emitted on each chunk start.
  • chunkEnd: emitted on each chunk finish.

Both event handlers must accept two arguments:

  • id: number of the chunk
  • done: a callback function that must be called when processing is completed

Configuration

new SizeChunker({
    chunkSize: 1024 // must be a number greater than zero
});

Example

var fs = require('fs'),
    SizeChunker = require('chunking-streams').SizeChunker;

var input = fs.createReadStream('./input'),
    chunker = new SizeChunker({
        chunkSize: 1024
    }),
    output;

chunker.on('chunkStart', function(id, done) {
    // open a new output file for each chunk
    output = fs.createWriteStream('./output-' + id);
    done();
});

chunker.on('chunkEnd', function(id, done) {
    // close the current output file once the chunk is finished
    output.end();
    done();
});

chunker.on('data', function(chunk) {
    // chunk.data is a Buffer belonging to the current chunk
    output.write(chunk.data);
});

input.pipe(chunker);