batch-stream2

0.1.3

Transform a stream into batches, with a custom async operation before emitting data.

Package Exports

  • batch-stream2

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (batch-stream2) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
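
For reference, such an override (or an upstream fix) would add an "exports" field to package.json. A minimal sketch, assuming the package's entry point is index.js (the actual file name may differ):

{
  "name": "batch-stream2",
  "exports": {
    ".": "./index.js"
  }
}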

Readme

Batch Stream

A transform stream which batches input data into groups of a specified size. It emits arrays, so that you can deal with pieces of the input asynchronously.

Usage

var BatchStream = require('batch-stream2');

var batch = new BatchStream({
  size: 100,     // the size of each batch
  timeout: 5000  // emit buffered data after this many milliseconds,
                 // even if fewer than `size` writes have been buffered
});

stream
  .pipe(batch)
  .pipe(new ArrayStream()); // deals with array input from the pipe
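
Here ArrayStream stands in for any writable that accepts array chunks. A minimal sketch of such a consumer, using Node's built-in stream module in object mode (the names are illustrative):

var Writable = require('stream').Writable;

var consumer = new Writable({
  objectMode: true, // each chunk is an array of buffered writes
  write: function(items, encoding, callback) {
    console.log('received a batch of %d items', items.length);
    callback(); // signal readiness for the next batch
  }
});

stream.pipe(batch).pipe(consumer);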

This is also useful when you want to transform continuous writes into batches.

Suppose you have a `docs` stream. Instead of:

docs.on('data', function(doc) {
  db.insert(doc)
})

You can:

var batch = new BatchStream({
  transform: function(items, callback) {
    db.bulkInsert(items, callback)
  }
})

docs.pipe(batch)
.on('finish', function() {
  console.log('All docs inserted.')
})

Note that by passing an `options.transform` to the constructor, instead of listening for data events, the insertions are guaranteed to be sequential.
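
The guarantee comes from stream backpressure: the next batch is not handed over until the previous callback fires. Conceptually it behaves like the following hand-written Transform (a simplified sketch, not the library's actual implementation):

var Transform = require('stream').Transform;

var sequential = new Transform({
  objectMode: true,
  transform: function(items, encoding, callback) {
    // the next batch is not processed until this callback is invoked,
    // so inserts run one batch at a time
    db.bulkInsert(items, callback);
  }
});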

If insertions are allowed to happen in parallel:

var batch = new BatchStream()

docs.pipe(batch)
.on('data', function(items) {
  db.bulkInsert(items, ...)
})
.on('finish', function() {
  console.log('All docs queued for insertion.')
})
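
Note that 'finish' here only means every batch has been passed to db.bulkInsert; the inserts themselves may still be in flight. One way to also wait for them is to count outstanding calls (the bookkeeping below is illustrative, not part of the library):

var pending = 0;
var drained = false;

function done() {
  if (drained && pending === 0) console.log('All docs inserted.');
}

docs.pipe(batch)
.on('data', function(items) {
  pending++;
  db.bulkInsert(items, function(err) {
    if (err) throw err;
    pending--;
    done();
  });
})
.on('finish', function() {
  drained = true;
  done();
});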

License

The MIT license.