JSPM

  • Downloads 955
  • License MIT

Fetch DB entries in batches to improve performance while respecting IPC size constraints

Package Exports

  • dexie-batch

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If a package subpath is missing, it is recommended to file an issue with the original package (dexie-batch) asking it to add an "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
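For reference, an "exports" field (whether added upstream or supplied via a JSPM override) is a package.json map from subpaths to files. A minimal sketch — the entry filename `./dist/dexie-batch.js` is a hypothetical placeholder, not the package's actual layout:

```json
{
  "exports": {
    ".": "./dist/dexie-batch.js"
  }
}
```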

Readme

dexie-batch

Fetch IndexedDB entries in batches to improve performance while avoiding errors like Maximum IPC message size exceeded.

Installation

npm i dexie-batch

Usage

import DexieBatch from 'dexie-batch'
import table from './my-awesome-dexie-table'

const collection = table.toCollection()

const batchDriverPromise = table.count()
  .then(n => new DexieBatch({ batchSize: 25, limit: n }))

batchDriverPromise
  .then(batchDriver => batchDriver.each(collection, (entry, idx) => {
    // Process each item individually
  }))
  .then(n => console.log(`Finished batch operation using ${n} batches`))

batchDriverPromise
  .then(batchDriver => batchDriver.eachBatch(collection, (batch, batchIdx) => {
    // Process each batch (array of entries) individually
  }))
  .then(n => console.log(`Finished batch operation using ${n} batches`))

// Resolves to true here, since the limit option was given
batchDriverPromise
  .then(batchDriver => batchDriver.isParallel())

The returned Dexie.Promise resolves when all batch operations have finished. If the user callback returns a Promise it is waited upon.
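To illustrate these semantics, here is a hedged sketch of a serial batch driver — not dexie-batch's actual implementation — that awaits the user callback whenever it returns a Promise and resolves with the number of non-empty batches. The in-memory `records` array and `fetchBatch` helper stand in for a real Dexie collection:

```javascript
// Sketch only: serial driver that waits on the user callback's Promise
// before requesting the next batch, and stops at the first empty batch.
async function eachBatchSerial(fetchBatch, callback) {
  let batchIdx = 0
  for (;;) {
    const batch = await fetchBatch(batchIdx)
    if (batch.length === 0) return batchIdx // resolves with the batch count
    await callback(batch, batchIdx)         // a returned Promise is waited upon
    batchIdx++
  }
}

// Stand-in for a Dexie collection: 60 records fetched 25 at a time
const records = Array.from({ length: 60 }, (_, i) => i)
const fetchBatch = idx =>
  Promise.resolve(records.slice(idx * 25, (idx + 1) * 25))

eachBatchSerial(fetchBatch, async (batch, batchIdx) => {
  // async work per batch completes before the next batch is requested
}).then(n => console.log(`Finished batch operation using ${n} batches`))
```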

The batchSize option is mandatory since a sensible value depends strongly on the individual record size.

Batches are requested in parallel iff the limit option is present; otherwise the driver would not know when to stop issuing requests. Without a limit, batches are requested serially until a request returns an empty result.
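The parallel strategy can be sketched as follows — again an illustrative simplification, not dexie-batch's source: with a known limit, the batch count is fixed up front as `ceil(limit / batchSize)`, so every request can be issued immediately rather than waiting for an empty result:

```javascript
// Sketch only: with a known limit, all batch requests fire at once.
function eachBatchParallel(fetchBatch, callback, { batchSize, limit }) {
  const batchCount = Math.ceil(limit / batchSize)
  const ops = Array.from({ length: batchCount }, (_, i) =>
    Promise.resolve(fetchBatch(i)).then(batch => callback(batch, i))
  )
  // Resolves with the batch count once every batch has been processed
  return Promise.all(ops).then(() => batchCount)
}

// With limit = 60 and batchSize = 25, three requests are issued immediately
const records = Array.from({ length: 60 }, (_, i) => i)
const fetchBatch = idx =>
  Promise.resolve(records.slice(idx * 25, (idx + 1) * 25))

eachBatchParallel(fetchBatch, () => {}, { batchSize: 25, limit: 60 })
  .then(n => console.log(`Finished batch operation using ${n} batches`))
```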