# @henrygd/queue
Tiny async queue with concurrency control. Like `p-limit` or `fastq`, but smaller and faster. See comparisons and benchmarks below.
## Usage
Create a queue with the `newQueue` function, then add async functions (or promise-returning functions) to the queue with the `add` method. Use `queue.done()` to wait for the queue to be empty.
```js
import { newQueue } from '@henrygd/queue'

// create a new queue with a concurrency of 2
const queue = newQueue(2)

const pokemon = ['ditto', 'hitmonlee', 'pidgeot', 'poliwhirl', 'golem', 'charizard']

for (const name of pokemon) {
  queue.add(async () => {
    const res = await fetch(`https://pokeapi.co/api/v2/pokemon/${name}`)
    const json = await res.json()
    console.log(`${json.name}: ${json.height * 10}cm | ${json.weight / 10}kg`)
  })
}

console.log('running')
await queue.done()
console.log('done')
```
The return value of `queue.add` is the same as the return value of the supplied function.

```js
const res = await queue.add(() => fetch('https://pokeapi.co/api/v2/pokemon'))
console.log(res.ok, res.status, res.headers)
```
> [!TIP]
> If you need support for Node's AsyncLocalStorage, import `@henrygd/queue/async-storage` instead.
## Queue interface
```ts
/** Add an async function / promise wrapper to the queue */
queue.add<T>(promiseFunction: () => Promise<T>): Promise<T>

/** Returns a promise that resolves when the queue is empty */
queue.done(): Promise<void>

/** Empties the queue (active promises are not cancelled) */
queue.clear(): void

/** Returns the number of promises currently running */
queue.active(): number

/** Returns the total number of promises in the queue */
queue.size(): number
```
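To make the interface concrete, here is a minimal, self-contained sketch of a concurrency-limited queue with the same surface. This is not the library's actual implementation — only an illustration of how `add`, `done`, `clear`, `active`, and `size` can fit together.

```js
// Minimal sketch of a concurrency-limited queue exposing the same
// surface as the interface above. Not the library's implementation --
// just an illustration of the mechanics.
function newQueue(concurrency) {
  const waiting = []       // queued { fn, resolve, reject } entries
  let running = 0          // number of promises currently in flight
  let doneResolvers = []   // pending resolvers from done() calls

  const next = () => {
    // resolve done() promises once everything has settled
    if (running === 0 && waiting.length === 0) {
      for (const resolve of doneResolvers) resolve()
      doneResolvers = []
      return
    }
    // start queued functions until the concurrency cap is hit
    while (running < concurrency && waiting.length > 0) {
      const { fn, resolve, reject } = waiting.shift()
      running++
      fn().then(resolve, reject).finally(() => {
        running--
        next()
      })
    }
  }

  return {
    add: (fn) =>
      new Promise((resolve, reject) => {
        waiting.push({ fn, resolve, reject })
        next()
      }),
    done: () =>
      new Promise((resolve) => {
        if (running === 0 && waiting.length === 0) return resolve()
        doneResolvers.push(resolve)
      }),
    clear: () => void (waiting.length = 0),
    active: () => running,
    size: () => running + waiting.length,
  }
}
```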
## Comparisons and benchmarks
| Library | Version | Bundle size (B) | Weekly downloads |
|---|---|---|---|
| @henrygd/queue | 1.0.3 | 330 | probably only me :) |
| p-limit | 5.0.0 | 1,763 | 118,953,973 |
| async.queue | 3.2.5 | 6,873 | 53,645,627 |
| fastq | 1.17.1 | 3,050 | 39,257,355 |
| queue | 7.0.0 | 2,840 | 4,259,101 |
| promise-queue | 2.2.5 | 2,200 | 1,092,431 |
### Browser benchmark
Each operation adds 1,000 async functions to the queue and waits for them to resolve. The function just increments a counter.[^benchmark]

This test was run in Chromium. Chrome, Edge, and Opera perform the same. Firefox and Safari are slower across the board, with `@henrygd/queue` edging out `promise-queue`, then a gap back to `fastq` and `async.queue`.

You can run or tweak it yourself here: https://jsbm.dev/uwTSZlrRs9vhz
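The benchmarked operation looks roughly like this — a sketch based on the description above, written against any object exposing the `add`/`done` interface; the actual harness behind the published numbers may differ:

```js
// Rough sketch of one benchmark operation: add N trivial async
// functions to a queue, then wait for the queue to drain. The counter
// verifies that every queued function actually ran.
async function benchOperation(queue, n = 1000) {
  let counter = 0
  for (let i = 0; i < n; i++) {
    queue.add(async () => {
      counter++
    })
  }
  await queue.done()
  return counter
}
```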
### Node.js benchmarks
Same test as the browser benchmark, but with 5,000 async functions instead of 1,000.

I have no idea what's going on with `p-limit` in Node. The same test with Bun puts it just behind `queue`.
Benchmarks were run on two machines: a Ryzen 7 6800H with 32GB RAM, and a Ryzen 5 4500U with 8GB RAM.
### Cloudflare Workers benchmark
Same test as above, with 5,000 functions per request, using oha to make 1,000 requests to each worker.
This was run using Wrangler on a Ryzen 7 6800H laptop. Wrangler uses the same workerd runtime as workers deployed to Cloudflare, so the relative difference should be accurate. Here's the repository for this benchmark.
| Library | Requests/sec | Total (sec) | Average (sec) | Slowest (sec) |
|---|---|---|---|---|
| @henrygd/queue | 622.7809 | 1.6057 | 0.0786 | 0.1155 |
| promise-queue | 324.8053 | 3.0788 | 0.1512 | 0.2174 |
| async.queue | 203.9315 | 4.9036 | 0.2408 | 0.3450 |
| fastq | 184.0524 | 5.4332 | 0.2670 | 0.3546 |
| queue | 86.4867 | 11.5625 | 0.5672 | 0.7636 |
| p-limit | 67.5275 | 14.8088 | 0.7274 | 1.0657 |
## Real world examples
- henrygd/optimize - uses `@henrygd/queue` to parallelize image optimization jobs.
## License
[^benchmark]: In reality, you may not be running so many jobs at once, and your jobs will take much longer to resolve. So performance will depend more on the jobs themselves.