Package Exports
- limit-concurrency-decorator
- limit-concurrency-decorator/index.js
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (limit-concurrency-decorator) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
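For reference, here is a minimal sketch of what such an "exports" declaration could look like in the upstream package.json (the target path is an assumption based on the subpaths detected above):

{
  "exports": {
    ".": "./index.js",
    "./index.js": "./index.js"
  }
}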
Readme
limit-concurrency-decorator
Decorator to limit concurrency of async functions
Similar to other concurrency-limiting libraries, but it can be used as a decorator.
Also similar to p-concurrency, but the limit can be enforced over multiple functions (see "The limit can be shared" example below).
Install
Installation of the npm package:
> npm install --save limit-concurrency-decorator

Usage
Simply apply the decorator to a method:
import { limitConcurrency } from "limit-concurrency-decorator";
class HttpClient {
  @limitConcurrency(2)
  get() {
    // ...
  }
}
const client = new HttpClient();
// these calls will run in parallel
client.get("http://example.net/");
client.get("http://example2.net/");
// this call will wait for one of the 2 previous to finish
client.get("http://example3.net/");Or a simple function as a wrapper:
import httpRequest from "http-request-plus";
const httpRequestLimited = limitConcurrency(2)(httpRequest);
// these calls will run in parallel
httpRequestLimited("http://example.net/");
httpRequestLimited("http://example2.net/");
// this call will wait for one of the 2 previous to finish
httpRequestLimited("http://example3.net/");Or even as a call limiter:
const limiter = limitConcurrency(2)(/* nothing */);
// these calls will run in parallel
limiter(asyncFn, param1, ...);
limiter.call(thisArg, asyncFn, param1, ...);
// this call will wait for one of the 2 previous to finish
limiter.call(thisArg, methodName, param1, ...);
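For instance, a minimal sketch with a hypothetical sleep function (sleep is illustrative, not part of the library):

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// at most 2 sleeps run at the same time
limiter(sleep, 1000);
limiter(sleep, 1000);
limiter(sleep, 1000); // queued until one of the previous calls settles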
The limit can be shared:

const myLimit = limitConcurrency(2);
class HttpClient {
  @myLimit
  post() {
    // ...
  }

  @myLimit
  put() {
    // ...
  }
}
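The same shared limit also works when wrapping standalone functions. A minimal sketch, assuming two hypothetical async functions:

const sharedLimit = limitConcurrency(2);

const fetchA = sharedLimit(async () => { /* ... */ });
const fetchB = sharedLimit(async () => { /* ... */ });

// at most 2 calls run across fetchA and fetchB at any time
fetchA();
fetchB();
fetchA(); // waits for one of the 2 previous to finish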
With FAIL_ON_QUEUE you can fail early instead of waiting:

import { FAIL_ON_QUEUE } from "limit-concurrency-decorator";
try {
  await httpRequestLimited(FAIL_ON_QUEUE, "http://example2.net");
} catch (error) {
  error.message; // 'no available place in queue'
}

Custom termination: the second argument to limitConcurrency defines when a call is considered finished. Here a slot is only freed once the response stream has been read entirely:
const httpRequestLimited = limitConcurrency(2, async (promise) => {
  const stream = await promise;

  await new Promise((resolve, reject) => {
    stream.on("end", resolve);
    stream.on("error", reject);
  });
})(httpRequest);
// these calls will run in parallel
httpRequestLimited("http://example.net/");
httpRequestLimited("http://example2.net/");
// this call will wait for one of the 2 previous responses to have been read entirely
httpRequestLimited("http://example3.net/");Contributions
Contributions are very welcome, whether to the documentation or to the code.
You may:
- report any issue you've encountered;
- fork and create a pull request.
License
ISC © Julien Fontanet
