fluester – [ˈflʏstɐ]

Node.js bindings for OpenAI's Whisper. Hard-fork of whisper-node.
Features
- Outputs transcripts to JSON (also .txt, .srt, .vtt)
- Optimized for CPU (including Apple Silicon/ARM)
- Word-level timestamp precision
Installation
Requirements
- `make` and everything else listed as required to compile whisper.cpp
- Node.js >= 20

Setup
- Add the dependency to your project:

```shell
npm install @pr0gramm/fluester
```

- Download a whisper model of your choice:

```shell
npx --package @pr0gramm/fluester download-model
```

- Compile whisper.cpp if you don't want to provide your own version:

```shell
npx --package @pr0gramm/fluester compile-whisper
```

Usage
Translation
```ts
import { createWhisperClient } from "@pr0gramm/fluester";

const client = createWhisperClient({
	modelName: "base",
});

const transcript = await client.translate("example/sample.wav");
console.log(transcript); // output: [ {start,end,speech} ]
```

Output (JSON)
```jsonc
[
	{
		"start": "00:00:14.310", // time stamp begin
		"end": "00:00:16.480", // time stamp end
		"speech": "howdy" // transcription
	}
]
```

Language Detection
```ts
import { createWhisperClient } from "@pr0gramm/fluester";

const client = createWhisperClient({
	modelName: "base",
});

const result = await client.detectLanguage("example/sample.wav");
if (result) {
	console.log(`Detected: ${result.language} with probability ${result.probability}`);
} else {
	console.log("Did not detect anything :(");
}
```

Tricks
This library is designed to work well in dockerized environments.
The individual steps were deliberately made independent of each other, so they can be used in a multi-stage Docker build.
```dockerfile
FROM node:latest AS dependencies
WORKDIR /app

COPY package.json package-lock.json ./
RUN npm ci
RUN npx --package @pr0gramm/fluester compile-whisper
RUN npx --package @pr0gramm/fluester download-model tiny

FROM node:latest
WORKDIR /app
COPY --from=dependencies /app/node_modules /app/node_modules
COPY ./ ./
```

This includes the model in the image. If you want to keep your image small, you can also download the model in your entrypoint using `download-model`.
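Such an entrypoint could look like the following sketch. The model name (`tiny`), the model file path, and the application entry file are assumptions; adjust them to your setup.

```shell
#!/bin/sh
# Hypothetical entrypoint: download the model on container start instead of
# baking it into the image. Model path and entry file are assumptions.
set -e

# Only download if the model is not already present
# (e.g. cached on a mounted volume).
if [ ! -f "./models/ggml-tiny.bin" ]; then
	npx --package @pr0gramm/fluester download-model tiny
fi

exec node dist/index.js
```

This keeps the image itself small while paying the download cost once per fresh container (or never, if the model directory is a persistent volume).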
Made with
- A lot of love by @ariym at whisper-node
- OpenAI's Whisper (using the C++ port by ggerganov)
Roadmap
- Nothing ¯\_(ツ)_/¯