@flowrag/provider-openai 0.3.4


@flowrag/provider-openai

OpenAI provider for FlowRAG — embeddings, entity extraction, and reranking. Works with any OpenAI-compatible endpoint.

Installation

npm install @flowrag/provider-openai

Usage

Embedder

import { OpenAIEmbedder } from '@flowrag/provider-openai';

const embedder = new OpenAIEmbedder({
  model: 'text-embedding-3-small', // default
  dimensions: 1536,                // default
});
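The vectors an embedder returns are typically compared with cosine similarity. A minimal, self-contained sketch, assuming embeddings arrive as plain number arrays (independent of this package's actual return type):

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Parallel vectors score 1, orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```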

Extractor

import { OpenAIExtractor } from '@flowrag/provider-openai';

const extractor = new OpenAIExtractor({
  model: 'gpt-5-mini', // default
});
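Extraction across many chunks usually yields duplicate entities that need merging before further processing. A generic sketch of that step, using a hypothetical { name, type } entity shape (not necessarily what OpenAIExtractor returns):

```typescript
// Hypothetical entity shape for illustration; the extractor's real output may differ.
interface Entity {
  name: string;
  type: string;
}

// Merge duplicates case-insensitively by name, keeping the first occurrence.
function dedupeEntities(entities: Entity[]): Entity[] {
  const seen = new Map<string, Entity>();
  for (const e of entities) {
    const key = e.name.toLowerCase();
    if (!seen.has(key)) seen.set(key, e);
  }
  return [...seen.values()];
}
```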

Reranker

import { OpenAIReranker } from '@flowrag/provider-openai';

const reranker = new OpenAIReranker();
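Conceptually, a reranker scores each candidate document against the query and keeps the highest-scoring ones. The selection step can be sketched generically (the { text, score } shape here is illustrative, not this package's API):

```typescript
// Illustrative scored-document shape; the reranker's real return type may differ.
interface ScoredDoc {
  text: string;
  score: number;
}

// Keep the k highest-scoring documents, best first.
function topK(docs: ScoredDoc[], k: number): ScoredDoc[] {
  return [...docs].sort((a, b) => b.score - a.score).slice(0, k);
}
```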

OpenAI-Compatible Endpoints

All provider classes accept a baseURL option, so they work with any OpenAI-compatible server (Ollama, Azure OpenAI, vLLM, Together, Groq, etc.):

const embedder = new OpenAIEmbedder({
  baseURL: 'http://localhost:11434/v1', // Ollama
  model: 'nomic-embed-text',
  dimensions: 768,
});

Model Constants

import { OpenAIEmbeddingModels, OpenAILLMModels } from '@flowrag/provider-openai';

OpenAIEmbeddingModels.TEXT_EMBEDDING_3_SMALL; // 'text-embedding-3-small'
OpenAILLMModels.GPT_5_MINI;                   // 'gpt-5-mini'

Environment Variables

OPENAI_API_KEY=your-key
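When the key comes from the environment, it helps to fail fast at startup rather than at the first API call. A small guard sketch (requireEnv is a local helper written here for illustration, not part of this package):

```typescript
// Return the named environment variable, or throw if it is missing or empty.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// e.g. const apiKey = requireEnv('OPENAI_API_KEY');
```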

License

MIT