OpenAI JSON Utility
A TypeScript utility for calling OpenAI's Chat Completion API with guaranteed JSON output.
Features
- Guaranteed JSON: Uses `response_format: { type: "json_object" }`.
- Type Safe: Generic `<T>` interface for strong typing of LLM responses.
- Easy Configuration: Simple parameters for model, instructions, and prompt.
- Token Usage: Returns detailed token usage and finish reason.
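The typed result described above can be sketched as follows (a minimal sketch; `LlmResult`, `parseLlmJson`, and the field names are assumptions for illustration, not the package's actual types):

```typescript
// Assumed shape of a typed JSON-mode result (hypothetical; the real types may differ).
interface LlmResult<T> {
  data: T; // parsed JSON payload, typed via the generic parameter
  finishReason: string;
  usage: { promptTokens: number; completionTokens: number; totalTokens: number };
}

// Pure helper: parse the raw model output into the caller's type.
function parseLlmJson<T>(
  raw: string,
  finishReason: string,
  usage: LlmResult<T>["usage"]
): LlmResult<T> {
  return { data: JSON.parse(raw) as T, finishReason, usage };
}

// With json_object mode the model's content is guaranteed to be valid JSON,
// so parsing straight into the generic type is safe.
const result = parseLlmJson<{ sentiment: string; score: number }>(
  '{"sentiment":"positive","score":0.98}',
  "stop",
  { promptTokens: 42, completionTokens: 12, totalTokens: 54 }
);
console.log(result.data.sentiment); // "positive"
```

The generic parameter only asserts the type at compile time; `JSON.parse` does no runtime validation, so the instructions must constrain the model to the expected keys.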
Setup
- Clone or copy the files into your project.
- Install dependencies:

  ```bash
  npm install openai dotenv
  ```
- Create a `.env` file based on `.env.example` and add your `OPENAI_API_KEY`.
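The `.env` file needs only the API key (the value below is a placeholder, not a real key):

```
OPENAI_API_KEY=your-api-key-here
```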
Usage
```typescript
import "dotenv/config";
import { callOpenAI } from "./callOpenAI.js";

const result = await callOpenAI<{ sentiment: string; score: number }>({
  model: "gpt-5-nano",
  instructions: "You are a sentiment analyzer. Always respond in JSON with keys: sentiment, score.",
  prompt: "Analyze: 'I absolutely love this product!'",
});

console.log(result.data.sentiment); // "positive"
```

Important Note
When using the `json_object` response format, you must include the word "JSON" in your system instructions or user prompt; otherwise the OpenAI API rejects the request with an error.
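A lightweight pre-flight check can catch this requirement before the request is sent (a sketch; `mentionsJson` is a hypothetical helper, not part of this package):

```typescript
// Hypothetical guard: json_object mode requires the word "JSON" (case-insensitive)
// to appear somewhere in the messages sent to the API.
function mentionsJson(...texts: string[]): boolean {
  return texts.some((t) => /json/i.test(t));
}

const instructions = "You are a sentiment analyzer. Always respond in JSON.";
const prompt = "Analyze: 'Great product!'";

if (!mentionsJson(instructions, prompt)) {
  throw new Error('json_object mode requires "JSON" in the instructions or prompt.');
}
console.log(mentionsJson(instructions, prompt)); // true
```

Failing fast locally gives a clearer error than the API's rejection message.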