@motioneffector/llm
A TypeScript client for LLM APIs with OpenRouter support, streaming responses, conversation management, and automatic retries.
Features
- OpenRouter Integration - Access 200+ models through a unified API
- Streaming Support - Real-time response streaming with async iterators
- Conversation Management - Stateful conversations with automatic history tracking
- Automatic Retries - Smart retry logic with exponential backoff
- Token Estimation - Estimate prompt tokens before sending requests
- Type Safety - Full TypeScript definitions with no `any` types
- Model Information - Built-in pricing and context length data
- Abort Support - Cancel requests using AbortController signals
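The retry feature above follows the standard exponential-backoff pattern: double the wait between attempts until a maximum try count is reached. A minimal generic sketch of that pattern (illustrative only; the function name and defaults here are assumptions, not the package's internals):

```typescript
// Retry an async operation, waiting baseDelayMs * 2^attempt between tries.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      if (attempt < maxAttempts - 1) {
        // Delays double each round: 250ms, 500ms, 1000ms, ...
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt))
      }
    }
  }
  throw lastError
}
```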
Quick Start
```typescript
import { createLLMClient } from '@motioneffector/llm'

// Create a client
const client = createLLMClient({
  apiKey: process.env.OPENROUTER_KEY,
  model: 'anthropic/claude-sonnet-4'
})

// Send a chat completion request
const response = await client.chat([
  { role: 'user', content: 'Explain quantum computing in simple terms' }
])

console.log(response.content)
console.log(`Used ${response.usage.totalTokens} tokens in ${response.latency}ms`)
```
Testing & Validation
- Comprehensive test suite - 265 unit tests covering core functionality
- Fuzz tested - Randomized input testing to catch edge cases
- Strict TypeScript - Full type coverage with no `any` types
- Zero dependencies - No supply chain risk
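The token-estimation feature listed under Features can be approximated with the common rough heuristic of about four characters per token for English text. A minimal sketch of that heuristic (illustrative only; this is not the package's actual estimator):

```typescript
// Rough token estimate: ~4 characters per token is a common heuristic
// for English text. Real tokenizers vary by model and vocabulary.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4)
}
```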
License
MIT © motioneffector