OpenRouter CLI
A powerful command-line interface for OpenRouter, enabling seamless integration with multiple language models and AI services. Currently featuring Claude Code integration with more capabilities coming soon.
✨ Features
- Multi-Model Support: Access any model available on OpenRouter through a unified interface
- Claude Code Integration: Use any OpenRouter model with Claude Code for enhanced development workflows
- Smart Model Routing: Automatically route requests to optimal models based on task type (reasoning, long context, web search, etc.)
- Dynamic Model Switching: Change models on-the-fly using simple commands
- Extensible Architecture: Built to support future OpenRouter features and integrations
🚀 Getting Started
1. Installation
Install the OpenRouter CLI:
```bash
npm install -g @openrouter/cli
```

For Claude Code integration, ensure you have Claude Code installed:

```bash
npm install -g @anthropic-ai/claude-code
```

2. Configuration
Create and configure your ~/.openrouter/config.json file. For more details, see the included config.example.json.
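If you prefer not to copy `config.example.json` by hand, one way to bootstrap a minimal config from the shell is sketched below. This is only an illustration; the literal `$OPENROUTER_API_KEY` value is an interpolation reference (explained in the Environment Variable Support section), so your real key stays out of the file.

```bash
# Create the config directory and write a minimal config file.
# The single-quoted heredoc keeps $OPENROUTER_API_KEY literal so the CLI
# resolves it from the environment at runtime.
mkdir -p ~/.openrouter
cat > ~/.openrouter/config.json <<'EOF'
{
  "OPENROUTER_API_KEY": "$OPENROUTER_API_KEY",
  "LOG": true
}
EOF
```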
The configuration file supports several key sections:
- `OPENROUTER_API_KEY`: Your OpenRouter API key for accessing models
- `LOG` (optional): Enable/disable logging. Default is `true`
- `LOG_LEVEL` (optional): Set logging verbosity: `"fatal"`, `"error"`, `"warn"`, `"info"`, `"debug"`, `"trace"`. Default is `"debug"`
- Logging Systems: The OpenRouter CLI uses two separate logging systems:
  - Server logs: HTTP requests and API events in `~/.openrouter/logs/`
  - Application logs: Routing decisions and operations in `~/.openrouter/router.log`
- `models`: Configure model routing for different task types
- `API_TIMEOUT_MS`: API call timeout in milliseconds
Environment Variable Support
The OpenRouter CLI supports environment variable interpolation for secure credential management. Reference environment variables in your config using $VAR_NAME or ${VAR_NAME} syntax:
```json
{
  "OPENROUTER_API_KEY": "$OPENROUTER_API_KEY"
}
```

Keep sensitive credentials secure by storing them as environment variables rather than in configuration files.
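With interpolation in place, the key only needs to exist in your shell environment. For example (the key value shown is a placeholder):

```bash
# Export the key in your current shell; add this line to your shell
# profile (e.g. ~/.bashrc or ~/.zshrc) to make it persistent.
export OPENROUTER_API_KEY="your-key-here"
```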
Example Configuration
```json
{
  "OPENROUTER_API_KEY": "$OPENROUTER_API_KEY",
  "LOG": true,
  "models": {
    "default": "deepseek/deepseek-chat",
    "background": "qwen/qwen3-30b-a3b",
    "think": "deepseek/deepseek-reasoner",
    "longContext": "google/gemini-2.5-pro-preview",
    "longContextThreshold": 60000,
    "webSearch": "google/gemini-2.5-flash:online"
  }
}
```

3. Usage
Claude Code Integration
Use any OpenRouter model with Claude Code:
```bash
openrouter code "your prompt here"
```

Service Management
```bash
openrouter proxy start     # Start the proxy service
openrouter proxy stop      # Stop the proxy service
openrouter proxy restart   # Restart the service (required after config changes)
openrouter proxy status    # Check service status
```

4. Model Routing
Configure intelligent model routing based on task requirements:
- `default`: Primary model for general tasks
- `background`: Cost-efficient model for background operations
- `think`: Advanced reasoning model for complex problem-solving
- `longContext`: Model optimized for large context windows (>60K tokens)
- `longContextThreshold`: Token threshold for automatic long-context routing (default: 60000)
- `webSearch`: Model with web search capabilities (append `:online` to the model name)
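The routing rules above can be sketched as a simple priority check. This is only an illustration of the described behavior; the interface and function names below are hypothetical, not the CLI's actual internals.

```typescript
// Hypothetical sketch of the routing decision described above.
// Names here are illustrative, not the CLI's real implementation.
interface ModelRoutingConfig {
  default: string;
  background?: string;
  think?: string;
  longContext?: string;
  longContextThreshold?: number; // in tokens; assumed to fall back to 60000
  webSearch?: string;
}

interface RequestTraits {
  promptTokens: number;
  needsReasoning?: boolean;
  needsWebSearch?: boolean;
  isBackground?: boolean;
}

function pickModel(cfg: ModelRoutingConfig, req: RequestTraits): string {
  // Web-search requests need a model with browsing (":online") enabled.
  if (req.needsWebSearch && cfg.webSearch) return cfg.webSearch;
  // Large prompts go to the long-context model once past the threshold.
  if (req.promptTokens > (cfg.longContextThreshold ?? 60000) && cfg.longContext) {
    return cfg.longContext;
  }
  if (req.needsReasoning && cfg.think) return cfg.think;
  if (req.isBackground && cfg.background) return cfg.background;
  return cfg.default;
}
```

Each rule falls through to the next when its model is not configured, so `default` always serves as the final fallback.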
Dynamic Model Selection
Switch models on-the-fly within Claude Code:
```
/model anthropic/claude-sonnet-4
```

Developing
Installation
Make sure you have Bun installed.

```bash
bun install
```

Running locally
All of the `openrouter` commands above can be run from the repository via `bun run dev`:
```bash
bun run dev proxy start
bun run dev proxy status
bun run dev code
bun run dev proxy stop
```

Testing your built package locally with npm
```bash
bun run build
bun pm pack

# local npm install
cd /path/to/somewhere
npm uninstall @openrouter/cli
npm install /path/to/cli/openrouter-cli-version-number.tgz
node_modules/.bin/openrouter proxy start
node_modules/.bin/openrouter code

# global npm install
npm uninstall -g @openrouter/cli
npm install -g /path/to/cli/openrouter-cli-version-number.tgz
openrouter proxy start
openrouter code
```

🚧 Coming Soon
The OpenRouter CLI is actively expanding to support:
- Direct model querying/chat
- The ability to intercept OpenAI and Anthropic API calls made on your system and route them through your OpenRouter account, so any model can be used
- API usage monitoring and analytics
- Account management
- And much more!
Stay tuned for updates as we build out the complete OpenRouter experience.