@rajparekh/roundabout

2.0.1
License: MIT

Local OpenAI-compatible LLM proxy

Package Exports

  • @rajparekh/roundabout

Readme

Roundabout

Roundabout is a local LLM gateway you run on your own machine.

It starts a daemon on localhost, exposes OpenAI-compatible and Anthropic-compatible endpoints, and routes stable model names to upstream providers like OpenAI, Anthropic, and OpenRouter. The goal is to let local tools talk to one endpoint while you keep provider keys, model config, and fallback policy in one place.

What It Does

  • runs a local daemon with a single auth/token model for your apps
  • exposes OpenAI-style chat and embeddings endpoints
  • exposes Anthropic-style messages, completions, and token counting endpoints
  • resolves local model names to provider/model targets, with ordered provider fallback defined in model config
  • works well as a bridge for local tools such as Claude Code or custom scripts
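
The routing idea above can be sketched as a small resolver: a local model name maps to an ordered list of provider targets, and the first enabled provider wins. This is an illustration of the concept only; the type and function names (ProviderTarget, ModelRoute, resolveTarget) are hypothetical, not Roundabout's actual internals.

```typescript
// Sketch of ordered provider fallback. Illustrative only, not the
// package's real routing code.
interface ProviderTarget {
  provider: string; // provider name from config, e.g. "openai"
  model: string;    // upstream model id for that provider
}

interface ModelRoute {
  providers: ProviderTarget[]; // tried in the order listed
}

// Return the first target whose provider is enabled, or null if none are.
function resolveTarget(
  route: ModelRoute,
  enabledProviders: Set<string>
): ProviderTarget | null {
  for (const target of route.providers) {
    if (enabledProviders.has(target.provider)) return target;
  }
  return null;
}

// Example route: prefer anthropic, fall back to openrouter.
const smart: ModelRoute = {
  providers: [
    { provider: "anthropic", model: "claude-sonnet" },
    { provider: "openrouter", model: "anthropic/claude-sonnet" },
  ],
};

// With anthropic unavailable, resolution falls through to openrouter.
const target = resolveTarget(smart, new Set(["openrouter"]));
```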

Install

npm install -g @rajparekh/roundabout

This installs the roundabout CLI command.

Or run it without a global install:

npx @rajparekh/roundabout start

Quick Start

  1. Run the setup wizard:
roundabout setup
  2. Start the daemon:
roundabout start
  3. Generate a token for a local client:
roundabout token create my-app
  4. Point your client at the local daemon:

OpenAI-style:

curl http://127.0.0.1:4317/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer rb_your_token" \
  -d '{
    "model": "smart",
    "messages": [
      {"role": "user", "content": "Say hello"}
    ]
  }'

Anthropic-style:

curl http://127.0.0.1:4317/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: rb_your_token" \
  -d '{
    "model": "smart",
    "max_tokens": 128,
    "messages": [
      {"role": "user", "content": "Say hello"}
    ]
  }'
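
The two curl calls above differ only in the path, the auth header, and the payload shape. A small client helper makes the difference explicit; this is a hypothetical sketch (the BASE_URL, function names, and the rb_ token are assumptions), not code shipped with the package.

```typescript
// Hypothetical helpers for building requests to a local Roundabout daemon.
// BASE_URL assumes the default bind address from the config section below.
const BASE_URL = "http://127.0.0.1:4317";

type Message = { role: "user" | "assistant" | "system"; content: string };

// OpenAI-style: Bearer auth, POST /v1/chat/completions.
function openaiRequest(token: string, model: string, messages: Message[]) {
  return {
    url: `${BASE_URL}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${token}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Anthropic-style: x-api-key auth, POST /v1/messages, max_tokens required.
function anthropicRequest(
  token: string,
  model: string,
  messages: Message[],
  maxTokens = 128
) {
  return {
    url: `${BASE_URL}/v1/messages`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "x-api-key": token,
      },
      body: JSON.stringify({ model, max_tokens: maxTokens, messages }),
    },
  };
}
```

With the daemon running, pass the result straight to fetch: `const { url, init } = openaiRequest("rb_your_token", "smart", [{ role: "user", content: "Say hello" }]); const res = await fetch(url, init);`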

Commands

roundabout setup
roundabout start
roundabout start --debug
roundabout token create my-app
roundabout token rotate my-app
roundabout token list
roundabout status

For local development:

npm run dev -- setup
npm run dev -- start

Config

By default, Roundabout stores its config in ~/.roundabout/config.json:

{
  "daemon": {
    "host": "127.0.0.1",
    "port": 4317
  },
  "providers": {
    "openai": {
      "enabled": true,
      "apiKey": "sk-openai"
    },
    "my-gateway": {
      "enabled": true,
      "apiType": "anthropic",
      "apiKey": "sk-gateway",
      "baseUrl": "https://gateway.example.com/v1"
    }
  },
  "models": {
    "smart": {
      "providers": [
        { "provider": "my-gateway", "model": "claude-sonnet-custom" }
      ],
      "capabilities": ["chat"]
    }
  },
  "tokens": {
    "my-app": {
      "token": "rb_example",
      "createdAt": "2025-01-01T00:00:00.000Z",
      "updatedAt": "2025-01-01T00:00:00.000Z"
    }
  }
}

Key config sections:

  • daemon: local bind host and port
  • providers: named upstream providers, each with an apiType of openai or anthropic, plus API key and optional base URL
  • models: local model names and ordered provider routes
  • tokens: local client tokens accepted by the daemon

Built-in provider names like openai, anthropic, and openrouter work without an explicit apiType. Custom provider names work the same way, but must include apiType.
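
That rule can be sketched as a validation step. Everything here is illustrative: the function name, the built-in mapping, and in particular the assumption that openrouter speaks the OpenAI wire format are mine, not taken from the package's source.

```typescript
// Illustrative check: built-in providers imply an apiType, custom ones
// must declare it. Not the package's actual validation code.
const BUILT_IN: Record<string, "openai" | "anthropic"> = {
  openai: "openai",
  anthropic: "anthropic",
  openrouter: "openai", // assumption: OpenAI-compatible wire format
};

interface ProviderConfig {
  enabled: boolean;
  apiKey: string;
  apiType?: "openai" | "anthropic";
  baseUrl?: string;
}

function effectiveApiType(
  name: string,
  cfg: ProviderConfig
): "openai" | "anthropic" {
  if (cfg.apiType) return cfg.apiType; // explicit apiType always wins
  const builtin = BUILT_IN[name];
  if (!builtin) {
    throw new Error(`provider "${name}" needs an explicit apiType`);
  }
  return builtin;
}
```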

API Surfaces

OpenAI-compatible:

  • POST /v1/chat/completions
  • POST /v1/embeddings
  • GET /v1/models

Anthropic-compatible:

  • POST /v1/messages
  • POST /v1/complete
  • POST /v1/messages/count_tokens

Auth:

  • OpenAI-style endpoints accept Authorization: Bearer <project-token>
  • Anthropic-style endpoints accept either x-api-key: <project-token> or Bearer auth
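
The dual auth scheme reduces to one extraction step on the server side. The helper below is a hypothetical sketch of that step, not the daemon's real code; it assumes header names arrive lowercased, as Node's HTTP layer normalizes them.

```typescript
// Accept either Anthropic-style x-api-key or OpenAI-style Bearer auth.
// Illustrative only; assumes lowercased header names.
function extractToken(headers: Record<string, string>): string | null {
  const apiKey = headers["x-api-key"];
  if (apiKey) return apiKey;
  const auth = headers["authorization"];
  if (auth && auth.startsWith("Bearer ")) {
    return auth.slice("Bearer ".length);
  }
  return null;
}
```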

Packaging

The npm package is published as @rajparekh/roundabout and installs the roundabout binary.

To build or package locally:

npm run build
npm pack

License

MIT. See LICENSE.