Anthropic-compatible multi-provider LLM proxy for Claude Code — use 42+ providers instead of expensive Claude subscriptions


    clco-proxy


    Use any LLM with Claude Code. One command, zero config.

    npx clco-proxy



    Quick Start

    npx clco-proxy
      1. ChatGPT / Codex login       ← Browser login, no API key needed
      2. Single provider              ← One API key, simplest setup
      3. Mixed presets (advanced)     ← Different provider per tier

    Pick one, enter credentials, done. Proxy starts, Claude Code launches.

    How It Works

      Claude Code                  clco-proxy                LLM Provider
      ─────────────────────────►   localhost:8080   ─────────►
      (Anthropic API format)       (model mapping)   (Provider API format)

    Uses Claude Code's official env vars (ANTHROPIC_BASE_URL, ANTHROPIC_DEFAULT_*_MODEL) to route requests. Streaming, tool use, and reasoning all work transparently.
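    Concretely, the routing amounts to environment configuration. A minimal sketch of what the proxy sets up before launching Claude Code (the variable names are the official ones named above; the model values are illustrative placeholders, not real provider IDs):

    ```shell
    # Sketch of the environment clco-proxy configures.
    # Variable names are Claude Code's official env vars (per this README);
    # the values below are placeholders, not real model IDs.
    export ANTHROPIC_BASE_URL="http://localhost:8080"          # point Claude Code at the proxy
    export ANTHROPIC_DEFAULT_OPUS_MODEL="opus-tier-model"      # hypothetical tier mappings
    export ANTHROPIC_DEFAULT_SONNET_MODEL="sonnet-tier-model"
    export ANTHROPIC_DEFAULT_HAIKU_MODEL="haiku-tier-model"
    echo "Claude Code will route via $ANTHROPIC_BASE_URL"
    ```

    Because Claude Code reads these variables natively, no patching of the CLI is involved; restoring the original behavior is just a matter of unsetting them (which `clco-proxy restore` handles).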

    Presets

    Preset        Opus              Sonnet            Haiku
    Best Value    DeepSeek V4-Pro   MiniMax M2.7      DeepSeek V4-Flash
    Performance   GPT-5.5           DeepSeek V4-Pro   DeepSeek V4-Flash
    Budget        GLM-5.1           GLM-5-Turbo       Gemini 2.5 Flash
    Local         Qwen3-235B        Qwen3-30B         Qwen3-8B

    Providers (42+)

    Use your subscription: ChatGPT Plus / Codex, GitHub Copilot, Gemini Ultra

    API keys: DeepSeek, MiniMax, GLM / Z.AI, Groq, Gemini, OpenRouter, OpenAI, Mistral, xAI (Grok), Fireworks, Together, Cerebras, Cohere, Perplexity, OpenCode Zen, OpenCode Go, KiloCode, Kimi, Qwen, SiliconFlow, Hugging Face, NVIDIA, Arcee, AI21, Volcengine, StepFun, Xiaomi, + more

    Local (free): Ollama, LM Studio, Cloudflare Workers, AWS Bedrock, Azure AI Foundry

    Enterprise gateways: Vercel AI Gateway, OpenRouter (200+ models), any OpenAI-compatible endpoint

    Commands

    npx clco-proxy                  # Setup + start
    clco-proxy start                # Start daemon
    clco-proxy stop                 # Stop daemon
    clco-proxy status               # Check status
    clco-proxy restore              # Restore original Claude settings
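    Once the daemon is running, you can sanity-check it directly, since the proxy accepts Anthropic-format requests on localhost:8080. A hypothetical smoke test (the `/v1/messages` path, headers, and payload shape follow the standard Anthropic Messages API; the model name is a placeholder for whatever your preset maps):

    ```shell
    # Hypothetical smoke test against a running `clco-proxy start` daemon.
    # Payload shape follows the public Anthropic Messages API; clco-proxy
    # translates it to your configured provider's format.
    curl -s http://localhost:8080/v1/messages \
      -H 'content-type: application/json' \
      -H 'anthropic-version: 2023-06-01' \
      -H 'x-api-key: placeholder' \
      -d '{"model":"claude-sonnet","max_tokens":32,"messages":[{"role":"user","content":"ping"}]}' \
      || echo "proxy not running; start it with: clco-proxy start"
    ```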

    Disclaimer

    This tool sets environment variables (ANTHROPIC_BASE_URL, ANTHROPIC_DEFAULT_*_MODEL) that Claude Code reads natively. It does not modify, decompile, or reverse-engineer Claude Code.

    Users are responsible for complying with each provider's terms of service. This project is not affiliated with Anthropic, OpenAI, or any other provider. Use at your own risk.

    License

    MIT

