    Copilot API

    ⚠️ EDUCATIONAL PURPOSE ONLY ⚠️ This project is a reverse-engineered implementation of the GitHub Copilot API created for educational purposes only. It is not officially supported by GitHub and should not be used in production environments.

    Project Overview

    A wrapper around the GitHub Copilot API that exposes an OpenAI-compatible interface, making it usable with other tools.

    Demo

    https://github.com/user-attachments/assets/7654b383-669d-4eb9-b23c-06d7aefee8c5

    Prerequisites

    • Bun (>= 1.2.x)
    • GitHub account with Copilot Individual subscription

    Installation

    To install dependencies, run:

    bun install

    Using with npx

    You can run the project directly using npx:

    npx copilot-api@latest

    With options:

    npx copilot-api --port 8080 --emulate-streaming

    Running from Source

    The project can be run from source in two ways:

    Development Mode

    bun run dev

    Starts the server with hot reloading enabled, so it restarts automatically whenever code changes are detected. This is ideal for development.

    Production Mode

    bun run start

    Runs the server with hot reloading disabled. Use this for deployment or other production environments.

    Command Line Options

    The server accepts several command line options:

    Option        | Description            | Default
    --------------|------------------------|--------
    --help, -h    | Show help message      | false
    --port, -p    | Port to listen on      | 4141
    --verbose, -v | Enable verbose logging | false
    --log-file    | File path for logging  | -

    Example with options:

    bun run start --port 8080 --emulate-streaming

    In all cases, the server will start and listen for API requests on the specified port.
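    Once the server is up, a quick way to confirm it is listening is to query its model-listing endpoint. Here is a minimal Python sketch; the /models path and port 4141 are assumptions based on the defaults above, and the request is only built, not sent:

    ```python
    import urllib.request

    # Default port matches the --port default above; adjust if you changed it.
    BASE_URL = "http://localhost:4141"

    # Build (but do not send) a GET request for the model list.
    req = urllib.request.Request(f"{BASE_URL}/models", method="GET")
    print(req.full_url)
    # With the server running, fetch it:
    #   print(urllib.request.urlopen(req).read().decode())
    ```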

    Tested Tools Compatibility

    Tool           | Status | Notes
    ---------------|--------|------
    Aider          | Full   | Fully compatible
    bolt.diy       | Full   | Fully compatible; use any random API key in UI if models fail to load
    Page Assist    | Full   | Fully compatible
    Kobold AI Lite | Full   | Fully compatible

    Note: In general, any application that uses the standard OpenAI-compatible /chat/completions and /models endpoints should work with this API.
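    To illustrate the note above, the kind of request an OpenAI-compatible client sends can be sketched with the Python standard library. The model id "gpt-4o" is a placeholder assumption (query the /models endpoint for the ids the server actually reports), and the request is only constructed here, not sent:

    ```python
    import json
    import urllib.request

    BASE_URL = "http://localhost:4141"  # default port; adjust to your --port value

    def chat_request(prompt: str) -> urllib.request.Request:
        """Build (but do not send) an OpenAI-style chat completion request."""
        payload = {
            "model": "gpt-4o",  # placeholder id; list real ids via GET /models
            "messages": [{"role": "user", "content": prompt}],
        }
        return urllib.request.Request(
            f"{BASE_URL}/chat/completions",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )

    req = chat_request("Hello!")
    print(req.get_method(), req.full_url)
    # Send with urllib.request.urlopen(req) once the server is running.
    ```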