Developer-first AI CLI for cross-language code intelligence. Trace call graphs across Java, SQL, Python, TypeScript. Impact analysis, agentic execution, code review, git tools, 40+ commands. Works with Ollama, Copilot, OpenAI, Anthropic.

Package Exports

    This package does not declare an exports field, so its exports have been automatically detected and optimized by JSPM. If any package subpath is missing, consider filing an issue with the original package (@sunilp-org/jam-cli) to add an "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

        ██╗  █████╗  ███╗   ███╗
        ██║ ██╔══██╗ ████╗ ████║
        ██║ ███████║ ██╔████╔██║
    ██  ██║ ██╔══██║ ██║╚██╔╝██║
    ╚████╔╝ ██║  ██║ ██║ ╚═╝ ██║
     ╚═══╝  ╚═╝  ╚═╝ ╚═╝     ╚═╝
    

    jam

    The developer-first AI CLI. Cross-language code intelligence from your terminal.

    Trace call graphs across Java, SQL, Python, and TypeScript. Impact analysis. AI-powered agentic execution. 978 tests. Zero vendor lock-in.

    CI npm License: MIT

    Docs · Install · VSCode Extension

    jam CLI — trace, git wtf, agent

    What Jam Does

    Jam isn't a generic AI assistant. It's the senior dev who's seen everything — direct, opinionated, and warm. Every message, error, and prompt speaks with the same voice: concise, specific, developer-aligned.

    • 🔍 Call graph tracing — trace any symbol's callers, callees, and upstream chain across languages
    • 💥 Impact analysis — "if I change this, what breaks?" with column-level SQL dependency tracking
    • 🤖 Agentic execution — jam go (interactive) and jam run (one-shot) decompose tasks into parallel subtasks
    • 💬 AI chat & ask — streaming responses, multi-turn sessions, stdin/pipe support
    • 🩹 Patch workflow — generate diffs, validate, preview, apply with confirmation
    • 📊 Code intelligence — explain files, search code, review diffs, generate Mermaid diagrams
    • 🔧 Git toolkit — wtf explains state, undo reverses mistakes, standup shows your work
    • Verification — scan for secrets, lint, type-check before you commit
    • 🧰 19 zero-LLM utilities — ports, stats, deps, todo, hash, json, env, and more
    • 🔌 Any provider — Ollama, OpenAI, Anthropic, Groq, GitHub Copilot — or bring your own
    • 🏠 Local-first — your code never leaves your machine unless you choose a remote provider
    • 🔗 MCP + plugins — connect to Model Context Protocol servers, drop in custom commands

    Install

    # npm
    npm install -g @sunilp-org/jam-cli
    
    # Homebrew
    brew tap sunilp/tap && brew install jam-cli
    
    # Try without installing
    npx @sunilp-org/jam-cli doctor

    Jam auto-detects the best available AI provider:

    Priority   Provider           Setup
    1          GitHub Copilot     VSCode extension or Copilot CLI installed
    2          Anthropic          export ANTHROPIC_API_KEY=sk-ant-...
    3          OpenAI             export OPENAI_API_KEY=sk-...
    4          Ollama (default)   ollama serve + ollama pull llama3.2

    jam doctor    # verify everything works

    Cookbook

    Ask & Chat

    jam ask "explain the builder pattern in Go"
    
    # pipe anything
    cat schema.sql | jam ask "what tables have no foreign keys?"
    git log --since="1 week" -p | jam ask "summarize this week's changes"
    
    # interactive chat with history
    jam chat

    Agent Engine

    # interactive agent console — reads, writes, runs commands
    jam go
    jam> add retry logic to the HTTP client with exponential backoff
    
    # one-shot autonomous task
    jam run "add input validation to all API endpoints" --yes
    
    # fully autonomous with parallel workers
    jam run "refactor auth module into separate files" --auto --workers 4

    Code Intelligence

    # trace a function's call graph
    jam trace createProvider
    jam trace updateBalance --impact       # what breaks if this changes?
    jam trace handleRequest --mermaid      # output as Mermaid diagram
    jam trace PROC_PAYMENT --depth 8       # deeper upstream chain
    
    # explain any file
    jam explain src/auth/middleware.ts
    
    # search with AI understanding
    jam search "where is the rate limiter configured?"
    
    # generate architecture diagram from code
    jam diagram

    Git Toolkit

    jam git wtf          # "3 files staged, 2 conflicts, 1 stash. Here's what happened..."
    jam git undo         # undo last commit, last stash, or last merge
    jam git standup      # your commits from the last 3 days
    jam git cleanup      # preview and delete merged branches
    jam git oops         # fix common mistakes (wrong branch, bad commit message)

    Dev Utilities (zero LLM)

    jam stats            # LOC, languages, complexity hotspots
    jam deps             # import dependency graph
    jam todo             # find all TODO/FIXME/HACK comments
    jam verify           # pre-commit checks: secrets, lint, types
    jam ports            # what's listening on which port
    jam env              # environment variable diff between shells
    jam hash <file>      # MD5/SHA1/SHA256 of any file
    jam json <file>      # validate, format, query JSON
    jam recent           # recently modified files
    jam convert 5kg lb   # unit conversions
    jam http GET /users  # quick HTTP requests
    jam pack             # analyze npm/pip/cargo package size
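
    As a sanity check, jam hash should agree with standard tooling for the same bytes; a minimal cross-check with coreutils (the file path and contents below are placeholders):

```shell
# Create a throwaway file and hash it with coreutils; running
# `jam hash /tmp/jam-hash-demo.txt` should report the same SHA-256 digest.
printf 'hello' > /tmp/jam-hash-demo.txt
sha256sum /tmp/jam-hash-demo.txt
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824  /tmp/jam-hash-demo.txt
```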

    Patch & Review

    # AI-powered diff summary
    jam diff
    
    # code review with risk assessment
    jam review
    
    # generate and apply a patch
    jam patch "add error handling to the database module"
    
    # auto-generate commit message matching your project's convention
    jam commit

    VSCode Extension

    Install from Marketplace

    • All commands in the Command Palette
    • @jam chat participant in GitHub Copilot Chat
    • TODO tree in the sidebar with click-to-navigate
    • Copilot auto-detected as AI provider — zero configuration
    • Keeps jam-cli updated automatically

    Configuration

    jam init              # interactive setup wizard
    jam config show       # show resolved config

    // .jamrc (per-project)
    {
      "defaultProfile": "work",
      "profiles": {
        "work": { "provider": "anthropic", "model": "claude-sonnet-4-20250514" },
        "local": { "provider": "ollama", "model": "llama3.2" }
      }
    }

    jam ask "hello" --profile work     # use Anthropic
    jam ask "hello" --profile local    # use Ollama

    Supports HTTP proxy (HTTP_PROXY), custom CA certificates (tlsCaPath), configurable timeouts, MCP servers, and plugin loading. Full configuration docs →
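
    For example, proxy routing uses the standard environment variable (the proxy URL below is a placeholder):

```shell
# Route jam's outbound HTTP through a corporate proxy via the standard
# HTTP_PROXY variable; the URL here is a placeholder.
export HTTP_PROXY=http://proxy.example.com:3128
# Subsequent jam invocations (e.g. `jam doctor`) will pick this up.
```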


    Contributing

    See CONTRIBUTING.md. PRs welcome.

    License

    MIT