Terminal-native local LLM chat client — fast, offline, agentic

Package Exports

    This package does not declare an exports field, so its exports have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue on the original package (hyperlite-ai) asking it to add an "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
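    For illustration only, a minimal exports field in the package's package.json could look like the sketch below. The "./index.js" entry point is an assumed path, not the package's actual file layout:

    {
      "name": "hyperlite-ai",
      "exports": {
        ".": "./index.js"
      }
    }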


    HyperLite

    A terminal-native local LLM chat client. Fast, offline, and agentic — runs entirely on your machine using Ollama.

    Install

    npm install -g hyperlite-ai

    Run

    hyperlite

    Requirements

    • Ollama installed and running
    • A downloaded model (e.g. ollama pull qwen2.5-coder:14b)
    • Node.js 16+
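    A quick way to verify these prerequisites before launching, assuming the ollama CLI is on your PATH (the model tag is only an example; any pulled model works):

    # Node.js 16 or newer
    node --version

    # Ollama CLI installed; list shows models already downloaded
    ollama --version
    ollama list

    # Pull a model if none is listed yet
    ollama pull qwen2.5-coder:14b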

    Features

    • Chat with any local Ollama model
    • Agentic coding tools — read, write, edit, search files directly from chat
    • Multi-session history with persistent storage
    • Tabbed command palette (Ctrl+P)
    • Visual folder browser (Ctrl+O) — open any repo as working directory
    • Download models from inside the app
    • Syntax-highlighted responses with markdown rendering
    • Hardware detection — recommends models for your GPU/RAM

    Supported Platforms

    Platform  Architecture
    --------  ---------------------
    Windows   x64
    Linux     x64
    macOS     Apple Silicon (arm64)
    macOS     Intel (x64)

    Source

    github.com/Sean504/HyperLite