JSPM

agent-context-kit

0.1.4
    • License MIT

    Agent Context Kit (ACK): a lightweight Node.js CLI for capturing project context and collaborating with an LLM.

    Package Exports

    • agent-context-kit

    Agent Context Kit (ACK)

    A lightweight Node.js CLI that captures repository context and collaborates with an LLM to generate briefs, ADR proposals, and reminders.

    • Commands: init, capture, brief, diff, remind, context, features generate, snapshot, close, recall
    • Config via .ackrc.json and/or environment variables
    • Stores artifacts in .context/
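
    Putting together the artifact paths mentioned throughout this README, the .context/ directory might look roughly like this (layout inferred, not authoritative):

```
.context/
  prompts/                      # prompt templates (per "paths.prompts" config)
  features/<name>/spec.md       # generated feature specs
  sessions/YYYY-MM-DD.json      # daily snapshots
  briefs/session-brief.md       # end-of-day brief
  runtime/recall.json           # compact recall pack
```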

    Install (published)

    Published to npm as agent-context-kit:

    npx agent-context-kit init

    Global install

    Install once globally to enable ack init directly:

    npm i -g agent-context-kit
    ack init

    ack is exposed via the package bin:

    ack capture
    ack brief --notes "Focus on auth edge cases"

    Configuration

    Create .ackrc.json at your repo root (defaults shown):

    {
      "llm": {
        "provider": "openai",
        "model": "gpt-5-mini",
        "apiKey": "${OPENAI_API_KEY}"
      },
      "paths": {
        "contextRoot": ".context",
        "prompts": ".context/prompts"
      }
    }

    You can set OPENAI_API_KEY in your environment or a .env file; dotenv is loaded automatically.
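
    For example, a minimal .env at the repo root (the key value shown is a placeholder, not a real credential):

```shell
# Store the key in a .env file at the repo root; dotenv loads it automatically.
# "sk-your-key-here" is a placeholder — substitute your own key.
echo 'OPENAI_API_KEY=sk-your-key-here' > .env
```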

    Project Structure

    /ack/
      package.json
      tsconfig.json
      bin/
        ack.js
      src/
        cli.ts
        commands/
          init.ts
          capture.ts
          brief.ts
          diff.ts
          remind.ts
        utils/
          config.ts
          llmClient.ts
          fileOps.ts

    Notes

    • The bin loader expects a compiled build at dist/cli.js. When installed from npm, running ack <command> invokes this compiled CLI.
    • The LLM client uses OpenAI when configured; otherwise it returns a stubbed response to keep workflows unblocked locally.
    • capture prefers git ls-files for inventory; falls back to a simple recursive scan.
    • init also creates a .ackrc.json in your repo root with placeholder keys you can customize.
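
    The inventory strategy described above can be sketched in shell (an illustrative equivalent, not ACK's actual implementation):

```shell
# Prefer `git ls-files` when inside a git work tree;
# otherwise fall back to a simple recursive scan.
if git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
  git ls-files
else
  find . -type f -not -path './.git/*'
fi > inventory.txt
```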

    Generate Feature Specs (LLM-assisted)

    Ask ACK to propose features and draft specs in .context/features/<name>/spec.md:

    # Dry run first (no files written)
    ack features generate --dry-run --max 6
    
    # Write specs inside .context/features, overwrite existing if needed
    OPENAI_API_KEY=... ack features generate --max 6 --force

    Specs follow this template: Title, Problem, Goals, Non-Goals, User Stories, Acceptance Criteria, Design Notes, Open Questions, Risks, Milestones.
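
    Assuming the template's section names map one-to-one to markdown headings, a generated spec.md skeleton might look like this (illustrative only):

```markdown
# <Feature Title>

## Problem
## Goals
## Non-Goals
## User Stories
## Acceptance Criteria
## Design Notes
## Open Questions
## Risks
## Milestones
```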

    Session Continuity: snapshot → close → recall

    Lightweight journaling to retain context across days:

    # Capture intent/deltas during work
    ack snapshot --summary "Investigated auth token rotation" --note "review edge cases" --note "pair with Sam"
    
    # At end of day, summarize into brief + next steps
    ack close
    
    # Next day, emit compact agent-ready context JSON
    ack recall

    Artifacts:

    • Snapshots JSON: .context/sessions/YYYY-MM-DD.json
    • Brief: .context/briefs/session-brief.md
    • Recall pack: .context/runtime/recall.json
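
    The recall pack's schema is not specified in this README; a purely hypothetical shape, just to illustrate the kind of compact, agent-ready context it could carry (every field name here is an assumption):

```json
{
  "date": "YYYY-MM-DD",
  "summary": "Investigated auth token rotation",
  "nextSteps": ["review edge cases", "pair with Sam"],
  "sources": [".context/briefs/session-brief.md"]
}
```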