Synapse CLI
AI-powered coding assistant. Cloud brain, local hands.
Architecture
Local CLI (thin client)          Cloud Container (thick server)
┌─────────────────────┐          ┌──────────────────────────────┐
│ Terminal UI         │          │ Harness Engine               │
│ Tool Executor       │◄───WS───►│ System Prompt Assembly       │
│ Permission Control  │          │ Guardrails / Skills          │
│ SYNAPSE.md Reader   │          │ Context Compact (LLM)        │
└─────────────────────┘          │ Streaming LLM (Kimi K2.5)    │
                                 │ AI Gateway → Workers AI      │
                                 │ Trace Recording              │
                                 └──────────────────────────────┘

The CLI only does three things: display, execute, confirm. All intelligence lives in the cloud.
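Since the client is intentionally thin, the WebSocket protocol is the whole contract between the two halves. As a minimal sketch, the cloud-to-CLI messages might look like the discriminated union below; the actual message shapes live in src/protocol.ts and the field names here are assumptions for illustration.

```typescript
// Hypothetical CLI <-> Cloud message envelope (field names assumed;
// the real definitions live in src/protocol.ts).
type CloudMessage =
  | { kind: "token"; text: string }                              // streamed LLM output to display
  | { kind: "tool_call"; tool: string; args: Record<string, unknown> } // confirm, then execute locally
  | { kind: "done" };                                            // turn complete

// The thin client only dispatches on message kind -- no other logic.
function describe(msg: CloudMessage): string {
  switch (msg.kind) {
    case "token":
      return `display: ${msg.text}`;
    case "tool_call":
      return `confirm+execute: ${msg.tool}`;
    case "done":
      return "session turn complete";
  }
}
```

A tagged union like this keeps the dispatch exhaustive: the TypeScript compiler flags any message kind the client forgets to handle.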
Install
npm install -g synapse-cli

Usage
synapse login # Login with username/password
synapse # Start coding session in current directory
synapse logout # Remove stored credentials
synapse --help # Show help

Session Commands
| Command | Description |
|---|---|
| /compact | Compress conversation history (LLM summarization) |
| /context | Show context usage (tokens, messages, compactions) |
| /tools | List available tools and permission levels |
| /quit | End session |
Permission Model
| Tool | Permission |
|---|---|
| read_file, grep, glob, list_dir | Auto (in-project only) |
| read outside project dir | Ask each time |
| write_file, edit_file | Ask (trustable via t) |
| bash | Ask every time (never trustable) |
| Destructive commands (rm -rf /, mkfs, etc.) | Blocked entirely |
When prompted:
- y — approve this operation
- n — deny (AI will ask you what to do instead)
- t — trust this tool type for the rest of the session (not available for bash)
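The t/y/n flow above can be sketched as a small decision function. This is an illustrative model, not the real implementation in src/permissions.ts; the function and variable names are assumptions.

```typescript
// Sketch of the session permission flow: "t" trusts a tool for the rest
// of the session, except for bash, which must be confirmed every time.
type Answer = "y" | "n" | "t";

const NEVER_TRUSTABLE = new Set(["bash"]);
const trusted = new Set<string>(); // tools trusted for this session

// Returns true if the operation may run now.
function decide(tool: string, answer: Answer): boolean {
  if (trusted.has(tool)) return true;          // previously trusted: no prompt
  if (answer === "n") return false;            // denied
  if (answer === "t" && !NEVER_TRUSTABLE.has(tool)) {
    trusted.add(tool);                         // skip future prompts for this tool
  }
  return true;                                 // "y", or "t" (which also approves once)
}
```

Note that "t" on bash still approves the single operation; it just never adds bash to the trusted set.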
Streaming
- LLM responses stream token-by-token via WebSocket
- Content appears in real-time as the model generates it
- Reasoning (thinking) is hidden by default
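A toy renderer for this behavior: content tokens are shown as they arrive, while reasoning tokens are dropped unless explicitly requested. The chunk shape here is an assumption; the real stream format is defined by the protocol in src/protocol.ts.

```typescript
// Illustrative streaming renderer: hides "reasoning" chunks by default.
interface Chunk {
  channel: "content" | "reasoning";
  text: string;
}

function render(chunks: Chunk[], showReasoning = false): string {
  return chunks
    .filter(c => showReasoning || c.channel === "content")
    .map(c => c.text)
    .join("");
}
```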
Context Management
Iron rule: NEVER truncate data.
Context compression uses LLM summarization, not mechanical truncation:
- autoCompact — triggers at 80% of 256k context window, summarizes older messages in batches
- reactiveCompact — triggers on API 413 error, compresses and retries (max 3 attempts)
- /compact — manual trigger
Each batch of messages is sent to the LLM in full for summarization. The LLM decides what's important; nothing is cut by character count.
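The autoCompact trigger reduces to a simple threshold check using the numbers stated above (80% of the 256k window). The function and constant names here are assumptions for illustration.

```typescript
// Illustrative auto-compact trigger: fires at 80% of the 256k-token window.
const CONTEXT_WINDOW = 256_000;   // tokens (model context size from the section above)
const AUTO_COMPACT_RATIO = 0.8;   // trigger threshold

function shouldAutoCompact(usedTokens: number): boolean {
  return usedTokens >= CONTEXT_WINDOW * AUTO_COMPACT_RATIO;
}
```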
Project Configuration
Place SYNAPSE.md in your project root. It's automatically read and sent to the cloud as project-specific instructions (similar to Claude Code's CLAUDE.md).
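A minimal sketch of how such a file could be picked up from the project root, assuming the reader simply sends the raw file contents to the cloud; the function name is hypothetical.

```typescript
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Sketch: read SYNAPSE.md from the project root if present,
// returning null when the project has no instructions file.
function readProjectInstructions(projectRoot: string): string | null {
  const path = join(projectRoot, "SYNAPSE.md");
  return existsSync(path) ? readFileSync(path, "utf8") : null;
}
```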
Model
- Kimi K2.5 via Cloudflare AI Gateway
- 256k context window
- temperature 1.0, top_p 0.95
- Streaming with reasoning support
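Collected as request parameters, the settings above might look like the following OpenAI-compatible fragment; the model identifier string and field names are assumptions.

```typescript
// Illustrative request parameters matching the model settings above
// (model name and exact fields are assumptions, OpenAI-compatible style).
const requestParams = {
  model: "kimi-k2.5",  // served via Cloudflare AI Gateway
  temperature: 1.0,
  top_p: 0.95,
  stream: true,        // token-by-token streaming with reasoning support
};
```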
Tools
| Tool | Description |
|---|---|
| read_file | Read a file from the project |
| write_file | Create or overwrite a file (shows content first) |
| edit_file | Precise string replacement in existing files |
| grep | Regex search across files |
| glob | Find files by name pattern |
| list_dir | List directory contents |
| bash | Execute shell commands |
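The "precise string replacement" semantics of edit_file suggest that the old string must match exactly once; the sketch below models that behavior, which is an assumption based on the description rather than the actual implementation in src/tools/executor.ts.

```typescript
// Sketch of edit_file semantics: replace oldStr with newStr, requiring
// that oldStr occurs exactly once so the edit is unambiguous.
function editFile(content: string, oldStr: string, newStr: string): string {
  const first = content.indexOf(oldStr);
  if (first === -1) {
    throw new Error("old string not found");
  }
  if (content.indexOf(oldStr, first + oldStr.length) !== -1) {
    throw new Error("old string is not unique");
  }
  return content.replace(oldStr, newStr);
}
```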
File Structure
synapse-cli/ # Local client (npm package)
├── src/
│ ├── index.ts # Entry: subcommands, WS connection, REPL
│ ├── auth.ts # Login/logout → JWT stored in ~/.synapse/token
│ ├── protocol.ts # CLI ↔ Cloud message protocol
│ ├── permissions.ts # t/y/n permission control
│ ├── tools/executor.ts # Local tool execution
│ └── ui/terminal.ts # Terminal output formatting
├── package.json
└── tsconfig.json
harness-agent/ # Cloud server (Cloudflare Worker + Container)
├── src/worker/index.ts # /cli/ws route → JWT auth → Container proxy
└── src/services/cli_session.py # CLI session: LLM loop, streaming, compact

Design References
- Claude Code (March 2026 leak) — tool permission model, temperature 1.0, four-tier context compression
- Kiro CLI — /compact command, streaming UX, session management
- Synapse Harness — guardrails + skills injection, Dreaming pipeline
Powered by Cloudflare · Built by Bowen Liu