Package Exports
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (bse-code) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
bse-code
The AI coding assistant that lives in your terminal: works with any LLM, zero lock-in, zero compromise.
Chat with your codebase, read and write files, run shell commands, connect MCP servers, build reusable skills, persist project memory, and pick up right where you left off, all from a gorgeous interactive REPL.
Why bse-code?
| Feature | Details |
|---|---|
| Any LLM, anywhere | OpenRouter, OpenAI, Anthropic, Google AI, Ollama, LM Studio, Local AI Foundry, or any OpenAI-compatible endpoint |
| Start for free | OpenRouter's free tier gives you Gemini 2.5 Pro, Llama 4, and DeepSeek R1 with no credit card |
| Fully local | Ollama or LM Studio: no API key, no data leaving your machine |
| Context-aware | Project memory, skills, and MCP tools injected automatically into every session |
| Session persistence | Save and resume conversations per project, so you never lose context |
| Beautiful terminal UI | 6 built-in themes, interactive slash picker, history navigation, full cursor editing |
| Instant shell access | `!git status`, `!npm run build`: run any command without leaving the chat |
| File injection | `@src/auth.ts` drops any file or directory straight into your prompt |
Install
Requires Node.js 18+. No .NET SDK needed; the binary is bundled.
npm install -g bse-code
Also available as a .NET global tool:
dotnet tool install --global bse-code
Supported Providers
| # | Provider | Models | API Key |
|---|---|---|---|
| 1 | OpenRouter | 100+ models, free tier available | Yes (free at openrouter.ai) |
| 2 | OpenAI | GPT-4o, o3, o1, GPT-3.5 | Yes |
| 3 | Anthropic | Claude 3.7/3.5 Sonnet, Haiku, Opus | Yes |
| 4 | Google AI | Gemini 2.5 Pro/Flash, 2.0, 1.5 | Yes (free tier) |
| 5 | Ollama | llama3, mistral, qwen, deepseek… | No (local) |
| 6 | LM Studio | Any model loaded in LM Studio | No (local) |
| 7 | Local AI Foundry | Phi-4, Phi-3.5 Mini, and more | No (local) |
| 8 | Custom | Any OpenAI-compatible endpoint | Optional |
First-run Setup
On first run, an interactive wizard walks you through everything:
- Pick a provider
- Set the base URL (pre-filled for known providers)
- Enter your API key (skipped for local providers)
- Browse available models and pick one
- Everything saved to ~/.bse-code/config.json
Re-run the wizard any time:
bse-code --config
Quick-start by provider
OpenRouter: free models, no credit card
bse-code --config
# Select [1] OpenRouter; get a free key at https://openrouter.ai/keys
# Pick Gemini 2.5 Pro, Llama 4, or DeepSeek R1 (all free)
Ollama: fully local, zero cost
ollama pull llama3.2
bse-code --config
# Select [5] Ollama, accept the default URL, then pick your model
Usage
Interactive REPL (recommended)
bse-code

(startup banner: ASCII-art "bse code" logo)
provider: OpenRouter
model   : google/gemini-2.5-pro-exp-03-25:free
theme   : default
cwd     : my-project
skills  : 2 loaded
mcp     : 5 tools from 1 server(s)
memory  : 1 BSE.md file(s) loaded
type /help for commands · /exit to quit
my-project (main) ❯
One-shot mode
bse-code -p "explain the auth flow in src/auth/"
bse-code -p "list all TODO comments" --output-format json
All CLI flags
bse-code                             # Interactive REPL
bse-code -p "<prompt>"               # One-shot prompt
bse-code --model <model-id>          # Override model for this session
bse-code --theme <name>              # Set color theme for this session
bse-code --output-format json|text   # Output format (one-shot only)
bse-code --config                    # Re-run the setup wizard
bse-code --version, -v               # Show version
bse-code --help, -h                  # Show help
Special Input Prefixes
@ prefix: file & directory injection
Drop any file or folder straight into your prompt. Paths tab-complete as you type.
@src/auth.ts explain this file
@src/auth/ summarize all files in this folder
@package.json what dependencies are outdated?
Directories inject up to 20 files automatically, perfect for asking about a whole module at once.
! prefix: shell passthrough
Run any shell command instantly, no AI involved; output appears right in your terminal.
!git status
!npm run build
!ls -la src/
REPL Slash Commands
Core
| Command | Description |
|---|---|
| /clear | Wipe conversation history for a fresh start |
| /model [id] | Show current model or switch mid-session |
| /compact [hint] | Summarize history and trim tokens |
| /stats | Session stats: duration, turns, tool calls, messages, model, provider, theme, skills, MCP tools |
| /tools | List all available built-in and MCP tools |
| /help | Show all commands |
| /exit or /quit | Quit |
Appearance
| Command | Description |
|---|---|
| /theme | List all themes with the active one marked |
| /theme <name> | Switch theme (persisted to config) |
Skills
| Command | Description |
|---|---|
| /skills | List all loaded skills (user + project level) |
| /<skill-name> | Invoke a skill |
| /<skill-name> @file.ts | Invoke a skill with a file argument |
MCP
| Command | Description |
|---|---|
| /mcp | List all connected MCP servers and their tools |
| /mcp reload | Hot-reload MCP servers without restarting |
Memory
| Command | Description |
|---|---|
| /memory | Show all loaded BSE.md files |
| /memory add <text> | Append a note to ./BSE.md instantly |
| /memory refresh | Reload BSE.md files and refresh the system prompt |
| /init | Scaffold a BSE.md in the current directory |
Sessions
| Command | Description |
|---|---|
| /save <tag> | Save the current conversation with a tag |
| /resume | List all saved sessions for this project |
| /resume <tag> | Restore a saved session and pick up where you left off |
Interactive Input: Feels Like a Real Shell
/ : slash command picker
Type / and an inline menu pops up instantly:

/  ↑/↓ navigate · Enter select · Esc cancel
❯ /clear    clear conversation history
  /model    show or switch model
  /compact  summarize history to save tokens
  /theme    list or set color theme
  /skills   list loaded skills
  /mcp      list MCP servers and tools
  /memory   show loaded BSE.md files
  /save     save conversation
  /resume   list or resume a saved session
  …

- Arrow keys navigate the list
- Type more characters to filter live: /th narrows to /theme
- Enter selects, Esc cancels
- Tab completes the top match
- Your skills appear right alongside built-in commands
History & cursor editing
- Up/Down arrows cycle through previous inputs, just like your shell
- Left/Right arrows move the cursor anywhere in the line
- Home / End jump to start or end
- Backspace / Delete work at any cursor position
- Tab on @<path> completes file and directory paths
Skills: Reusable AI Workflows
Skills are markdown files that give the AI reusable instructions or workflows. Write once, invoke from any project.
Locations (both loaded and merged):
- ~/.bse-code/skills/ : user-level, available in every project
- .bse-code/skills/ : project-level, scoped to this repo
Example skill (.bse-code/skills/review.md):
# Code Review
Review the provided code for:
- Correctness and logic errors
- Performance issues
- Security vulnerabilities
- Code style and readability
Invoke it:
/review
/review @src/PaymentService.ts
Skills are also injected into the system prompt automatically, so the AI always knows what's available.
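The example skill above can be created straight from the shell; the heredoc below just writes the skill file shown in this section (any editor works equally well):

```shell
# Create the project-level skills directory and the example review skill
mkdir -p .bse-code/skills

cat > .bse-code/skills/review.md <<'EOF'
# Code Review
Review the provided code for:
- Correctness and logic errors
- Performance issues
- Security vulnerabilities
- Code style and readability
EOF
```

On the next session, /review should then show up in the slash picker alongside the built-in commands.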
Project Memory (BSE.md)
BSE.md files are loaded at startup and injected into every session's system prompt. Teach the AI about your project once; it remembers forever.
Hierarchy (all three are merged):
| File | Scope |
|---|---|
| ~/.bse-code/BSE.md | Global: your personal preferences across all projects |
| ./BSE.md | Project: tech stack, commands, coding standards |
| ./BSE.local.md | Local overrides: add to .gitignore |
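For illustration, here is what a minimal project-level ./BSE.md might contain, written via a shell heredoc. Every entry below is a made-up example, not a required format; skip this if you already have a BSE.md:

```shell
# Write an illustrative project memory file (contents are examples only)
cat > BSE.md <<'EOF'
# Project Memory

## Tech stack
- TypeScript, Node.js 18, Express

## Commands
- Build: npm run build
- Test:  npm test

## Conventions
- Always use async/await, never .then() chains
- Run `npm test` before committing
EOF
```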
Scaffold one instantly:
bse-code
/init
Add notes on the fly:
/memory add always use async/await, never .then() chains
/memory add run `npm test` before committing
MCP (Model Context Protocol)
Connect any external tool or service via MCP servers. GitHub, databases, Slack, custom APIs: if it speaks MCP, it works here.
Config file: ~/.bse-code/mcp.json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "your-token" }
    }
  }
}
- Tools are exposed to the AI as mcp__serverName__toolName
- Hot-reload without restarting: /mcp reload
- Inspect what's connected: /mcp
- Disable a server without removing it: "disabled": true
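Since the servers above are launched through npx, Node is already installed, and it doubles as a quick JSON linter for catching typos in mcp.json before a /mcp reload. The snippet below writes the filesystem example to a temp path and checks that it parses; point it at ~/.bse-code/mcp.json to check your real config:

```shell
# Write a minimal example config to a temp path (use ~/.bse-code/mcp.json for the real one)
mkdir -p /tmp/bse-demo
cat > /tmp/bse-demo/mcp.json <<'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
EOF

# Fails loudly on a JSON syntax error; otherwise lists the configured servers
node -e 'const c = JSON.parse(require("fs").readFileSync(process.argv[1], "utf8"));
         console.log("servers:", Object.keys(c.mcpServers).join(", "));' /tmp/bse-demo/mcp.json
# prints "servers: filesystem"
```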
Built-in AI Tools
| Tool | What it does |
|---|---|
| read_file | Read any file's contents |
| Write | Write or create a file (auto-creates parent directories) |
| Bash | Execute shell commands, cross-platform |
| list_dir | List files and subdirectories at a path |
| glob | Find files matching a glob pattern (e.g. src/**/*.ts) |
| grep | Search files with a regex pattern (up to 200 matches, recursive) |
| mcp__*__* | Any tool from your connected MCP servers |
Tool calls are shown inline as the AI works, with a success or failure marker per call, so you see exactly what it's doing in real time.
Session Management
Never lose a good conversation. Save any session with a tag and resume it later.
/save auth-refactor
/resume
# auth-refactor   2025-04-24 14:32   18 messages   [gpt-4o]
# bug-hunt        2025-04-23 09:15   31 messages   [claude-3-5-sonnet]
/resume auth-refactor
# Resumed session 'auth-refactor' (18 messages), welcome back!
Sessions are stored per-project in ~/.bse-code/sessions/, so they stay isolated with no collisions.
Session Statistics
/stats
Session stats
  duration   : 00:23:41
  turns      : 12
  tool calls : 34
  messages   : 47
  model      : google/gemini-2.5-pro-exp-03-25:free
  provider   : OpenRouter
  theme      : dracula
  skills     : 3
  mcp tools  : 8
Conversation Compaction
Running low on context? Compact the conversation into a tight summary without losing the important bits.
/compact
/compact focus on the auth changes we made
The AI summarizes, history is trimmed, and you keep going: same context, far fewer tokens.
Themes
| Theme | Accent | Vibe |
|---|---|---|
| default | Cyan | Classic terminal |
| dracula | Magenta/Purple | Dark and moody |
| monokai | Yellow | Warm and punchy |
| ocean | Blue | Cool and calm |
| forest | Green | Fresh and focused |
| light | Dark on light | For light terminals |
bse-code --theme dracula   # one session
# or inside the REPL:
/theme monokai             # persisted
Configuration
Config file: ~/.bse-code/config.json
{
  "provider": "OpenRouter",
  "api_key": "sk-or-...",
  "model": "google/gemini-2.5-pro-exp-03-25:free",
  "base_url": "https://openrouter.ai/api/v1",
  "theme": "default"
}
Environment variables (always override config)
| Variable | Description |
|---|---|
| BSE_PROVIDER | Provider name (OpenRouter, OpenAI, Anthropic, Google, Ollama, LmStudio, LocalAiFoundry, Custom) |
| BSE_API_KEY | API key for the selected provider |
| BSE_MODEL | Model ID to use |
| BSE_BASE_URL | Override the API base URL |
Legacy: OPENROUTER_API_KEY, OPENROUTER_MODEL, and OPENROUTER_BASE_URL still work as fallbacks.
File Structure
~/.bse-code/
├── config.json          # Provider, API key, model, base URL, theme
├── mcp.json             # MCP server definitions
├── BSE.md               # Global memory
├── skills/
│   └── *.md             # User-level skills
└── sessions/
    └── <project-hash>/  # Saved conversations per project

.bse-code/               # Project-level (commit to your repo)
├── BSE.md               # Project memory
└── skills/
    └── *.md             # Project-level skills

./BSE.md                 # Project memory (root level)
./BSE.local.md           # Local overrides, gitignore this
License
MIT