Package Exports
- @stumhuang/opencode-mem
- @stumhuang/opencode-mem/server
OpenCode Memory

A persistent memory system for AI coding agents that enables long-term context retention across sessions using local vector database technology.
Visual Overview
- Project Memory Timeline (screenshot)
- User Profile Viewer (screenshot)
Core Features
- Local vector database: SQLite storage with USearch-first vector indexing and ExactScan fallback
- Persistent project memories
- Automatic user profile learning
- Unified memory-prompt timeline
- Full-featured web UI
- Intelligent prompt-based memory extraction
- Multi-provider AI support (OpenAI, Anthropic)
- 12+ local embedding models
- Smart deduplication
- Built-in privacy protection
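The smart deduplication above can be pictured as a similarity check against existing memory embeddings. This is an illustrative sketch, not the plugin's actual implementation; the `isDuplicate` helper and the 0.95 threshold are assumptions:

```typescript
// Hypothetical sketch: reject a new memory whose embedding is nearly
// identical to one already stored. Threshold is illustrative.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

function isDuplicate(candidate: number[], existing: number[][], threshold = 0.95): boolean {
  return existing.some((e) => cosine(candidate, e) >= threshold);
}
```

A real implementation would likely compare against the top-k nearest neighbors from the vector index rather than scanning every stored embedding.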
Prerequisites
This plugin uses USearch for preferred in-memory vector indexing with automatic ExactScan fallback. No custom SQLite build or browser runtime shim is required.
Recommended runtime:
- Bun
- Standard OpenCode plugin environment
Notes:
- If USearch is unavailable or fails at runtime, the plugin automatically falls back to exact vector scanning.
- SQLite remains the source of truth; search indexes are rebuilt from SQLite data when needed.
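The index-first, exact-scan-second behavior can be sketched as follows. This is a simplified illustration under assumed types (`Doc`, a pluggable `ann` function standing in for a USearch-style index), not the plugin's real code:

```typescript
// Hedged sketch: prefer an approximate index, fall back to exact scan.
type Doc = { id: string; vector: number[] };

// Exact cosine-similarity scan over all documents: slow but always correct.
function exactScan(query: number[], docs: Doc[], k: number): string[] {
  const dot = (a: number[], b: number[]) => a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (a: number[]) => Math.sqrt(dot(a, a));
  return docs
    .map((d) => ({ id: d.id, score: dot(query, d.vector) / (norm(query) * norm(d.vector) || 1) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((d) => d.id);
}

// If the index is missing or throws, silently fall through to exactScan.
function search(
  query: number[],
  docs: Doc[],
  k: number,
  ann?: (q: number[], k: number) => string[],
): string[] {
  try {
    if (ann) return ann(query, k);
  } catch {
    // index failed at runtime; fall through
  }
  return exactScan(query, docs, k);
}
```

Because SQLite holds the canonical vectors, the exact scan always produces correct results even when the index is cold or broken.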
Getting Started
Add to your OpenCode configuration at ~/.config/opencode/opencode.json:
```json
{
  "plugin": ["opencode-mem"]
}
```

The plugin downloads automatically on next startup.
Usage Examples
```javascript
// Add a memory
memory({ mode: "add", content: "Project uses microservices architecture" });

// Search within the current project
memory({ mode: "search", query: "architecture decisions" });

// Search across all projects
memory({ mode: "search", query: "architecture decisions", scope: "all-projects" });

// View the learned user profile
memory({ mode: "profile" });

// List recent memories
memory({ mode: "list", limit: 10 });
```

Access the web interface at http://127.0.0.1:4747 for visual memory browsing and management.
Configuration Essentials
Configure at ~/.config/opencode/opencode-mem.jsonc:
```jsonc
{
  "storagePath": "~/.opencode-mem/data",
  "userEmailOverride": "user@example.com",
  "userNameOverride": "John Doe",
  "embeddingModel": "Xenova/nomic-embed-text-v1",
  "memory": {
    "defaultScope": "project",
  },
  "webServerEnabled": true,
  "webServerPort": 4747,
  "autoCaptureEnabled": true,
  "autoCaptureLanguage": "auto",
  "opencodeProvider": "anthropic",
  "opencodeModel": "claude-haiku-4-5-20251001",
  "showAutoCaptureToasts": true,
  "showUserProfileToasts": true,
  "showErrorToasts": true,
  "userProfileAnalysisInterval": 10,
  "maxMemories": 10,
  "compaction": {
    "enabled": true,
    "memoryLimit": 10,
  },
  "chatMessage": {
    "enabled": true,
    "maxMemories": 3,
    "excludeCurrentSession": true,
    "maxAgeDays": null, // no age limit by default
    "injectOn": "first",
  },
}
```

Memory Scope
- scope: "project" queries only the current project. This is the default.
- scope: "all-projects" queries search/list across all project shards.
- memory.defaultScope sets the default query scope when no explicit scope is provided.
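The scope rules can be summarized in a few lines. This is an illustrative sketch; the `filterByScope` name and `Memory` shape are assumptions, not the plugin's API:

```typescript
// Hypothetical sketch of scope resolution for search/list queries.
type Scope = "project" | "all-projects";
interface Memory {
  project: string;
  content: string;
}

function filterByScope(
  memories: Memory[],
  currentProject: string,
  scope: Scope | undefined,
  defaultScope: Scope = "project", // mirrors memory.defaultScope
): Memory[] {
  const effective = scope ?? defaultScope; // explicit scope wins
  return effective === "project"
    ? memories.filter((m) => m.project === currentProject)
    : memories; // "all-projects": no project filter
}
```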
Auto-Capture AI Provider
opencode-mem supports three modes for AI model access:
1. OpenCode Native Models (Recommended)
Use OpenCode's native models directly through your OpenCode installation. No external API keys needed.
```json
"opencodeProvider": "opencode",
"opencodeModel": "anthropic/claude-haiku-4-5-20251001",
```

Model format: provider/model (same as OpenCode's /models command)
Available models examples:
- anthropic/claude-haiku-4-5-20251001
- anthropic/claude-sonnet-4-5-20250929
- openai/gpt-4o-mini
- openai/gpt-5.3-codex-spark
- google/gemini-2.5-flash
Use any model that OpenCode supports - check with /models command in OpenCode.
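Splitting such an id is a one-liner; this sketch is illustrative (the `parseModelId` helper is not part of the plugin):

```typescript
// Hypothetical helper: split a "provider/model" id at the first slash,
// so model names containing further slashes are preserved intact.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash < 0) throw new Error(`expected provider/model, got "${id}"`);
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}
```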
2. External Anthropic/OpenAI APIs
Use Anthropic or OpenAI APIs directly through OpenCode's authentication system.
```json
"opencodeProvider": "anthropic",
"opencodeModel": "claude-haiku-4-5-20251001",
```

Supported external providers: anthropic, openai
Setup:
- Run opencode auth login to connect your provider
- Check connected providers: opencode providers list
- Configure the provider name and model in the opencode-mem config
No separate API-key configuration needed in opencode-mem; it reads OAuth tokens or API keys from opencode's auth.json.
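Credential lookup along these lines could look as follows. This is a hedged sketch under an assumed auth.json shape; the actual file layout used by OpenCode may differ:

```typescript
// Hypothetical sketch: pick the usable credential for a provider from an
// auth.json-shaped object (assumed structure, for illustration only).
interface AuthEntry {
  type: "oauth" | "api";
  access?: string; // OAuth access token
  key?: string;    // plain API key
}

function credentialFor(
  auth: Record<string, AuthEntry>,
  provider: string,
): string | undefined {
  const entry = auth[provider];
  if (!entry) return undefined; // provider not connected
  return entry.type === "oauth" ? entry.access : entry.key;
}
```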
Note: Legacy memoryProvider/memoryModel/memoryApiUrl/memoryApiKey/memoryTemperature/memoryExtraParams keys are deprecated and ignored (a one-time warning is logged).
Full documentation available in this README.
Development & Contribution
Build and test locally:
```shell
bun install
bun run build
bun run typecheck
bun run format
```

This project is actively seeking contributions to become the definitive memory plugin for AI coding agents. Whether you are fixing bugs, adding features, improving documentation, or expanding embedding model support, your contributions are critical. The codebase is well-structured and ready for enhancement. If you hit a blocker or have improvement ideas, submit a pull request - we review and merge contributions quickly.
License & Links
MIT License - see LICENSE file
- Repository: https://github.com/tickernelz/opencode-mem
- Issues: https://github.com/tickernelz/opencode-mem/issues
- OpenCode Platform: https://opencode.ai
Inspired by opencode-supermemory