cursor-memory
Persistent, searchable memory for Cursor AI — You control what your AI remembers.
The Problem • Quick Demo • Installation • Commands • How It Works • Troubleshooting
😤 The Problem
You just spent an hour with Cursor AI figuring out the right architecture. You made decisions, weighed trade-offs, landed on a solution.
Then you open a new chat.
❌ "What database did we choose last week?"
→ "I don't have access to previous conversations."
❌ "Continue the migration plan from yesterday."
→ "Could you provide context about the migration?"
❌ "Why did we pick EFS over EBS again?"
→ "I don't have information about previous decisions."

Every new chat, your AI has amnesia. Every decision you made, every context you built — gone.
The workarounds make it worse:
| 📝 Save to a .md file | Now you have 1,000 files. Which one was it again? |
| 📎 Attach files to every chat | Token costs pile up. Most of it isn't even relevant. |
| 🔁 Retype context manually | You become the memory for a tool that's supposed to help you think. |
✨ See It in Action
▶ Can't see the video? Watch on GitHub
🎯 Why cursor-memory
You decide what gets saved. Type /memo when something matters — AI creates a structured memo, tags it, and stores it locally. Next time you need it, it's there.
| 🎛️ You control what's saved | Nothing gets saved without you triggering /memo. No background processes, no noise. |
| ⚡ Context-aware auto-search | AI detects when your question refers to past context and automatically searches your memories — no command needed. /recall is available as a manual fallback. |
| 🔍 Finds what you mean, not what you type | Hybrid FTS5 keyword + vector semantic search. Searches by meaning, not just exact words. |
| 🌍 Cross-language | Save in any language, search in any language. Multilingual E5 — 100+ languages, fully cross-lingual. |
| 📝 Structured summaries | AI generates organized memos with Decisions → Key Details → Context → Next Steps — not raw text dumps. |
| 📄 Handles long content | Long discussions are automatically split into overlapping chunks — every section is searchable, nothing gets lost. |
| 📁 Global + per-repo scope | Global memories visible everywhere. Repo memories isolated per project — one repo never sees another's context. |
| 🧠 Choose your model | Small (~50MB), Medium (~115MB), or Large (~270MB) — pick the size that fits your machine. |
| 🔒 Fully private, runs offline | No cloud. No API keys. No telemetry. Everything stays on your machine. |
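One common way to merge keyword (FTS5/BM25) and vector rankings like the hybrid search described above is reciprocal rank fusion. This is an illustrative sketch only — not cursor-memory's actual merging code; the function and interface names are assumptions:

```typescript
// Merge two ranked result lists (keyword + semantic) with reciprocal rank fusion.
// An id appearing high in either list — or in both — rises to the top.
interface Ranked {
  id: string;
  rank: number; // 1-based position in its result list
}

function fuseResults(keyword: Ranked[], semantic: Ranked[], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const list of [keyword, semantic]) {
    for (const { id, rank } of list) {
      // RRF score: 1 / (k + rank), summed across lists the id appears in
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank));
    }
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

A memo matched by both the keyword and the semantic search accumulates score from both lists, so it outranks memos found by only one.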
📦 Installation
Prerequisites
- Node.js — Node 20, 22, or 24 LTS recommended (download)
- Native modules ship prebuilt binaries for these versions, no build tools required.
- C++ compiler (optional) — only needed as a fallback for non-LTS Node versions (Xcode CLI on Mac, build-essential on Ubuntu, VS Build Tools on Windows).
⚡ 2-minute setup
# 1. Install globally
npm install -g cursor-memory
# 2. Setup — downloads model, configures Cursor automatically
cursor-memory setup
# 3. Restart Cursor — done 🎉

The CLI handles everything:
- 📥 Downloads the embedding model
- ⚙️ Configures MCP server for Cursor
- 📋 Sets up AI behavior rules
🤖 Choose your model
| Model | Size | RAM | Best for |
|---|---|---|---|
| Small | ~50MB | ~200MB | Lightweight, fast |
| Medium | ~115MB | ~500MB | Good balance |
| Large ⭐ | ~270MB | ~1GB | Best accuracy (recommended) |
All models support 100+ languages and run fully offline after download.
💬 Commands
In Cursor Chat
Three commands. That's it.
| Command | What it does |
|---|---|
| /memo or /memo [text] | 💾 With text → saves directly. Without → AI summarizes the conversation into a structured memo |
| /recall [query] | 🔍 Searches your memories by keyword + semantic meaning |
| /forget [query] | 🗑️ Searches → previews matches → confirms before deleting |
AI detects when your question refers to past context and automatically searches your memories — no command needed. /recall is available as a manual fallback.
🛠️ CLI Commands
cursor-memory setup # First-time setup or switch model
cursor-memory status # Check MCP, rules, model, database health
cursor-memory reset # Clear all data and start fresh
cursor-memory -v # Show version
cursor-memory --help # Show all commands

🔬 How It Works
💾 Architecture
💾 Save & Search Flow
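Before embedding, long memos are split into overlapping chunks so that no passage falls through the cracks between chunk boundaries. A minimal illustrative chunker — the chunk size and overlap here are assumptions, not cursor-memory's actual parameters:

```typescript
// Split text into fixed-size chunks where each chunk overlaps the previous one,
// so a sentence straddling a boundary is fully contained in at least one chunk.
function chunkText(text: string, chunkSize = 400, overlap = 80): string[] {
  if (chunkSize <= overlap) throw new Error("chunkSize must exceed overlap");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```

Each chunk is then embedded and indexed separately, so a search can hit any section of a long discussion.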
📁 Scope Isolation
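Conceptually, scope isolation means each memo carries a scope tag — "global", or a per-repo key — and a search only sees global memos plus the current repo's memos. This is a hypothetical illustration of that filtering rule, not cursor-memory's actual storage schema:

```typescript
// Each memo is tagged with a scope: "global", or a per-repo key
// (e.g. derived from the repo root path — an assumption for this sketch).
interface Memo {
  id: string;
  scope: string;
  text: string;
}

// A query from inside a repo sees global memos plus that repo's own memos,
// and never another repo's.
function visibleMemos(all: Memo[], currentRepo: string): Memo[] {
  return all.filter((m) => m.scope === "global" || m.scope === currentRepo);
}
```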
🧪 Tech Stack
| Component | Technology | Why |
|---|---|---|
| 🔌 MCP Server | @modelcontextprotocol/sdk | Standard protocol for AI tool integration |
| 🗄️ Database | better-sqlite3 | Zero-config, fast, embedded, WAL mode |
| 🔎 Vector search | sqlite-vec | Native C extension, cosine KNN, no external DB |
| 📝 Full-text search | SQLite FTS5 | BM25 ranking, auto-sync via triggers |
| 🧠 Embeddings | @huggingface/transformers | Local ONNX inference, no API keys |
| 🌍 Model | Multilingual E5 (Q8) | 100+ languages, asymmetric search, quantized |
| ⌨️ CLI | commander | Interactive setup, model management |
| 💻 Language | TypeScript (ESM) | Type safety, modern module system |
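"Asymmetric search" with E5 means queries and stored passages are embedded differently: E5 models are trained to expect a "query: " prefix on search text and a "passage: " prefix on indexed text. A sketch of that convention (the actual embedding call via @huggingface/transformers is not shown here):

```typescript
// E5 prefix convention: the same raw string embeds differently depending on
// whether it is a search query or a stored passage.
const forQuery = (q: string): string => `query: ${q}`;
const forPassage = (p: string): string => `passage: ${p}`;

// At save time, each memo chunk would be embedded as forPassage(chunk);
// at search time, the user's question is embedded as forQuery(question).
```

Skipping the prefixes noticeably degrades E5 retrieval quality, which is why the distinction matters even though it looks trivial.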
🐛 Troubleshooting
🔌 MCP not connecting after setup
Restart Cursor completely (quit and reopen — not just reload window).
cursor-memory status # check system health

If auto-config failed, manually add MCP server in Cursor:
Cursor → Settings → MCP → Add server, or edit your MCP config file:
{
"mcpServers": {
"cursor-memory": {
"command": "npx",
"args": ["-y", "cursor-memory"]
}
}
}

⚠️ Node.js version mismatch / "unsupported Node.js version"
cursor-memory requires Node 20, 22, or 24 LTS — these are the versions that ship with prebuilt native binaries (better-sqlite3). Other versions may fail to install.
Check your version:
node -v

If you're on a non-LTS version (18, 19, 21, etc.), install an LTS via nvm:
nvm install 22
nvm use 22
npm install -g cursor-memory
cursor-memory setup

If you switched Node versions after installing, just reinstall:
npm install -g cursor-memory
cursor-memory setup

🔍 AI doesn't auto-search memories
Run cursor-memory setup again to reinstall rules.
If that doesn't work, manually add the rule via Cursor → Settings → Rules → create a new User rule and paste the following:
## cursor-memory MCP
### Auto-recall
BEFORE answering, ask yourself: "Does the user expect me to know something from a previous chat?"
If YES → call search_memory from cursor-memory MCP immediately. Do NOT answer first.
If UNSURE → answer normally, do NOT search.
Signs of past context (any language):
- References to previous decisions ("what did we choose", "as we discussed")
- Continuation requests ("continue the plan", "pick up where we left off")
- "We/our" referring to past work, not general questions
- Temporal cues: "last time", "before", "already", "remember", "yesterday"
### Auto-save awareness
After a substantive conversation, assess whether it produced knowledge worth preserving:
SUGGEST SAVING when:
- A decision was reached (chose X over Y, with reasoning)
- A plan, strategy, or approach was agreed upon
- A problem was analyzed and a solution was identified
- A comparison or evaluation was completed with a conclusion
- Important context, constraints, or requirements were established
- Knowledge was shared that would be useful to recall in future sessions
Do NOT suggest when:
- Quick Q&A with a generic/textbook answer
- Still exploring — no conclusion or decision yet
- User already said /memo in this conversation
How to suggest: at the END of your response, briefly ask:
"This seems worth remembering. Want me to /memo this?"
Do NOT auto-save without user confirmation.
### Commands
/memo → save to memory. With content: save directly. Without content: summarize conversation then save.
/recall → search via search_memory
/forget → delete via delete_memory

❌ Search returns no results
- Try more specific terms
- Similarity threshold is 0.2 — very broad queries may not match
- Consider upgrading to a larger model for better recall accuracy
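The 0.2 threshold is a cosine-similarity cutoff: results whose embedding is less similar to the query than that are dropped. An illustrative sketch of the cutoff (in the real package this comparison runs inside sqlite-vec, not in JavaScript; the function names are assumptions):

```typescript
// Cosine similarity between two embedding vectors: 1 = identical direction,
// 0 = orthogonal (unrelated), -1 = opposite.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// A candidate memo only surfaces if its similarity clears the threshold.
function passesThreshold(queryVec: number[], docVec: number[], threshold = 0.2): boolean {
  return cosine(queryVec, docVec) >= threshold;
}
```

A very broad query embeds to a vector that is weakly similar to everything, so nothing clears the cutoff — which is why more specific terms help.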
🛠️ Development
git clone https://github.com/tranhuucanh/cursor-memory.git
cd cursor-memory
npm install
npm run build # build once
npm run dev # watch mode
node dist/cli.js setup
node dist/index.js

🤝 Contributing
- 🍴 Fork the repository
- 🌿 Create feature branch: git checkout -b feature/your-feature
- 💾 Commit: git commit -m 'feat: your feature'
- 🚀 Push: git push origin feature/your-feature
- 🔁 Open a Pull Request
📄 License
MIT — see LICENSE.
🙏 Acknowledgments
- Model Context Protocol SDK
- HuggingFace Transformers.js
- sqlite-vec by Alex Garcia
- Multilingual E5 by Microsoft