Package Exports
- @gamaze/hicortex
- @gamaze/hicortex/dist/index.js
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (@gamaze/hicortex) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
@gamaze/hicortex — Human-like Memory for Self-Improving AI Agents
Upgrade your agents with Hicortex to automatically capture experiences, feedback, and decisions across all your connected agents. They learn from every session, self-reflect, avoid past mistakes, and improve on their own, overnight and automatically. No configuration needed.
Works with OpenClaw (OC, as an in-process plugin) and Claude Code (CC, via an HTTP/SSE MCP server).
Requirements
- Node.js 18+
- LLM provider (auto-detected from OC config, or ANTHROPIC_API_KEY for CC)
- ~500MB disk for database + embedding model
Install — OpenClaw
openclaw plugins install @gamaze/hicortex
openclaw gateway restart
No configuration needed. The plugin auto-detects your LLM provider from OpenClaw settings on first startup.
Install — Claude Code
npx @gamaze/hicortex init
This detects your environment, installs a persistent MCP server daemon, registers it with Claude Code, and adds /learn and /hicortex-activate commands. Restart CC after setup.
Or manually:
# Start server
npx @gamaze/hicortex server
# Register with CC
claude mcp add hicortex --transport http http://localhost:8787/sse
Configure
Optional config for OC (add to plugin entry in ~/.openclaw/openclaw.json):
| Field | Default | Description |
|---|---|---|
| licenseKey | (none) | License key. Free tier (250 memories) without key. |
| llmBaseUrl | (auto) | Override LLM base URL |
| llmApiKey | (auto) | Override LLM API key |
| llmModel | (auto) | Override model for scoring and distillation |
| reflectModel | (auto) | Override model for nightly reflection |
| consolidateHour | 2 | Hour (0-23, local time) for nightly consolidation |
| dbPath | (auto) | Custom database path |
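As an illustration, a plugin entry using these fields might look like the sketch below. The exact shape of the plugins section in ~/.openclaw/openclaw.json is an assumption about OpenClaw's config layout, and every value is a placeholder:

```json
{
  "plugins": {
    "@gamaze/hicortex": {
      "licenseKey": "hc-xxxx-xxxx",
      "llmModel": "your-scoring-model",
      "consolidateHour": 3,
      "dbPath": "/data/hicortex/hicortex.db"
    }
  }
}
```

All fields are optional; omitting them keeps the auto-detected defaults from the table above.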
For CC, set environment variables: ANTHROPIC_API_KEY (auto-detected), or HICORTEX_LLM_BASE_URL + HICORTEX_LLM_API_KEY + HICORTEX_LLM_MODEL for custom providers.
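For a custom provider, the three override variables can be exported before starting the server. A minimal sketch; the endpoint URL, key, and model name below are placeholders, not defaults:

```shell
# Placeholder values; substitute your provider's OpenAI-compatible
# endpoint, API key, and model identifier.
export HICORTEX_LLM_BASE_URL="https://api.example.com/v1"
export HICORTEX_LLM_API_KEY="sk-placeholder"
export HICORTEX_LLM_MODEL="example-model"
```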
What Happens Automatically
| When | What | How |
|---|---|---|
| Agent start | Recent lessons injected into context | OC: before_agent_start hook / CC: CLAUDE.md block |
| Agent end | Conversation captured and distilled | OC: agent_end hook / CC: nightly transcript scan |
| Nightly | Score importance, reflect, link, decay | In-process consolidation pipeline |
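The nightly pipeline normally runs on the in-process timer (see consolidateHour above); if you prefer to trigger it externally, for example on a machine that sleeps at 2 AM, the nightly CLI command can be scheduled yourself. A hypothetical crontab entry, not part of the official setup:

```
# Run Hicortex consolidation at 07:00 instead of relying on the built-in timer
0 7 * * * npx @gamaze/hicortex nightly
```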
Agent Tools
Available via MCP (both OC and CC):
- hicortex_search — Semantic search across all stored knowledge
- hicortex_context — Get recent decisions and project state
- hicortex_ingest — Store a piece of knowledge directly
- hicortex_lessons — Get actionable lessons from reflection
Skills: /learn to save explicit learnings.
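These tools are invoked by the agent over the standard MCP tools/call method. As an illustrative sketch only, a JSON-RPC request to hicortex_search might look like the following; the argument names query and limit are assumptions, not documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "hicortex_search",
    "arguments": {
      "query": "past decisions about the auth module",
      "limit": 5
    }
  }
}
```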
CLI Commands
npx @gamaze/hicortex server # Start MCP HTTP/SSE server (port 8787)
npx @gamaze/hicortex init # Set up for Claude Code
npx @gamaze/hicortex nightly # Run distill + consolidate + inject
npx @gamaze/hicortex status # Show config, DB stats, adapters
npx @gamaze/hicortex uninstall # Remove CC integration (keeps DB)
Architecture
@gamaze/hicortex (single npm package, dual mode)
├── OpenClaw mode (in-process plugin)
│ ├── before_agent_start → inject lessons
│ ├── agent_end → capture + distill
│ └── registerService → DB, LLM, consolidation timer
│
└── Claude Code mode (persistent HTTP/SSE server)
├── MCP tools → hicortex_search, hicortex_context, hicortex_ingest, hicortex_lessons
├── /health endpoint → monitoring
├── Nightly → scan CC transcripts, distill, consolidate, inject CLAUDE.md
└── Shared DB at ~/.hicortex/hicortex.db
Shared core:
├── SQLite + sqlite-vec + FTS5 (single file)
├── bge-small-en-v1.5 embeddings (ONNX, local CPU)
├── BM25 + vector search with RRF fusion
└── Multi-provider LLM (20+ providers)
Database
Canonical location: ~/.hicortex/hicortex.db. Existing OC installations at ~/.openclaw/data/hicortex.db are automatically migrated on upgrade.
Pricing
| Tier | Price | Memories | Features |
|---|---|---|---|
| Free | $0 | 250 | Full features: search, reflection, lessons, linking |
| Pro | $9/month | Unlimited | Everything in Free, with unlimited memories |
| Lifetime | $149 | Unlimited | Pro forever |
| Team | $29/month | Unlimited | Multi-agent shared memory |
Get a license key at hicortex.gamaze.com.
Uninstall
OpenClaw:
openclaw plugins uninstall hicortex
Claude Code:
npx @gamaze/hicortex uninstall
Your memory database is preserved by default. To remove all data: rm -rf ~/.hicortex
Development
cd packages/openclaw-plugin
npm install
npm run build
npm test
Troubleshooting
Tools not visible to agent (OC): The plugin auto-adds tools to tools.allow on startup. Restart the gateway after install.
LLM auto-config failed: Check logs for [hicortex] WARNING. Add llmBaseUrl to plugin config or set HICORTEX_LLM_BASE_URL env var.
No lessons generated: Reflection requires an LLM. Check that your provider is accessible and has sufficient quota.
First startup slow: The embedding model (~130MB) downloads on first run. Allow up to 2 minutes.
Server won't start (CC): Check ~/.hicortex/server.log for errors. Verify port 8787 is free: lsof -i :8787.
Multiple CC sessions: The HTTP server handles multiple concurrent sessions. Do not use stdio transport — it spawns separate processes per session.