@plur-ai/claw
PLUR memory plugin for OpenClaw — persistent learning across sessions
Found 68 results for context-engine
Graph memory for AI agents — context engine for OpenClaw + MCP server for Claude Code/Claude.ai
Graph-backed persistent memory engine for OpenClaw. Replaces the default context window with SurrealDB + vector embeddings that learn across sessions.
ByteRover context engine plugin for OpenClaw — curates and queries conversation context via brv CLI
Monaco Editor language intelligence engine — 96 languages, 29 providers, 119 themes, 466 CLI command files (448 unique). Completions, hover, definitions, code actions, formatting, symbols, monarch tokenizers, multi-document highlights, AI rename suggestions.
Claude Code context engine — pack, compress, and optimize any codebase for AI. Save 60-90% tokens.
On-device memory layer for AI agents. Claude Code, OpenClaw, and Hermes. Hooks + MCP server + hybrid RAG search.
Your AI agent hive. Local-first. Multi-channel. Open source. Built in Colombia for the world.
MCP server for codebase indexing and semantic search
Maina core engines — Context, Prompt, and Verify for verification-first development
Vexp — Context Engine for AI Coding Agents. Pre-indexes your codebase into a dependency graph and delivers ranked context to any MCP-compatible agent. 58% lower cost per task, 90% fewer tool calls (SWE-bench Verified). Works with Claude Code, Cursor, and Copilot.
Cloud-based context engine for token-efficient memory management (backed by Tencent COS CI)
Agent-centric memory and context composition engine for OpenClaw
FatHippo context engine for OpenClaw - encrypted agent memory
One-command setup for Hippocortex OpenClaw plugin with Context Engine
Deterministic semantic memory for LLMs - local-first, graph traversal, <1GB RAM
Context Engine — Web UI. Installs and runs the dashboard on your machine.
MemClaw Context Engine - Native context management for OpenClaw with automatic recall and capture
Pentatonic Memory plugin for OpenClaw — persistent, searchable memory with multi-signal retrieval and HyDE query expansion
HyperCompositor — context engine plugin for OpenClaw
OpenClaw ContextEngine plugin for the Sonzai Mind Layer — gives OpenClaw agents hierarchical memory, personality evolution, mood tracking, and fact extraction
EverOS OpenClaw Plugin — persistent memory through natural conversation
Universal AI Workflow Bootstrapper — Scaffold the 4-Round Interactive Wizard for any project with Jira, Confluence & AI Context Engine integration.
MCP server for Context Engine API - enables Claude Code and Cursor integration
Context Engine — MCP server installer for Claude Code and Claude Desktop.
LinkMind Context Engine Plugin for OpenClaw
TypeScript client for the Context Engine API
Experimental NovaSpine context-engine integration for OpenClaw
A lightweight context engine for AI agents. Ingest events, build semantic context, query with natural language. Zero config default with SQLite + local TF-IDF embeddings.
OpenClaw plugin providing context-archive memory and epistemic trust monitoring for AI agents
Native bridge between ByteRover context engine and Claude Code — enriches Claude's auto-memory with BM25-ranked knowledge from brv context tree
RLM-native ContextEngine plugin for OpenClaw — zero information loss, massive cost reduction
Shared context engine for TLC — tiered loading, hybrid search, memory extraction, multi-agent sync
MCP server exposing Auggie CLI for codebase context retrieval
Loom cognitive memory system as an OpenClaw Context Engine plugin — connects to a Loom Python backend for structured schema-based long-term memory
Token-efficient web browsing for AI agents. Your agent needs 500 tokens of content, not 50,000 tokens of HTML.
Knowledge graph-powered context engine for Obsidian vaults (MCP server)
Memory sovereignty plugin - transparent, editable, Git-friendly memory management for OpenClaw (supports one-command setup via /memory-setup)
Context Engine — MCP server. Gives persistent memory and semantic search to AI assistants.
Lia-style context engine for OpenClaw — structured compaction, auto-flush, auto-retrieval
RAG Systems Were a Mistake - Replace vector databases with 0.3ms mathematically optimal context selection
OpenClaw context-engine plugin that injects writing style rules (Chicago, AP, MLA) into agent system prompts
Graphiti knowledge graph context engine for OpenClaw
Context engine for AI coding agents — parse, graph, and serve your codebase via MCP
ContextEngine MCP Server - Up-to-date documentation and code examples for any library
RecallForge — secret-safe memory and context engine plugin for OpenClaw. Fills both memory and contextEngine slots with tiered semantic recall, 30+ pattern credential redaction, and prompt injection protection.
ourmem persistent memory plugin for OpenClaw — Memory slot + ContextEngine with 7 lifecycle hooks
On-device context engine and memory for AI agents. Claude Code and OpenClaw. Hooks + MCP server + hybrid RAG search.
ContextEngine plugin for OpenClaw with retrieval-augmented context management and memory-aware compaction
An intelligent context engine for AI-assisted software development
Robinson's Context Engine - Production-grade hybrid search with vector similarity, language-aware ranking, and intelligent file filtering
Local-first context intelligence layer for AI agents and code workflows
Local-first code search, graph analysis, and MCP context engine. No cloud, no telemetry.
MCP server for Obsidian GraphRAG, agent-ready context, preview-only planning, and safe repo handoffs
🍋 Stale content eviction for OpenClaw — squeeze images, tool results, and exec outputs from context. Compaction fires 2-3x less often.
Context Engine MCP Server - Persistent memory and context management for AI coding tools
Context Engine CLI - Compress the Chaos
Context-aware semantic search MCP server for OpenCode and other MCP clients
Context intelligence for AI agents - Ingest, search, and subscribe to shared context
MCP server implementation using Auggie SDK for local context engine with planning and execution features
CLI tool for working across apps while preserving context and memory, helping you save time and effort
Ndream CLI - context engine for vibe coding
OpenClaw memory and context-engine integration for Agentic Memory
ctxloom — The Universal Code Context Engine. A local-first MCP server providing intelligent code context via hybrid Vector + AST + Graph search with Skeletonization (70-90% token reduction).
The second brain that follows you across every agent. Importance-aware memory for Claude Code, Codex, and MCP-compatible tools. (Formerly squeeze-claw.)