Found 33 results for context-optimization

@kimuson/modular-mcp
Modular MCP - Modular tool access for reducing context overhead
Tool definition compression for OpenAI-compatible LLM APIs. Replaces N tool definitions with 2 meta-tools (search_tools, call_tool) to reduce context window overhead by 70-97%.
Agentic Context Management for the Pi
IANA-registered format (application/vnd.faf+yaml) • Persistent project context • MCP server for Claude Desktop • MIT License
Save up to 80% tokens when AI reads code — MCP server for token-efficient code navigation, AST-aware structural reading instead of dumping full files into context window
MCP server for on-demand agent template registry — search, browse, and spawn specialized AI agents
ReePoe AI Code Manager - Install in any codebase for instant AI agent integration
MCP proxy that reduces Claude context token usage by ~85-96% via lazy tool loading. Works with Claude Code, Claude Desktop, and any MCP client.
Context optimization tools MCP server for AI coding assistants - compatible with GitHub Copilot, Cursor AI, and other MCP-supporting assistants
A TypeScript library for converting JSON to JSON Schema
SHIP MCP Server - continuous AI code reliability scoring + repo quality reports for Claude and Cursor
Context compiler SDK for LLM applications with scoring, selection, and attention-based fusion
Shared memory for multi-agent teams - automation hooks for OpenClaw powered by Snipara
Information-theoretic context optimization for AI coding agents. Zero-friction, pure WebAssembly — no Python dependency.
GitIntel AI — git-native AI adoption tracking, cost intelligence, and context optimization
Lazy-loading MCP proxy for Cursor IDE — start MCP servers on-demand, save GBs of RAM. Zero config: just add to Cursor and restart.
Initialize Snipara MCP + RLM-Runtime with a single command - context optimization, semantic memory, and safe code execution
Precise token counting and context efficiency analysis for MCP servers
MCP Server for Skim - Intelligent code compression for LLM context optimization
Information-theoretic context optimization MCP server for AI coding agents. Bridge package that wraps the Entroly Python engine.
Contextomizer is an ultra-fast, deterministic library for transforming bloated tool outputs, raw API responses, documents, and messy logs into compact, optimized context for AI agents
Snipara integration for OpenClaw - Multi-agent swarms, context optimization, and persistent memory
SHIP Protocol - Success Heuristics for Intelligent Programming. The industry standard for AI coding agent reliability.
One-command Snipara setup for OpenClaw users - context optimization, semantic memory, and agent coordination
Production-ready orchestration framework for AI applications featuring hybrid intent classification, dynamic context optimization, and sequential pipeline architecture
Fast File System Operations and AI Integration for Node.js - Like Cursor's token management
Context-optimized workflow enforcement plugin for Claude Code - ~78% token savings with 14-phase quality gates
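Several of the results above (the modular/lazy-loading MCP proxies) share one mechanism: instead of sending every tool definition to the model, expose two meta-tools — one to search the tool registry, one to dispatch a call by name. A minimal sketch of that pattern, using a hypothetical in-memory registry (not the actual API of any package listed here):

```typescript
// Sketch of the search_tools / call_tool meta-tool pattern.
// All names below (Tool, registry, searchTools, callTool) are illustrative.

type Tool = {
  name: string;
  description: string;
  run: (args: Record<string, unknown>) => unknown;
};

// Hypothetical registry standing in for N real tool definitions.
const registry: Tool[] = [
  { name: "read_file", description: "Read a file from disk", run: () => "contents" },
  { name: "grep", description: "Search file contents for a pattern", run: () => ["match"] },
  { name: "run_tests", description: "Execute the project test suite", run: () => "ok" },
];

// Meta-tool 1: keyword search over names and descriptions. Only
// name + description are returned, a fraction of the tokens a full
// JSON Schema tool definition would occupy in the context window.
function searchTools(query: string): Array<{ name: string; description: string }> {
  const q = query.toLowerCase();
  return registry
    .filter((t) => t.name.includes(q) || t.description.toLowerCase().includes(q))
    .map(({ name, description }) => ({ name, description }));
}

// Meta-tool 2: dispatch a call to a registered tool by name.
function callTool(name: string, args: Record<string, unknown>): unknown {
  const tool = registry.find((t) => t.name === name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.run(args);
}

console.log(searchTools("file")); // matches read_file (name) and grep (description)
console.log(callTool("run_tests", {}));
```

The token saving comes from the registry never being serialized into the prompt: the model sees only the two meta-tool definitions up front and pulls in individual tool descriptions on demand.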