codebase-map
A lightweight TypeScript/JavaScript code indexer that generates comprehensive project maps for LLMs
Model Context Protocol (MCP) server for Cerebras Code API integration with any MCP-compatible client. Run 'cerebras-mcp --config' for interactive setup.
An MCP server that provides tools for checking Maven dependency versions
An MCP server for interacting with a Jenkins CI/CD server. Allows you to trigger jobs, check build statuses, and manage your Jenkins instance through MCP.
Model Context Protocol (MCP) server for Google Gemini 2.5 Pro with conversation context management
stream-hooks
Unified XML Referring JSON Parser - Solve LLM JSON generation errors by separating complex content from JSON structure
Security proxy for enterprise integrations for LLMs
A JSX-based templating engine for generating structured prompts with TypeScript support
Advanced routing and transformation system for Claude Code outputs to multiple AI providers
Agent Development Kit for TypeScript with multi-provider LLM support
Enhanced PostgreSQL MCP server with read and write capabilities. Based on @modelcontextprotocol/server-postgres by Anthropic.
NestJS module that auto-exposes your REST API as MCP tools via a single /mcp endpoint.
Universal Model Context Protocol Client - The complete toolkit for MCP
🧬 ModelMix - Unified API for Diverse AI LLMs.
Shadowboxin: Utils for real-time streamed LLM-generated UIs.
A configurable AI chat interface component library
n8n community node for Orq.ai - AI deployment and prompt management platform
Search, create, and retrieve tasks, add comments, and track time through natural language commands.
MCP server wrapper for Google's Gemini CLI
A Node.js utility that quickly identifies files with uncommented code in your codebase. Designed for developers who want to efficiently tell LLMs exactly which files need comments added.
Universal Ollama LLM Bridge for multiple models (Llama, Gemma, etc.)
MCP server for OpenAI Codex CLI integration
Locate JavaScript files with 'const' or 'process.env' usage in LLM-generated codebases
🪨 Bedrock Wrapper is an npm package that simplifies the integration of existing OpenAI-compatible API objects with AWS Bedrock's serverless inference LLMs.
MCP (Model Context Protocol) plugin for majk-chat
A comprehensive Node.js library for AI Agents & LLM Integration, simplifying the creation of intelligent applications.
A powerful JavaScript SDK for creating and managing teams of AI agents with dependency management, parallel/sequential execution, and memory capabilities.
Ultra-type-safe TypeScript toolkit for building AI applications with Anthropic Claude
Evaluation framework with Effect-based architecture for running parallel AI evaluations
Convert between markdown and LLM-friendly pseudo-XML
Comprehensive MCP Server for Plane with 76 project management tools including custom properties, sub-issues, relations, and transfer operations
Experimental ModelFusion features
The 4 fundamental AI operations for building intelligent applications
client library for asterai.io
Payloop JavaScript SDK
MCP server for Sequential Thinking Tools
Model Context Protocol (MCP) server for interacting with Node-RED
n8n community node for Future AGI prompt management with async logging, evaluation, and content protection capabilities
Use non-Anthropic models with Claude Code by proxying requests through the lemmy unified interface
Clean, minimal React components for LLM interfaces - 95% code reduction
Typescript bindings for langchain
A lightweight command-line AI code review tool that also provides general-purpose AI capabilities
A WASM-powered library to cap incomplete JSON streams.
Add AI functionality to your flows! This module includes a set of nodes that enable easy communication with Ollama, enriching your projects with intelligent solutions.
An MCP server that provides tools to chunk and reconstruct JSON for translation.
It's time for a paradigm shift. The future of software in plain English, French or Latin
Autonomous AI cost protection that actually works. Real-time budget enforcement with auto-kill prevents runaway LLM costs before they happen. Unlike monitoring tools, AgentGuard stops the bleeding.
A TypeScript library for building dynamic AI-driven workflows with LLM planning and static flow composition using Effect
ClickUp MCP Server for LLM integration
Use Claude Code without an Anthropic account and route it to another LLM provider
The Core Booth system is a modular and extensible framework for building and managing conversational AI agents, referred to as "Booths." It provides a structured way to define the capabilities, context, and tools for different AI-powered conversational flows.
SDK for Hamming Evals Framework
Clueo MCP Server - AI Personality Layer for Model Context Protocol
JavaScript/TypeScript Implementation of LLMLingua-2
LF Widgets - Main components library
Client-side MCP server for LLM clients
Vertesia memory builder CLI
A production-ready Model Context Protocol (MCP) server for semantic memory management
Production-ready LLM API pool manager with load balancing, failover, and dynamic configuration
LLM plugin for Bowl PayloadCms plugin
The AI Model Orchestrator - Intelligent multi-model workflows with device-locked licensing
A template for LLM development integrating AI tools, TypeScript, Zod validation, and development utilities like Vitest and Rslib.
Enhanced MCP server for generating XMind mind maps - Independent version
Model Context Protocol server for Aki UI component library
Lightweight, composable agents for AI applications
Probe CLI binary for linux-x64
MCP Tool to operate and integrate MongoDB Atlas projects into an AI developed project
OpenTelemetry exporter for VoltAgent observability with Vercel AI SDK
Probe CLI binary for darwin-x64
Next-generation AI-native JSON-LD schema utility with LLM optimization, intelligent content analysis, and advanced performance monitoring. Zero dependencies.
AI Memory Management MCP Server - Intelligent context storage with response size limiting and pagination
This package helps you summarize PDFs using Gemini Nano at the edge or in the browser, making it compliance-safe, faster, and free
Model Context Protocol server for Snowflake database integration
Libraries and a server for building AI applications, with adapters to various native bindings for local inference. Integrate it with your application, or use it as a microservice.
Anosys.ai package for OpenAI client tracing
MCP server and CLI tool for web search by OpenAI o3 model
Probe CLI binary for linux-arm64
Testing framework for Model Context Protocol (MCP) servers - like Jest but for MCP
A system for transforming AI interactions from assumption-based guesswork to systematic, evidence-driven excellence.
Create a custom LLM project from scratch - like create-react-app for language models
NPX wrapper for Microsoft's MarkItDown MCP server - run without Docker. Provides the same file conversion capabilities (PDF, Word, Excel, images, etc.) as the original Docker version but with easier setup and direct file system access.
Interactive CLI tool and MCP server for managing MCP configurations with tech stack detection and recommendations
MCP server for managing Unleash feature flags across multiple instances and environments
Deobfuscate JavaScript code using LLMs ("AI")
OpenInference instrumentation for AWS Bedrock
AI integration suite for any front-end framework, with support for multiple model providers, chat panels, Function Calling, and more.
Dynamic MCP server toolkit for runtime toolset management with Fastify transport and meta-tools
Advanced LLM-powered JavaScript unminifier and deobfuscator
Use Claude Code without an Anthropic account and route it to another LLM provider
TypeScript SDK for tracing LLM, RAG, and agent applications with Noveum
MCP AI invocation tool supporting OpenAI-compatible interfaces, allowing large models to call external AI services via MCP
Probe CLI binary for darwin-arm64
Analyze a folder and generate a compact Markdown context for LLMs or code review (library + CLI).
MCP server providing financial market data from finmap.org
Model Context Protocol (MCP) server implementation for Cloudflare Workers
Minimalist local LLM chat interface using Ollama
A local MCP stdio server that provides access to the OpenAI (ChatGPT) API with web search capabilities for Claude Code and other MCP clients.
AI CLI wrapper for multiple LLM providers
Transform entire codebases into AI-ready context. Perfect for LLM prompting, code documentation, and project analysis.
Generate professional work reports from git commits using AI (OpenAI, Claude, Gemini, OpenRouter). Includes Slack integration.
TypeScript client library for AI Search API - search and retrieve intelligent responses with context awareness
An AI agent/chatbot with built-in tools and MCP support that lives in your terminal
Chat and LLM API integration library for AITuber OnAir
Modern MCP server for HeyReach LinkedIn automation with dual transport support (stdio + HTTP streaming) and header authentication
Observee SDK - A TypeScript SDK for MCP tool integration with LLM providers
Prototype MCP Server for CLL
AI assistant MCP server with web search and code execution capabilities
An intelligent CLI tool that uses AI to analyze git commits and generate a clear, human-readable report.
Qualcomm lib Genie binding for React Native
A package to format chat conversations for large language models (LLMs) with Nunjucks.
Use Claude Code without an Anthropic account and route it to another LLM provider
HOC and hook to use the LLMAsAService.io LLM load balancer and firewall
Official SDK for SLNG.AI Voice API - Text-to-Speech, Speech-to-Text, and LLM services
AI-powered QA agent using LLM models for automated testing and web interaction
Complete GraphRAG integration with both AI Agent and Tool Node - N8N Tools proprietary GraphRAG implementation for document processing, knowledge graphs, and intelligent search with external database support (Pinecone, Neo4j, Weaviate, etc.)
Browser automation powered by LLMs in JavaScript
LF Widgets - Showcase component and documentation
AI CLI tool that generates Cypress test scripts from natural language scenarios and a URL
NAPI-RS bindings for BodhiApp server with integrated tests
Creates a new LLM-based web app project that has a "local-first" sensibility.
Provides a complete set of AI dialogue solutions for your development board, including but not limited to the IAT+LLM+TTS integration solution for the ESP32 series of development boards.
Provides a universal API to LLMs. Support for existing LLMs can be added by writing a driver.
Zero-dependency, modular SDK for building robust natural language applications
AI-powered research assistant with web search capabilities and beautiful terminal UI
Initialize workflow agent package
Genetic-Pareto prompt optimizer to evolve system prompts from a few rollouts with modular support and intelligent crossover
Apple LLM provider for Vercel AI SDK
This folder contains the source code and emcc bindings for compiling XGrammar to JavaScript/TypeScript via [emscripten](https://emscripten.org/).
OpenInference instrumentation for AWS Bedrock Agent Runtime
Unified AI-powered CLI toolkit with enterprise security, TDD, multi-provider AI, and production automation
An MCP server that allows AI tools to interact with Claude Code programmatically with session continuity and async execution support.
Enable usage of OpenAI models with embedjs
React helpers for Composable AI
Intelligent LLM model selection with hybrid AI + deterministic scoring + Live Vellum scraping
An efficient task manager. Designed to minimize tool confusion and maximize LLM budget efficiency while providing powerful search, filtering, and organization capabilities across multiple file formats (Markdown, JSON, YAML)
World-mediated agent management system with clean API surface
Vite plugin for HTML to Markdown conversion with on-demand generation
Mixer LLM Plugin module of the BOM Repository
GraphQL MCP server for AI assistants
Streaming-first structured data extraction from LLMs with real-time updates
TypeScript implementation of MLflow Tracing SDK for LLM observability
The Hugging Face adapters for nlux, the JavaScript library for building conversational AI interfaces.
Cross-platform probe code search tool (meta package)
LLM module of the BOM Repository
GraphZep: A temporal knowledge graph memory system for AI agents based on the Zep paper
Web page loader for embedjs
An MCP server implementing a scratchpad for LLM inner monologue
Simple utility to format MCP tool errors like Cursor
An MCP server that provides access to MariaDB or MySQL databases.
Convert markdown documentation to LLM-friendly format
Collects the project structure and file contents into Markdown for building context prompts.
Secure CLI tool that translates natural language to shell commands using local AI models via Ollama, with project memory system, reusable command templates (hooks), MCP (Model Context Protocol) support, and dangerous command detection
CLI tool to convert codebases into formatted strings for LLMs
Janus - An intelligent OpenAPI MCP server for token-optimized API exploration
A lightweight Vue.js composable for type-safe AI integrations using Microsoft's TypeChat.
Model Context Protocol server for Token Metrics API - provides comprehensive cryptocurrency data, analytics, and insights
MCP server for MQTT-PLC communication with real-time industrial PLC data collection and control
Sync your local Task Master tasks to Notion, enabling powerful Kanban, timeline, and calendar views. Fork of claude-task-master with Notion integration.
Generates a Markdown representation of a project's file structure for LLM context.
A compiler for turning markdown prompts into TypeScript modules.
Build LangGraph agents with large numbers of tools
Complete NestJS implementation of the Model Context Protocol (MCP) server with all core types and utilities
The Hugging Face adapters for nlux React, the React JS library for building conversational AI interfaces.
A fork of task-master-ai with various improvements that make it more configurable and robust.
The ultimate prompt-driven, component-based, AI-powered, vibe-oriented programming language.
A framework for connecting your data to large language models
Probe CLI binary for win32-x64
Promptbook: Run AI apps in plain human language across multiple models and platforms
PDF loader for embedjs
Use Claude Code without an Anthropic account and route it to another LLM provider
A Model Context Protocol (MCP) server that provides tools for interacting with Prometheus monitoring systems
Calculate prices for calling LLM inference APIs
tune - LLM chat in a text file
MCP (Model Context Protocol) bundle server for Coherence Gateway - enables LLMs to interact with domain-specific OpenAPI tool bundles
Let users pick their OpenAI compatible API provider (e.g. OpenRouter, Ollama) via a Bootstrap modal
Advanced routing and transformation system for Claude Code outputs to multiple AI providers
MCP server for local SQLite database operations
n8n node for Perplexity AI API integration
SDK for AgentOrc AI agent orchestration platform
Enables agents to quickly find and edit code in a codebase with surgical precision. Find symbols, edit them everywhere
The Memory Layer For Your AI Apps
Fast screenshot capture tool for web pages - optimized for Claude Vision API
A comprehensive playground for testing AI agents and models with support for multiple LLM providers
A complete TypeScript framework for building LLM applications with agent support and MCP integration
A Model Context Protocol (MCP) server that provides intelligent web reading capabilities using the Jina AI Reader API. It extracts clean, LLM-ready content from any URL.
The Vertesia command-line interface (CLI) provides a set of commands to manage and interact with the Vertesia Platform.
n8n nodes for Kaia LLM integration
Get it together, and organize your prompts.
Automated orchestration system for managing Claude Code instances to complete complex tasks
MCP server to send messages and embeds to Discord webhooks via stdio.
LLM interface for Agenite
Astro Integration: llms.txt generator (AI-optimized summary of all HTML pages)
NAPI-RS bindings for BodhiApp server with integrated tests
AI-powered code intelligence CLI for code security, analysis, and review
OKX MCP (Model Context Protocol) server providing trading and portfolio management tools
Core types and utilities for NestJS MCP packages
A client library for sending context data to a context inspector server for debugging and comparison
TypeScript/JavaScript client for OpenRouter API with history management and tool calls support
Analyze git commits and generate categories, summaries, and descriptions for each commit. Optionally generate a yearly breakdown report of your commit history.
MCP server wrapper for OpenAI's CodeX CLI
Mixer LLM module of the BOM Repository
Generate LLM-optimized documentation for React and Stencil components
Run AI models anywhere.
MCP server for Unsplash API integration that provides tools to search and retrieve images
An MCP server providing Google Search, web scraping, and Gemini AI analysis tools to empower AI assistants with research capabilities.
Run AI inference in Unity Engine.
Model Context Protocol (MCP) server implementation for Codebolt
A universal template-based prompt management system for LLM applications
Juspay Agent Framework - A purely functional agent framework with immutable state and composable tools
SDK for integrating with AI Providers plugin
Use Large Language Models in Your React Apps
A lightweight TypeScript implementation of LangChain with cost optimization features
Astro integration to automatically generate AI-friendly documentation files: /llms.txt, /llms-small.txt, and /llms-full.txt
N8N nodes for AI APIs including Gemini, Claude, GPT, and more
Process and Watch JSON Stream
Semantic Web Memory for Intelligent Agents
Word, PPT and Excel loader for embedjs
VitePress plugin for generating AI knowledge files for LLMs
A library for running evaluations for AI use cases
Highly customizable chat widget for web applications, supporting any backend or LLM via a custom provider.
High-efficiency YouTube MCP server: Get token-optimized, structured data for your LLMs using the YouTube Data API v3.
Ollama provider for Agenite
A middleware to pretty print the logs for Agenite
Modern JavaScript SDK for executing Clara AI agent workflows with zero configuration
A general-purpose, secure Model Context Protocol (MCP) server for MySQL databases.
Full-access PostgreSQL server for Model Context Protocol with read/write capabilities and enhanced schema metadata
Runtime helpers for Hashbrown AI
Universal interface for disparate AI vendors
A flexible and extensible Prompt DSL for building AI prompts with templates, variables, chaining, and intelligent expansion.
A platform-agnostic AI agent framework for building autonomous AI agents with tool execution capabilities
A set of tools to work with LLMs and KaibanJS
MCP server for weather data using the National Weather Service API
Anaplian core library for long running AI agent development
MCP tool enabling bidirectional communication between LLMs and humans through clarification requests and real-time thought sharing
Stay Calm and Prompt On (SCAPO) - MCP server for AI/ML best practices
Run Claude Code with the Kimi K2 API
Deterministic LLM contract checks for CI
AI-Powered Content Translation for Strapi
Model Context Protocol server for New Relic observability platform integration
Unofficial Node.js SDK for Prompt Security Protection Service
Model Context Protocol server for xats documents
Make LLM output 'look nicer'
A powerful and flexible class for creating and managing AI dialogues, with support for various language models, tools, and state persistence
OpenServ Agent SDK - Create AI agents easily
MCP server for interacting with CodeRabbit AI reviews on GitHub pull requests. Enables LLMs to analyze, implement, and resolve CodeRabbit suggestions programmatically.
Secure Model Context Protocol (MCP) server for Telegram integration. Runs locally, allows AI agents to read chats and message history, with built-in readonly mode for safety.
Transform LLM Agents into High-Performance Engines with DAG optimization
AI plugin for dovenv
N8N node for AI token tracking and monitoring with sub-workflow execution capabilities