ai
AI SDK by Vercel - The AI Toolkit for TypeScript and JavaScript
Found 4496 results for llm spark node.js
OpenTelemetry instrumentation for the `openai` OpenAI client library
Use Ollama with the Vercel AI SDK, implementing the official Ollama API. This provider has minimal dependencies and is web-compatible out of the box.
GitHub Action for evaluating MCP server tool calls using LLM-based scoring
Typescript bindings for langchain
Core LangChain.js abstractions and schemas
OpenAI integrations for LangChain.js
Promptbook: Turn your company's scattered knowledge into AI ready books
ServiceNow development with SnowCode - 75+ LLM providers (Claude, GPT, Gemini, Llama, Mistral, DeepSeek, Groq, Ollama) • 395 Optimized Tools • 2 MCP Servers • Multi-agent orchestration • Use ANY AI coding assistant (ML tools moved to Enterprise)
Production-ready AI agent orchestration platform with 66 specialized agents, 213 MCP tools, ReasoningBank learning memory, and autonomous multi-agent swarms. Built by @ruvnet with Claude Agent SDK, neural networks, memory persistence, GitHub integration,
AI SDK v5 provider for Claude via Claude Agent SDK (use Pro/Max subscription)
Anthropic integrations for LangChain.js
Community AI SDK provider for Google Gemini using the official CLI/SDK
A home for your AI agents
Superfast runtime validators with only one line
Mastra is a framework for building AI-powered applications and agents with a modern TypeScript stack.
Use AWS S3, the world's most reliable document storage, as a database with this ORM.
Tree-shakeable static models.dev catalog split by provider for TokenLens.
OpenAPI definitions and converters for 'typia' and 'nestia'.
CLI for Mastra
Environment wrapper that supports all JS environments, including Node, Deno, Bun, edge runtimes, and Cloudflare Workers
Create Mastra apps with one command
Create plain Markdown from OpenAPI documents (for LLMs)
OCR documents using gpt-4o-mini
Old abstractions from LangChain.js
OpenInference Core provides utilities shared by all OpenInference SDK packages.
Official JavaScript library for Tavily.
A lightweight registry of LLM model information, like name and context sizes, for building AI-powered apps.
Core types and registry utilities for TokenLens (model metadata).
Fast token estimation at 94% accuracy of a full tokenizer in a 2kB bundle
Helpers for context windows, usage normalization, compaction, and cost estimation.
Typed client for models.dev to fetch model catalogs with friendly errors.
PostHog Node.js AI integrations
n8n node to interact with the Perplexity AI API
Node.js wrapper for the probe code search tool
The Memory Layer For Your AI Apps
A task management system for ambitious AI-driven development that doesn't overwhelm and confuse Cursor.
The AI SDK for building declarative and composable AI-powered LLM products.
A client for the Phoenix API
LlamaIndex.TS - Data framework for your LLM application.
Use Claude Code without an Anthropic account and route it to another LLM provider
Vercel AI SDK Provider for Ollama using official ollama-js library
A tool to pack repository contents into a single file for AI consumption
AI-powered architecture documentation generator with RAG, hybrid retrieval (semantic + structural), and multi-agent workflows using LangChain
The Retrieval-Augmented Generation (RAG) module contains document processing and embedding utilities.
The OpenRouter TypeScript SDK is a type-safe toolkit for building AI applications with access to 300+ language models through a unified API.
Hardware accelerated language model chats on browsers
LaunchDarkly AI SDK for Server-Side JavaScript
All AI in one agentic coding tool. Ailin¹ Developer Tool - CLI understands your codebase to accelerate your workflow and helps you code, build, refactor, and debug faster — all through natural-language commands.
Use non-Anthropic models with Claude Code by proxying requests through the lemmy unified interface
Enterprise-Class AI Command Line Interface - Primary support for GLM (General Language Model) with multi-provider AI orchestration powered by AutomatosX.
llama.cpp gguf file parser for javascript
OpenInference utilities for ingesting Vercel AI SDK spans
Pure JavaScript minimal lossless JSON parse event streaming, akin to SAX. Fast, modular, and dependency-free.
Generate Markdown versions of Docusaurus HTML pages and an llms.txt index file
Token-Oriented Object Notation (TOON) – Compact, human-readable, schema-aware encoding of JSON for LLM prompts
Tokenizers for WebLLM ([NPM Package](https://www.npmjs.com/package/@mlc-ai/web-tokenizers) | [WebLLM](https://github.com/mlc-ai/web-llm))
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema on the model output on the generation level
Multi-input, multi-output, natural language-driven AI node that automatically generates and executes code based on natural language instructions
A client for node or the browser to generate and consume streaming json
Maxim AI JS SDK. Visit https://getmaxim.ai for more info.
Structured outputs for LLMs
Generates MCP server code from OpenAPI specifications
DOM to Semantic-Markdown for use in LLMs
Official TypeScript/JavaScript SDK for CortexDB - Multi-modal RAG Platform with advanced document processing
JS tokenizer for LLaMA-based LLMs
Provider-agnostic GenAI orchestrator (no provider SDKs) for browser & Node
Fonestar MCP server - Access Fonestar product catalog via Model Context Protocol
High-performance SDK to convert natural language prompts to MongoDB queries using AI (OpenAI GPT or Anthropic Claude)
[DEPRECATED] OpenInference utilities for ingesting Mastra spans - Use @mastra/arize instead
A drop-in replacement for @ai-sdk/react that automatically syncs chat state to Zustand stores
A Javascript Client SDK for the Linkup API
A powerful TypeScript library for building AI agents with multi-threaded conversations, tool execution, and event handling capabilities
Agentic AI utils which work with any LLM and TypeScript AI SDK.
Convert a **stream of tokens** into a **parsable JSON** object before the stream ends; implement **streaming UI** in **LLM**-based AI applications; leverage **OpenAI Function Calling** for early stream processing; parse a **JSON stream** into distinct ...
Flexible memory management node for n8n with support for multiple storage backends
Modern TypeScript MLX serving engine for Apple Silicon with Zod validation and ReScript state management
The OpenAI adapters for nlux, the javascript library for building conversational AI interfaces.
Typescript SDK for AskNews API
nlux is a JavaScript and React library for building conversational AI interfaces, with support for OpenAI, Hugging Face, and more.
The OpenAI adapters for nlux React, the React JS library for building conversational AI interfaces.
nlux React is a library for building conversational AI interfaces, with support for OpenAI, HuggingFace, and more.
Otel registration and convenience methods
Dank Agent Service - Docker-based AI agent orchestration platform
Public types, enums and options used by Llumiverse API.
A NodeJS RAG framework to easily work with LLMs and custom datasets
AI-Native Logging for LLM Agent Development - TypeScript/Node.js Implementation
Node.js Library for Large Language Model LLaMA/RWKV
Solar LLM and Embeddings nodes for n8n
A universal LLM client: provides adapters for various LLM providers that adhere to a universal interface (the OpenAI SDK), allowing you to use providers like Anthropic through the same OpenAI interface, with responses transformed in the same way ...
Create browser automations with an LLM agent and replay them as Playwright scripts.
Themes and CSS files for nlux, the JavaScript and React library for building conversational AI interfaces.
Use MDX to render high quality LLM prompts
Simple caching wrapper for AI SDK tools - cache expensive tool executions with zero configuration
CLI tool for managing vLLM deployments on GPU pods
HAP - Model Context Protocol (MCP) Server by MingdaoCloud
Multi-agent orchestration system built on AI SDK v5 - handoffs, routing, and coordination for any AI provider
Core library using Vercel AI SDK for OpenAssistant
Modular MCP - Modular tool access for reducing context overhead
Complete middleware infrastructure for LLM-based backends with multi-provider support (Ollama, Anthropic, OpenAI, Google)
A robust utility to repair JSON strings - fix malformed or broken JSON, especially from LLM output like ChatGPT.
TypeScript SDK to anonymize PII in LLM prompts. Zero dependencies, HTTPS keep-alive, works with OpenAI/Anthropic/etc.
Ping an AI provider to check if an API key is valid
Docusaurus plugin for generating LLM-friendly documentation following the llmstxt.org standard
One interface to hundreds of LLM models, zero dependencies, tons of features, for Browser and Node.js
Universal AI Development Platform with working MCP integration, multi-provider support, and professional CLI. Built-in tools operational, 58+ external MCP servers discoverable. Connect to filesystem, GitHub, database operations, and more. Build, test, and
Syntax highlighter for nlux
A comprehensive collection of essential and popular Node.js libraries bundled together for easy use in your projects.
JS tokenizer for LLaMA 3
ThinkMem: AI Memory Management MCP System for LLMs
Free Agent MCP - Portable, workspace-agnostic code generation using FREE models (Ollama)
SDK for LLM Gateway plugin development
OpenAI Fastify plugin
Download and parse GitHub Actions CI artifacts and logs for LLM analysis
Fetcher is not just another HTTP client—it's a complete ecosystem designed for modern web development with native LLM streaming API support. Built on the native Fetch API, Fetcher provides an Axios-like experience with powerful features while maintaining
OpenInference utilities for converting OpenTelemetry GenAI span attributes to OpenInference span attributes
Official Halfred.ai NodeJS library
JavaScript and TypeScript client for Gradient AI
A Vitest reporter optimized for LLM parsing with structured, concise output
A "Notion-style" block-based extensible text editor built on top of Prosemirror and Tiptap.
OpenInference instrumentation for AWS Bedrock
All AI in one agentic coding tool. Ailin¹ CLI understands your codebase to accelerate your workflow and helps you code, build, refactor, and debug faster — all through natural-language commands.
MCP server for Florentine.ai - Natural language to MongoDB aggregations
Testing framework for Model Context Protocol (MCP) servers - like Jest but for MCP
TypeScript SDK for Crawl4AI REST API - Bun & Node.js compatible
Node.js API Wrapper for Florentine.ai - Natural language to MongoDB aggregations
JavaScript/TypeScript client for Judgment evaluation platform
MCP server for managing AI_README.md files in projects
Provider-agnostic AI orchestration platform with 20+ specialized agents, persistent memory, and multi-provider routing for Claude Code, Gemini CLI, OpenAI Codex, and ax-cli
Analyze git commits and generate categories, summaries, and descriptions for each commit. Optionally generate a yearly breakdown report of your commit history.
Token-efficient Structured Object Notation – a compact serialization format designed for efficient data exchange with LLMs
The LangServe adapters for nlux React, the React JS library for building conversational AI interfaces.
Turn any code folder into AI-ready text files instantly.
Deep Agents - a library for building controllable AI agents with LangGraph
LLM Gateway - Regex Hidder Plugin
JavaScript SDK for LLM Crafter API - A simple client library for interacting with LLM Crafter platform
The LangChain adapters for nlux, the javascript library for building conversational AI interfaces.
A secure and scalable Git MCP server enabling AI agents to perform comprehensive Git version control operations via STDIO and Streamable HTTP.
LLM Gateway Core - Main proxy server
Build conversational bots that can do anything with the help of ChatGPT and other LLMs.
Metrics collection system for LLMs and AI agents. Tracks performance, latency, and usage metrics for agents, tools, and LLM requests.
A comprehensive toolkit for the Google Gemini API, providing easy-to-use interfaces for text, chat, image, video, audio, and grounding features with all the latest Gemini models.
Production-ready RAG (Retrieval-Augmented Generation) for JavaScript & React - Built on official Ollama & LM Studio SDKs
A powerful MCP server with stdio/HTTP/SSE transport support for converting Markdown files and content to beautifully styled PDFs with Mermaid diagrams and ApexCharts. Features modern typography, multiple page formats, and professional styling.
Apple LLM provider for Vercel AI SDK
Interfaces for extending the embedjs ecosystem
Complete toolkit for building AI applications with the Vercel AI SDK - agents, state management, caching, artifacts, devtools, and memory
The TypeScript library for building AI applications.
AI-powered command-line tool for Snowflake Cortex - Claude 4 Sonnet integration
WorkflowAI JS SDK
AdMesh Backend SDK for Node.js - Subscribe to and weave recommendations into LLM responses
Token-Optimized Notation Language - A text-first, LLM-friendly serialization format with enterprise-level security, high-performance caching, comprehensive validation, streaming, and browser support
Model Context Protocol server for ASON compression/decompression. Compatible with Claude Desktop, Cline, Continue, and other MCP clients.
AI SDK v5 provider for Goose via WebSocket connection
ASON (Aliased Serialization Object Notation) - Token-optimized JSON compression for LLMs. Reduces tokens by 20-60% while maintaining perfect round-trip fidelity.
Multi-agent LiteRAG MCP server for advanced code graph analysis
A Prompt-orchestration pipeline (POP) is a framework for building, running, and experimenting with complex chains of LLM tasks.
Scrape and extract structured data from a webpage using ScrapeGraphAI's APIs. Supports cookies for authentication, infinite scrolling, and pagination.
A CLI for GenAIScript, a generative AI scripting framework.
n8n nodes for IONOS DNS, Domain, SSL/Certificate management, Cloud AI, Cloud Infrastructure, Container Registry, Database as a Service, CDN, VPN Gateway, Activity Log, Billing, Logging, Monitoring, Object Storage Management, Network File Storage, Identity
Enterprise-grade LLM security toolkit for JavaScript/TypeScript with WASM
MCP Server for Shopify API, enabling interaction with store data through GraphQL API
Google Calendar MCP Server with extensive support for calendar management
MCP server that connects AI assistants to Chrome DevTools Protocol for runtime debugging - set breakpoints, inspect variables, monitor network traffic, and automate browser interactions
Universal Tool Calling Protocol SDK
Shared utilities for Octocode MCP packages
Backstage catalog backend module for MCP (Model Context Protocol) entities
CLI tool for creating, running, and testing ADK agents
Websocket client to connect to your chainlit app.
Evaluation library for Model Context Protocol servers
An easy way to run AI models in React Native with ExecuTorch
Server-Sent Events (SSE) support for Fetcher HTTP client with native LLM streaming API support. Enables real-time data streaming and token-by-token LLM response handling.
Type-safe OpenAI API client for Fetcher ecosystem. Provides seamless integration with OpenAI's Chat Completions API, supporting both streaming and non-streaming responses with full TypeScript support.
NPX wrapper for Microsoft's MarkItDown MCP server - run without Docker. Provides the same file conversion capabilities (PDF, Word, Excel, images, etc.) as the original Docker version but with easier setup and direct file system access.
Dual API router (Anthropic + OpenAI compatible) for Claude MAX Plan - Use flat-rate billing with ANY AI tool: OpenAI SDK, LangChain, Anthropic SDK, and more
AniList MCP server for accessing AniList API data
MCP server for Microsoft Fabric Analytics with Synapse-to-Fabric migration - enables LLMs to access, analyze, and migrate workloads to Microsoft Fabric
A Model Context Protocol server for integrating HackMD's note-taking platform with AI assistants.
Unified LLM API with automatic model discovery and provider configuration
OpenAI LLM function schema from OpenAPI (Swagger) document
Meld: A template language for LLM prompts
Get structured, fully typed JSON outputs from OpenAI and Anthropic LLMs
A JSX-based templating engine for generating structured prompts with TypeScript support
AI-era Express replacement with zero-config MCP integration - Build AI-ready APIs in 30 seconds
Convert videos into LLM-friendly input by extracting and deduplicating frames
Fully typed chat APIs for OpenAI and Azure's chat models - with token checking and retries
Prompt Orchestration Markup Language
Guards, Evals & Observability for AI applications - works seamlessly with LangChain/LangGraph
Core functionality for LLM service providers
Node.js/TypeScript MCP server for Atlassian Bitbucket. Enables AI systems (LLMs) to interact with workspaces, repositories, and pull requests via tools (list, get, comment, search). Connects AI directly to version control workflows through the standard MC
OpenAPI Specification TypeScript types for Fetcher - A modern, ultra-lightweight HTTP client for browsers and Node.js. Provides complete TypeScript support with type inference for OpenAPI 3.x schemas.
Mnemosyne MCP: FREE Local Embeddings for Knowledge Graph Memory - No API Keys Required
Edge-native orchestration for AI agents. Built on Cloudflare Workers.
Model Context Protocol (MCP) server for Microsoft Dynamics 365 Business Central via WebUI protocol. Enables AI assistants to interact with BC through the web client protocol, supporting Card, List, and Document pages with full line item support and server
An n8n community node to interact with a (custom) LLM model and knowledge bases on an Open WebUI instance.
Conotion AI CLI – generate technical documentation, API references, and security reports for your projects in seconds.
Light weight JSON Schema $ref resolver. Expands a JSON Schema by resolving `$ref` references from a mapping of definitions. Does not handle remote references. Has comprehensive unit tests and no dependencies.
This package is currently experimental and only meant to be used internally by Mastra at the moment; the APIs are subject to change during this period.
Like React, but for datasets - A declarative, typesafe DSL for building LLM training datasets
A lightweight, multi-provider Node.js library for text tokenization, embedding, and context management
Simple axios interceptor for Jatevo x402 LLM API - Works in terminals, VSCode, and Node.js
JavaScript Base SDK for interacting with the Toolbox service
AI text generation plugin for the CE.SDK editor
Word, PPT and Excel loader for embedjs
CSV loader for embedjs
Load images into embedjs
TypeScript SDK for Agent Diff - test AI agents against replicas of services
Node.js/TypeScript MCP server for Atlassian Confluence. Provides tools enabling AI systems (LLMs) to list/get spaces & pages (content formatted as Markdown) and search via CQL. Connects AI seamlessly to Confluence knowledge bases using the standard MCP in
Juspay Agent Framework - A purely functional agent framework with immutable state and composable tools
Model Context Protocol (MCP) server for Bitbucket Cloud and Server API integration
Node.js/TypeScript MCP server for Atlassian Jira. Equips AI systems (LLMs) with tools to list/get projects, search/get issues (using JQL/ID), and view dev info (commits, PRs). Connects AI capabilities directly into Jira project management and issue tracki
Sitemap recursive loader for embedjs
Useful util functions when extending the embedjs ecosystem
The Vertesia command-line interface (CLI) provides a set of commands to manage and interact with the Vertesia Platform.
AI Multi-Agent library for Javascript Developers.
TypeScript AI utils
Local MCP server for controlling a Commodore 64 via Ultimate 64 REST API
Lightweight library for making AI API calls with streaming support
Next generation LLM evaluation framework powered by Vitest.