JSPM


OpenAI SDK wrapper with support for Model Context Protocol (MCP) and multi-provider model routing via pluggable middleware

Package Exports

  • openai-plugins
  • openai-plugins/index.js

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to file an issue with the original package (openai-plugins) asking for "exports" field support. If that is not possible, create a JSPM override to customize the exports field for this package.
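For reference, an "exports" field covering the two subpaths detected above might look like the following in the package's package.json. This is a sketch only; the actual file layout of openai-plugins may differ:

```json
{
  "name": "openai-plugins",
  "exports": {
    ".": "./index.js",
    "./index.js": "./index.js"
  }
}
```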

Readme

openai-mcp

A pluggable middleware extension for the OpenAI SDK with support for Model Context Protocol (MCP) and multi-provider model routing.


Features

  • Multi-provider model routing: Seamlessly use models from OpenAI, Anthropic, Gemini, and others through a unified interface
  • MCP dynamic tool orchestration: Connect to Model Context Protocol servers to enable advanced tool usage
  • Two-pass completions: Get more accurate and reliable model responses
  • Pluggable middleware: Easily extend and customize request handling with your own plugins

Installation

npm install openai-mcp

Quick Start

import OpenAI from 'openai-mcp';

// Create a client with your API keys
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,         // Required for OpenAI models
  anthropicApiKey: process.env.ANTHROPIC_API_KEY,  // Optional for Claude models
  geminiApiKey: process.env.GEMINI_API_KEY,    // Optional for Gemini models
  
  // Enable plugins
  plugins: ['multiModel', 'mcp'],  // multiModel is on by default
  
  // MCP configuration (optional)
  mcp: {
    serverUrl: 'http://localhost:3000/mcp',
    systemPrompt: 'You are a helpful assistant with tools.'
  }
});

// Use like the standard OpenAI SDK
async function main() {
  const response = await client.chat.completions.create({
    model: 'gpt-4o',  // or 'claude-3-opus-20240229' or 'gemini-1.5-pro'
    messages: [
      { role: 'user', content: 'Hello, can you help me?' }
    ]
  });
  
  console.log(response.choices[0].message.content);
}

main().catch(console.error);

MCP Tool Usage

When connected to a Model Context Protocol server, the library automatically discovers and makes available all tools registered with the server:

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  plugins: ['mcp'],
  mcp: {
    serverUrl: 'http://localhost:3000/mcp',
    maxToolCalls: 10,          // Maximum number of tool calls per request
    toolTimeoutSec: 60,        // Timeout for each tool call
    disconnectAfterUse: true   // Disconnect after each request
  }
});

// Tools will be automatically discovered and used
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    { role: 'user', content: 'What is the weather like in New York?' }
  ]
});

Custom Plugins

You can create and register your own plugins:

const myPlugin = {
  name: 'myPlugin',
  async handle(params, next) {
    console.log('Request params:', params);
    // Modify params if needed
    const result = await next(params);
    console.log('Response:', result);
    return result;
  }
};

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  plugins: ['multiModel'],
  customPlugins: [myPlugin]
});
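The handle(params, next) shape composes like a typical onion-style middleware chain: each plugin can transform the params on the way in and the result on the way out. As a rough self-contained illustration (the runChain helper and both sample plugins below are hypothetical, not part of openai-mcp), the chain can be thought of as:

```javascript
// Hypothetical sketch of how handle(params, next) middleware composes.
// runChain is NOT part of openai-mcp; it only mimics the onion pattern.
function runChain(plugins, finalHandler) {
  return (params) =>
    plugins.reduceRight(
      (next, plugin) => (p) => plugin.handle(p, next),
      finalHandler
    )(params);
}

// Tags each request so later stages can see the plugin ran.
const logger = {
  name: 'logger',
  async handle(params, next) {
    return next({ ...params, logged: true });
  }
};

// Fills in a default model when the caller did not set one.
const defaultModel = {
  name: 'defaultModel',
  async handle(params, next) {
    return next({ model: 'gpt-4o', ...params });
  }
};

// The final handler stands in for the actual API call.
const run = runChain([logger, defaultModel], async (params) => params);
run({ messages: [] }).then(console.log);
```

In this sketch the outermost plugin in the array runs first on the way in and last on the way out, which is the usual convention for middleware stacks.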

License

MIT