openai-mcp
A pluggable middleware extension for the OpenAI SDK with support for Model Context Protocol (MCP) and multi-provider model routing.
Features
- Multi-provider model routing: Seamlessly use models from OpenAI, Anthropic, Gemini, and others through a unified interface
- MCP dynamic tool orchestration: Connect to Model Context Protocol servers to enable advanced tool usage
- Two-pass completions: Get more accurate and reliable model responses
- Pluggable middleware: Easily extend and customize the behavior
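As an illustration of the routing idea, the provider can be thought of as inferred from the model name. The `providerForModel` helper below is a hypothetical sketch, not part of the library's API, and the real routing logic may differ:

```js
// Hypothetical sketch: infer the provider from a model-name prefix.
function providerForModel(model) {
  if (model.startsWith('claude')) return 'anthropic';
  if (model.startsWith('gemini')) return 'gemini';
  return 'openai'; // e.g. 'gpt-4o'
}
```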
Installation
```bash
npm install openai-mcp
```

Quick Start
```js
import OpenAI from 'openai-mcp';

// Create a client with your API key
const client = new OpenAI({
  apiKey: 'your-api-key-here', // Optional, will be used for all providers
  // Enable plugins
  plugins: ['multiModel', 'mcp'], // multiModel is on by default
  // MCP configuration (optional)
  mcp: {
    serverUrl: 'http://localhost:3000/mcp',
  }
});

// Use like the standard OpenAI SDK
async function main() {
  const response = await client.chat.completions.create({
    model: 'gpt-4o', // or 'claude-3-opus-20240229' or 'gemini-1.5-pro'
    messages: [
      { role: 'user', content: 'Hello, can you help me?' }
    ]
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```

API Keys
You can provide the API key in two ways:
Directly in the constructor:

```js
const client = new OpenAI({
  apiKey: 'your-api-key-here' // Will be used for all providers
});
```
Environment variables (if no apiKey is provided):

```bash
OPENAI_API_KEY=sk-...        # For OpenAI models
ANTHROPIC_API_KEY=sk-ant-... # For Anthropic models
GEMINI_API_KEY=...           # For Gemini models
```
The library will use API keys in this priority:
- The apiKey provided in the constructor for all providers
- Provider-specific environment variables as fallback when no apiKey is provided
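That precedence can be sketched as follows. `resolveApiKey` and `ENV_KEY_BY_PROVIDER` are illustrative names for this sketch, not library exports:

```js
// Illustrative only: mirrors the documented key-resolution order.
const ENV_KEY_BY_PROVIDER = {
  openai: 'OPENAI_API_KEY',
  anthropic: 'ANTHROPIC_API_KEY',
  gemini: 'GEMINI_API_KEY',
};

function resolveApiKey(provider, constructorKey, env = process.env) {
  // 1. An apiKey passed to the constructor wins for every provider
  if (constructorKey) return constructorKey;
  // 2. Otherwise fall back to the provider-specific environment variable
  return env[ENV_KEY_BY_PROVIDER[provider]];
}
```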
MCP Server Configuration
You can connect to MCP servers in several ways:
Single Server URL
```js
const client = new OpenAI({
  mcp: {
    serverUrl: 'http://localhost:3000/mcp' // Connect to a single MCP server
  }
});
```

Multiple Server URLs
You can provide multiple server URLs for redundancy. The client will try each URL in order until it successfully connects:
```js
const client = new OpenAI({
  mcp: {
    serverUrls: [
      'http://primary-server.com/mcp',
      'http://backup-server.com/mcp'
    ]
  }
});
```

Environment Variables
Server URLs can also be configured via environment variables:
```bash
MCP_SERVER_URL=http://localhost:3000/mcp
```

Or multiple URLs separated by commas:

```bash
MCP_SERVER_URLS=http://primary-server.com/mcp,http://backup-server.com/mcp
```

The library checks configuration sources in this order:
- serverUrls option in the constructor
- serverUrl option in the constructor
- MCP_SERVER_URLS environment variable
- MCP_SERVER_URL environment variable
- Default: http://0.0.0.0:3000/mcp
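Taken together, the precedence and connect-in-order behavior could be sketched like this. `resolveServerUrls` and `connectFirstAvailable` are hypothetical helpers for illustration, not part of the library's API:

```js
// Illustrative only: mirrors the documented configuration precedence.
function resolveServerUrls({ serverUrls, serverUrl } = {}, env = process.env) {
  if (serverUrls && serverUrls.length) return serverUrls;
  if (serverUrl) return [serverUrl];
  if (env.MCP_SERVER_URLS) return env.MCP_SERVER_URLS.split(',');
  if (env.MCP_SERVER_URL) return [env.MCP_SERVER_URL];
  return ['http://0.0.0.0:3000/mcp']; // documented default
}

// Try each URL in order and return the first successful connection.
async function connectFirstAvailable(urls, tryConnect) {
  let lastError;
  for (const url of urls) {
    try {
      return await tryConnect(url); // e.g. open an MCP session
    } catch (err) {
      lastError = err; // connection failed; try the next URL
    }
  }
  throw lastError ?? new Error('No MCP server URLs configured');
}
```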
MCP Tool Usage
When connected to a Model Context Protocol server, the library automatically discovers and makes available all tools registered with the server:
```js
const client = new OpenAI({
  apiKey: 'your-api-key-here',
  plugins: ['mcp'],
  mcp: {
    serverUrl: 'http://localhost:3000/mcp',
    maxToolCalls: 10,         // Maximum number of tool calls per request
    toolTimeoutSec: 60,       // Timeout in seconds for each tool call
    disconnectAfterUse: true, // Disconnect after each request
    finalResponseSystemPrompt: 'Provide a helpful answer based on the tool results.' // System prompt for the final response after tool execution
  }
});

// Tools will be automatically discovered and used
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    { role: 'user', content: 'What is the weather like in New York?' }
  ]
});
```

MCP Configuration Options
The MCP plugin supports the following configuration options:
| Option | Description | Default |
|---|---|---|
| `serverUrl` | URL for a single MCP server | `http://0.0.0.0:3000/mcp` |
| `serverUrls` | Array of MCP server URLs to try | - |
| `headers` | Custom headers for MCP server requests | `{}` |
| `maxToolCalls` | Maximum number of tool calls per request | `15` |
| `toolTimeoutSec` | Timeout in seconds for each tool call | `60` |
| `disconnectAfterUse` | Whether to disconnect after each request | `true` |
| `connectionTimeoutMs` | Connection timeout in milliseconds | `5000` |
| `maxMessageGroups` | Maximum message groups to include | `3` |
| `finalResponseSystemPrompt` | System prompt for the final response | "Provide a helpful answer based on the tool results, addressing the user's original question." |
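To make the roles of maxToolCalls and finalResponseSystemPrompt concrete, a bounded tool-call loop might look like the sketch below. `runToolLoop`, `sendToModel`, and `callTool` are hypothetical names for this illustration, not library functions:

```js
// Illustrative only: a bounded loop in the spirit of maxToolCalls.
async function runToolLoop(sendToModel, callTool, messages, maxToolCalls) {
  for (let i = 0; i < maxToolCalls; i++) {
    const reply = await sendToModel(messages);
    if (!reply.toolCall) return reply; // model answered directly
    const result = await callTool(reply.toolCall); // subject to toolTimeoutSec
    messages = [...messages, { role: 'tool', content: result }];
  }
  // Budget exhausted: request a final answer (this is where a
  // finalResponseSystemPrompt-style instruction would be injected).
  return sendToModel(messages);
}
```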
Environment Variables
The library supports the following environment variables:
| Variable | Description | Usage |
|---|---|---|
| `OPENAI_API_KEY` | API key for OpenAI models | Used when no API key is provided and using OpenAI models |
| `ANTHROPIC_API_KEY` | API key for Anthropic models | Used when no API key is provided and using Claude models |
| `GEMINI_API_KEY` | API key for Google Gemini models | Used when no API key is provided and using Gemini models |
| `MCP_SERVER_URL` | Single MCP server URL | Used when no server URL is provided in configuration |
| `MCP_SERVER_URLS` | Comma-separated list of MCP server URLs | Used when no server URLs are provided in configuration |
| `LOG_LEVEL` | Logging level (`debug`, `info`, `warn`, `error`) | Controls verbosity of logs; defaults to `info` |
Custom Plugins
You can create and register your own plugins:
```js
const myPlugin = {
  name: 'myPlugin',
  async handle(params, next) {
    console.log('Request params:', params);
    // Modify params if needed
    const result = await next(params);
    console.log('Response:', result);
    return result;
  }
};
```
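One way to picture how `handle(params, next)` middleware of this shape could be chained; `composePlugins` is an illustrative sketch, not the library's actual dispatcher:

```js
// Illustrative only: compose middleware right-to-left so the first
// plugin in the array runs first and the final handler runs last.
function composePlugins(plugins, finalHandler) {
  return plugins.reduceRight(
    (next, plugin) => (params) => plugin.handle(params, next),
    finalHandler
  );
}
```

Calling the composed function invokes each plugin's `handle` in array order, with each plugin deciding whether and how to call `next`.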
```js
const client = new OpenAI({
  apiKey: 'your-api-key-here',
  plugins: ['multiModel'],
  customPlugins: [myPlugin]
});
```

License
MIT