JSPM

  • Downloads 907
  • License MIT

The most complete MCP server for API testing. 42 tools: HTTP requests, assertions, request flows, OpenAPI import, mock data, load testing, collections, environments, native export/import, Postman import/export, cURL export, response diffing. Zero config, zero dependencies.

Package Exports

  • @cocaxcode/api-testing-mcp
  • @cocaxcode/api-testing-mcp/dist/server.js
  • @cocaxcode/api-testing-mcp/server

Readme

@cocaxcode/api-testing-mcp

The most complete MCP server for API testing. Period.
42 MCP tools · Zero config · Works in any MCP client


Overview · Just Talk to It · Installation · Features · Tool Reference · Storage · Architecture


Quick Overview

The most complete MCP server for API testing — 42 tools, zero config, nothing else comes close. This is not just a request sender. It is a full testing workbench: HTTP requests with assertions, multi-step flows with variable extraction, OpenAPI import with schema-aware mock data, load testing with percentile metrics, response diffing across environments, bulk test runners, reusable collections, environment groups with directory scoping and persistent defaults, Postman import/export, and cURL export. All from natural conversation. No accounts, no cloud, no generated files. Everything runs inline and stores as plain JSON you own.


Just Talk to It

You don't need to learn tool names or parameters. Describe what you want and the AI picks the right tool.

"Create a group called my-project and add this directory as scope"
"Set up a dev environment with BASE_URL http://localhost:3000"
"Switch to prod for this session"
"Set dev as the default environment"
"Import my API spec from /api-docs-json"
"Show me all user endpoints"
"GET /users"
"Create a user with random data"
"Verify that DELETE /users/5 returns 204"
"Login as admin, extract the token, then fetch dashboard stats"
"How fast is /health with 50 concurrent requests?"
"Run all my saved smoke tests"
"Compare the users endpoint between dev and prod"
"Export the create-user request as curl"
"Export my collection to Postman"

If you've imported an OpenAPI spec, the AI already knows every endpoint, every required field, every valid enum value. When you say "create a blog post", it reads the schema and builds the request correctly — no guessing.


Installation

Claude Code

claude mcp add --scope user api-testing -- npx -y @cocaxcode/api-testing-mcp@latest

Claude Desktop

Add to your config file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "api-testing": {
      "command": "npx",
      "args": ["-y", "@cocaxcode/api-testing-mcp@latest"]
    }
  }
}

Cursor / Windsurf

Add to .cursor/mcp.json or .windsurf/mcp.json in your project root:

{
  "mcpServers": {
    "api-testing": {
      "command": "npx",
      "args": ["-y", "@cocaxcode/api-testing-mcp@latest"]
    }
  }
}

VS Code / Codex CLI / Gemini CLI

VS Code — add to .vscode/mcp.json:

{
  "servers": {
    "api-testing": {
      "command": "npx",
      "args": ["-y", "@cocaxcode/api-testing-mcp@latest"]
    }
  }
}

Codex CLI (OpenAI):

codex mcp add api-testing -- npx -y @cocaxcode/api-testing-mcp@latest

Or add to ~/.codex/config.toml:

[mcp_servers.api-testing]
command = "npx"
args = ["-y", "@cocaxcode/api-testing-mcp@latest"]

Gemini CLI — add to ~/.gemini/settings.json:

{
  "mcpServers": {
    "api-testing": {
      "command": "npx",
      "args": ["-y", "@cocaxcode/api-testing-mcp@latest"]
    }
  }
}

Quick Start

Once installed, set up an environment so relative paths resolve automatically:

"Create an environment called dev with BASE_URL http://localhost:3000"

If your API has a Swagger/OpenAPI spec, import it:

"Import my API spec from http://localhost:3000/api-docs-json"

Verify with: "List my environments" — you should see the one you just created.


Features

HTTP Requests

Send any HTTP method with headers, query params, JSON body, auth, and {{variable}} interpolation. Relative URLs auto-resolve against BASE_URL.

"POST to /api/users with name Jane and email jane@company.com using my bearer token"

Supports: GET, POST, PUT, PATCH, DELETE, HEAD, OPTIONS — Bearer / API Key / Basic auth — custom timeouts.
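The resolution of relative paths against BASE_URL can be sketched with the WHATWG URL API. `resolveUrl` is a hypothetical helper for illustration; the package's actual logic lives in lib/url.ts and may differ.

```typescript
// Sketch: resolve a possibly-relative URL against an environment's BASE_URL.
function resolveUrl(url: string, baseUrl?: string): string {
  // Absolute URLs pass through untouched.
  if (/^https?:\/\//i.test(url)) return url;
  if (!baseUrl) throw new Error(`Relative URL "${url}" but no BASE_URL is set`);
  // The URL constructor handles slash joining against the base.
  return new URL(url, baseUrl).toString();
}
```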

Assertions

Validate responses with structured pass/fail results:

"Verify that GET /api/health returns 200, body.status is ok, and responds in under 500ms"
PASS — 3/3 assertions passed
  status === 200
  body.status === "ok"
  timing.total_ms < 500

10 operators: eq, neq, gt, gte, lt, lte, contains, not_contains, exists, type
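The operator names above come from the README; a minimal dispatcher for them might look like the following. This is an illustrative sketch, not the package's implementation.

```typescript
// Sketch: evaluating one assertion operator against an extracted response value.
type Op = "eq" | "neq" | "gt" | "gte" | "lt" | "lte"
        | "contains" | "not_contains" | "exists" | "type";

function evaluate(actual: unknown, op: Op, expected?: unknown): boolean {
  switch (op) {
    case "eq":           return actual === expected;
    case "neq":          return actual !== expected;
    case "gt":           return (actual as number) > (expected as number);
    case "gte":          return (actual as number) >= (expected as number);
    case "lt":           return (actual as number) < (expected as number);
    case "lte":          return (actual as number) <= (expected as number);
    case "contains":     return String(actual).includes(String(expected));
    case "not_contains": return !String(actual).includes(String(expected));
    case "exists":       return actual !== undefined && actual !== null;
    case "type":         return typeof actual === expected; // e.g. "string", "number"
  }
}
```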

Request Flows

Chain requests with variable extraction between steps. Perfect for auth flows and CRUD sequences.

"Login as admin@test.com, extract the access token, then use it to fetch all users"
What the tool executes
flow_run({
  steps: [
    {
      name: "login",
      method: "POST",
      url: "/auth/login",
      body: { email: "admin@test.com", password: "SecurePass#99" },
      extract: { "TOKEN": "body.access_token" }
    },
    {
      name: "get-users",
      method: "GET",
      url: "/api/users",
      headers: { "Authorization": "Bearer {{TOKEN}}" }
    }
  ]
})

OpenAPI Import

Import specs from a URL or local file (JSON and YAML). Once imported, the AI knows every endpoint, parameter, and schema.

"Import my API spec from http://localhost:3000/api-docs-json"
"Import the spec from ./openapi.yaml"
"What parameters does POST /users expect?"

Supports OpenAPI 3.x with full $ref resolution, including allOf, oneOf, and anyOf. OpenAPI 2.0 is partially supported.

Mock Data Generation

Generate realistic fake data from your OpenAPI schemas. Respects types, formats (email, uuid, date-time), enums, and required fields.

"Generate mock data for creating a user"
{
  "email": "user42@example.com",
  "name": "Test User 73",
  "password": "TestPass123!",
  "role": "admin"
}
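Schema-aware generation of this kind can be sketched as a recursive walk over a JSON-Schema-like shape. `mockValue` and `MiniSchema` are hypothetical names; the package's real generator also handles arrays, uuid formats, and required-field filtering.

```typescript
// Sketch: generate a mock value from a minimal JSON-Schema-like object,
// respecting type, format, and enum — an assumption-level illustration.
interface MiniSchema {
  type?: string;
  format?: string;
  enum?: unknown[];
  properties?: Record<string, MiniSchema>;
}

function mockValue(schema: MiniSchema, key = "field"): unknown {
  if (schema.enum?.length) return schema.enum[0]; // always a valid enum member
  switch (schema.type) {
    case "string":
      if (schema.format === "email") return `${key}42@example.com`;
      if (schema.format === "date-time") return new Date(0).toISOString();
      return `Test ${key} 73`;
    case "integer":
    case "number": return 42;
    case "boolean": return true;
    case "object": {
      const out: Record<string, unknown> = {};
      for (const [k, s] of Object.entries(schema.properties ?? {})) {
        out[k] = mockValue(s, k);
      }
      return out;
    }
    default: return null;
  }
}
```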

Load Testing

Fire N concurrent requests and get performance metrics:

"How fast is the health endpoint with 50 concurrent requests?"
LOAD TEST — GET /api/health
Requests:    50 concurrent
Successful:  50 | Failed: 0
Req/sec:     23.31

  Min: 45ms | Avg: 187ms
  p50: 156ms | p95: 412ms | p99: 523ms
  Max: 567ms
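The p50/p95/p99 figures above can be computed from recorded latencies with the nearest-rank method. The package's load tester may use a different interpolation; this just shows the idea.

```typescript
// Sketch: nearest-rank percentile over an ascending-sorted latency array.
function percentile(sortedMs: number[], p: number): number {
  const rank = Math.ceil((p / 100) * sortedMs.length);
  return sortedMs[Math.max(0, rank - 1)];
}

// Example latencies (ms), already sorted ascending.
const latencies = [45, 92, 120, 156, 210, 320, 412, 523, 567];
const p50 = percentile(latencies, 50);
const p95 = percentile(latencies, 95);
```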

Response Diffing

Execute two requests and compare their responses field by field. Detect regressions or compare environments.

"Compare the users endpoint between dev and prod"

Bulk Testing

Run every saved request in a collection (or filter by tag) and get a summary:

"Run all my saved smoke tests"
BULK TEST — 8/8 passed | 1.2s total
  health       — GET  /health      → 200 (45ms)
  list-users   — GET  /users       → 200 (123ms)
  create-post  — POST /blog        → 201 (89ms)
  login        — POST /auth/login  → 200 (156ms)

Collections

Save requests for reuse with tags. Build regression suites.

"Save this request as create-user with tags auth, smoke"
"List all requests tagged smoke"

Environments

Environments hold your variables — BASE_URL, tokens, API keys — and keep them separated by context. The system has three core concepts:

Group. A group organizes environments and binds them to directories. A group has one or more scopes (directories) that share its environments, and exactly one default environment. When you create an environment inside a group, it belongs to that group. When you cd into a directory that is a scope of a group, its environments become available automatically.

Default. The default environment activates automatically when you enter a scope of its group. It persists between sessions — restart your editor, reopen your terminal, and the default is still there. Set it once and forget about it.

Active. The active environment is what is being used right now for variable resolution. It starts as the default when you enter a scope, but you can switch it at any time. The active selection is session-only — it resets to the default on restart.

Global environments (not associated with any group) still exist. They require explicit activation with env_switch and do not persist between sessions.

Practical example:

"Create a group called my-api"
"Add this directory as scope to my-api"
"Create a dev environment with BASE_URL http://localhost:3000"   <- auto-joins group, auto-default
"Create a prod environment with BASE_URL https://api.example.com"
"List environments"                                              <- shows dev (active, default) and prod
"Switch to prod"                                                 <- session only
"Set prod as default"                                            <- persists

Automatic interpolation. Any {{variable}} in URLs, headers, query params, or request bodies is resolved against the active environment before the request fires. Set BASE_URL once and every relative path just works.
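The placeholder substitution can be sketched as a single regex replace against the active environment's variables. `interpolate` is an illustrative name; the package's resolver lives in lib/interpolation.ts.

```typescript
// Sketch: resolve {{variable}} placeholders against environment variables,
// leaving unknown placeholders untouched so failures are visible.
function interpolate(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name: string) =>
    name in vars ? vars[name] : match
  );
}
```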

Your credentials never leave your machine. Environment files are plain JSON stored in ~/.api-testing/. Nothing syncs to any cloud. Nothing gets embedded in exports. Nothing gets tracked by git. Your tokens and secrets stay exactly where they should: on your disk, under your control.

Postman Import & Export

Bidirectional Postman support. Migrate seamlessly between Postman and your AI workflow.

"Import my Postman collection from ./exported.postman_collection.json"
"Export my collection to Postman"
"Export the dev environment for Postman"
Import details

Collection: Postman v2.1 format. Folders become tags. Auth inherited from folders/collection level. Supports raw JSON, x-www-form-urlencoded, form-data bodies.

Environment: Prefers currentValue over value. Skips disabled variables. Optional activate flag.

Export details

Collection: Requests grouped in folders by tag. Auth mapped to Postman's native format. {{variables}} preserved as-is.

Environment: All variables exported as enabled: true in Postman-compatible format.

Native Export & Import

Export collections and environments to a portable .atm/ folder. Share with your team or copy between projects.

"Export my collection and dev environment"
your-project/
└── .atm/
    ├── collection.json
    └── dev.env.json

Note: .atm/ is automatically added to .gitignore on first export.

cURL Export

Convert any saved request into a ready-to-paste cURL command with resolved variables.

"Export the create-user request as curl"
curl -X POST \
  'https://api.example.com/users' \
  -H 'Authorization: Bearer eyJhbGci...' \
  -H 'Content-Type: application/json' \
  -d '{"name":"Jane","email":"jane@company.com"}'
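Assembling a command like the one above is mostly string building once variables are resolved. `toCurl` and `SavedRequest` are hypothetical names for illustration; the real export_curl tool may quote and escape differently.

```typescript
// Sketch: build a cURL command from a saved request whose {{variables}}
// have already been resolved.
interface SavedRequest {
  method: string;
  url: string;
  headers?: Record<string, string>;
  body?: unknown;
}

function toCurl(req: SavedRequest): string {
  const parts = [`curl -X ${req.method}`, `'${req.url}'`];
  for (const [k, v] of Object.entries(req.headers ?? {})) {
    parts.push(`-H '${k}: ${v}'`);
  }
  if (req.body !== undefined) parts.push(`-d '${JSON.stringify(req.body)}'`);
  return parts.join(" \\\n  "); // one flag per line, backslash-continued
}
```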

Tool Reference

42 tools across 9 categories:

| Category | Tools | Count |
|---|---|---|
| Requests | request | 1 |
| Testing | assert | 1 |
| Flows | flow_run | 1 |
| Collections | collection_save, collection_list, collection_get, collection_delete | 4 |
| Environments | env_create, env_list, env_set, env_get, env_switch, env_rename, env_delete, env_spec | 8 |
| Groups | env_group_create, env_group_list, env_group_delete, env_group_add_scope, env_group_remove_scope, env_set_default, env_set_group | 7 |
| API Specs | api_import, api_spec_list, api_endpoints, api_endpoint_detail | 4 |
| Mock | mock | 1 |
| Utilities | load_test, export_curl, diff_responses, bulk_test, export_collection, import_collection, export_environment, import_environment, export_postman_collection, import_postman_collection, export_postman_environment, import_postman_environment | 12 |

Tip: You don't need to call tools directly. Describe what you want and the AI picks the right one.


Storage

Everything is local. No database, no cloud sync, no telemetry. All data lives in ~/.api-testing/ as plain JSON files you can read, back up, or delete at any time.

~/.api-testing/
├── groups/               # Environment groups with scopes and defaults
├── environments/         # Environment variables — tokens, keys, passwords
├── collections/          # Saved requests (shareable, no secrets)
├── specs/                # Imported OpenAPI specs
└── project-envs.json     # Session-only active environments (cleared on restart)

Global storage vs project exports. The ~/.api-testing/ directory is your private, global store — this is where credentials live and they never leave. When you export a collection or environment, it goes to .atm/ in your project root. That folder is auto-added to .gitignore on first export, but even if you choose to commit it, your credentials stay in ~/.api-testing/ and are never copied into .atm/. You can safely share .atm/ exports with your team without leaking secrets.

Override the default storage path:

{
  "env": { "API_TESTING_DIR": "/path/to/custom/.api-testing" }
}

Warning: If you override API_TESTING_DIR to a path inside a git repository, add .api-testing/ to your .gitignore to avoid pushing credentials.


Architecture

src/
├── index.ts              # Entry point (shebang + StdioServerTransport)
├── server.ts             # createServer() factory
├── tools/                # 42 tool handlers (one file per category)
│   ├── request.ts        # HTTP requests (1)
│   ├── assert.ts         # Assertions (1)
│   ├── flow.ts           # Request chaining (1)
│   ├── collection.ts     # Collection CRUD (4)
│   ├── environment.ts    # Environment management (8)
│   ├── group.ts          # Environment groups (7)
│   ├── api-spec.ts       # OpenAPI import/browse (4)
│   ├── mock.ts           # Mock data generation (1)
│   ├── load-test.ts      # Load testing (1)
│   └── utilities.ts      # curl, diff, bulk, import/export (12)
├── lib/                  # Business logic (no MCP dependency)
│   ├── http-client.ts    # fetch wrapper with timing
│   ├── storage.ts        # JSON file storage engine
│   ├── schemas.ts        # Shared Zod schemas
│   ├── url.ts            # BASE_URL resolution
│   ├── path.ts           # Dot-notation accessor (body.data.0.id)
│   ├── interpolation.ts  # {{variable}} resolver
│   └── openapi-parser.ts # $ref + allOf/oneOf/anyOf resolution
└── __tests__/            # 10+ test suites, 120+ tests
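The dot-notation accessor that path.ts describes (e.g. body.data.0.id) can be sketched as a reduce over path segments. This is an assumption-level illustration, not the file's actual contents.

```typescript
// Sketch: walk an object by dot-separated path segments; numeric segments
// index into arrays, and any missing segment yields undefined.
function getPath(obj: unknown, path: string): unknown {
  return path.split(".").reduce<unknown>((cur, seg) => {
    if (cur === null || cur === undefined) return undefined;
    return (cur as Record<string, unknown>)[seg];
  }, obj);
}
```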

Stack: TypeScript (strict) · MCP SDK · Zod · Vitest · tsup


MIT · Built by cocaxcode