mcp-openapi

Turn any OpenAPI/Swagger spec into MCP tools — so Claude and other AI assistants can call your REST APIs. Zero config, zero code.

Point mcp-openapi at any OpenAPI 3.x or Swagger 2.0 spec URL and it generates Model Context Protocol (MCP) tools automatically. No code generation, no config files, no boilerplate. Your AI assistant gets callable tools for every API endpoint in seconds.


Quick Start

1. Run it (no install required):

npx mcp-openapi --spec https://petstore3.swagger.io/api/v3/openapi.json

2. Add it to Claude Desktop (claude_desktop_config.json):

{
  "mcpServers": {
    "petstore": {
      "command": "npx",
      "args": [
        "mcp-openapi",
        "--spec", "https://petstore3.swagger.io/api/v3/openapi.json"
      ]
    }
  }
}

3. Ask Claude to use it:

"List all available pets in the store"

Claude sees MCP tools like find_pets_by_status, get_pet_by_id, and add_pet, and calls them directly.


Why mcp-openapi?

Most MCP-to-API bridges require you to write tool definitions by hand or generate code from a spec. mcp-openapi skips all of that.

| Feature | mcp-openapi | Hand-written MCP servers | Generic HTTP tools |
|---|---|---|---|
| Zero config setup | Yes | No | Partial |
| OpenAPI 3.x + Swagger 2.0 | Yes | N/A | N/A |
| Flat parameter schemas (LLM-optimized) | Yes | Manual | No |
| Smart tool naming from operationId | Yes | Manual | No |
| Auth (API key, Bearer, OAuth2) | Built-in | DIY | DIY |
| Retry with exponential backoff | Built-in | DIY | DIY |
| Response truncation for LLM context | Built-in | DIY | No |

Flat parameter schemas are the key differentiator. Instead of passing nested JSON objects (which LLMs frequently get wrong), mcp-openapi flattens path, query, header, and body parameters into a single flat object. This dramatically improves tool-calling accuracy.
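
As an illustration (the endpoint and field names here are hypothetical), a POST /pets/{petId} endpoint with one query parameter and a JSON body might expose a single flat input schema like:

```json
{
  "type": "object",
  "properties": {
    "petId":   { "type": "integer", "description": "Path parameter" },
    "verbose": { "type": "boolean", "description": "Query parameter" },
    "name":    { "type": "string",  "description": "Body field" },
    "status":  { "type": "string",  "description": "Body field" }
  },
  "required": ["petId", "name"]
}
```

The model fills in one flat object; mcp-openapi routes each value back to its correct location (path, query, header, or body) when it builds the HTTP request.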


How It Works

OpenAPI/Swagger Spec          mcp-openapi               AI Assistant
     (URL or file)                                      (Claude, etc.)
          |                         |                         |
          |   1. Parse & validate   |                         |
          |------------------------>|                         |
          |                         |                         |
          |   2. Generate MCP tools |                         |
          |   (one per endpoint)    |                         |
          |------------------------>|                         |
          |                         |                         |
          |                         |   3. Register tools     |
          |                         |   via stdio transport   |
          |                         |------------------------>|
          |                         |                         |
          |                         |   4. AI calls a tool    |
          |                         |<------------------------|
          |                         |                         |
          |   5. Build & execute    |                         |
          |   HTTP request          |                         |
          |<------------------------|                         |
          |                         |                         |
          |   6. Return truncated   |                         |
          |   response to AI        |                         |
          |------------------------>|------------------------>|

Each API endpoint becomes one MCP tool:

  • Tool name is derived from operationId (converted to snake_case) or from method + path
  • Parameters are flattened into a single input schema (path, query, header, and body params merged)
  • Responses are truncated to ~50KB to stay within LLM context limits
  • Errors (429, 5xx) trigger automatic retries with exponential backoff (up to 3 retries)
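
The naming rule in the first bullet can be sketched as follows (a simplified approximation, not the package's exact implementation):

```typescript
// Convert an operationId (or a "method path" fallback) to a snake_case tool name.
// Simplified sketch; the real package may handle additional edge cases.
function toSnakeCase(s: string): string {
  return s
    .replace(/([a-z0-9])([A-Z])/g, "$1_$2") // split camelCase boundaries
    .replace(/[^a-zA-Z0-9]+/g, "_")         // non-alphanumerics become underscores
    .replace(/^_+|_+$/g, "")                // trim leading/trailing underscores
    .toLowerCase();
}

function toolName(method: string, path: string, operationId?: string): string {
  return toSnakeCase(operationId ?? `${method} ${path}`);
}

console.log(toolName("get", "/pet/findByStatus", "findPetsByStatus")); // find_pets_by_status
console.log(toolName("get", "/pet/{petId}"));                          // get_pet_pet_id
```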

Claude Desktop Integration

Add any API to Claude Desktop by editing your config file:

Location:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Public API (no auth)

{
  "mcpServers": {
    "petstore": {
      "command": "npx",
      "args": [
        "mcp-openapi",
        "--spec", "https://petstore3.swagger.io/api/v3/openapi.json"
      ]
    }
  }
}

API with Bearer Token

{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": [
        "mcp-openapi",
        "--spec", "https://raw.githubusercontent.com/github/rest-api-description/main/descriptions/api.github.com/api.github.com.json",
        "--auth-type", "bearer",
        "--auth-token", "$GITHUB_TOKEN",
        "--prefix", "github",
        "--include", "listReposForAuthenticatedUser,getRepo,listIssues,createIssue"
      ],
      "env": {
        "GITHUB_TOKEN": "ghp_your_token_here"
      }
    }
  }
}

API with API Key

{
  "mcpServers": {
    "weather": {
      "command": "npx",
      "args": [
        "mcp-openapi",
        "--spec", "https://api.weather.example.com/openapi.json",
        "--auth-type", "api-key",
        "--auth-name", "X-API-Key",
        "--auth-value", "$WEATHER_API_KEY",
        "--auth-in", "header"
      ],
      "env": {
        "WEATHER_API_KEY": "your_key_here"
      }
    }
  }
}

CLI Reference

npx mcp-openapi --spec <url-or-path> [options]

General Options

| Option | Short | Default | Description |
|---|---|---|---|
| --spec <url\|path> | -s | required | OpenAPI spec URL or local file path |
| --config <path> | -c | | JSON config file path |
| --base-url <url> | | from spec | Override the API base URL |
| --prefix <name> | | | Prefix for all tool names (e.g. github -> github_list_repos) |
| --include <patterns> | | all | Comma-separated operationIds to include |
| --exclude <patterns> | | none | Comma-separated operationIds to exclude |
| --timeout <ms> | | 30000 | HTTP request timeout in milliseconds |
| --max-retries <n> | | 3 | Max retries on 429/5xx responses |
| --header <name:value> | -H | | Custom header (repeatable) |
| --transport <type> | | stdio | Transport type: stdio or sse |
| --port <n> | | 3000 | Port for SSE transport |
| --help | -h | | Show help |
| --version | -v | | Show version |
| --license-key <key> | | | Pro license key (or $MCP_OPENAPI_LICENSE_KEY env) |
| --server <selector> | | 0 | Select API server by index, partial URL, or exact URL |
| --no-doc-warnings | | | Suppress doc quality warnings on startup |
| --dynamic-discovery | | auto (100+) | Enable dynamic tool discovery for large APIs |

Auth Options

Bearer token:

| Option | Description |
|---|---|
| --auth-type bearer | Use Bearer token authentication |
| --auth-token <token> | The token value (supports $ENV_VAR syntax) |

API key:

| Option | Description |
|---|---|
| --auth-type api-key | Use API key authentication |
| --auth-name <name> | Header or query parameter name |
| --auth-value <value> | The API key value (supports $ENV_VAR syntax) |
| --auth-in <header\|query> | Where to send the key (default: header) |

OAuth2 client credentials:

| Option | Description |
|---|---|
| --auth-type oauth2 | Use OAuth2 client credentials flow |
| --auth-client-id <id> | OAuth2 client ID |
| --auth-client-secret <secret> | OAuth2 client secret |
| --auth-token-url <url> | Token endpoint URL |
| --auth-scopes <scopes> | Comma-separated scopes |

CLI Examples

# Basic usage with a remote spec
npx mcp-openapi --spec https://petstore3.swagger.io/api/v3/openapi.json

# Local YAML spec with Bearer auth
npx mcp-openapi --spec ./api.yaml --auth-type bearer --auth-token '$API_KEY'

# Filter to specific endpoints with a prefix
npx mcp-openapi --spec ./api.json --prefix myapi --include 'listUsers,getUser'

# Override base URL (useful for local dev)
npx mcp-openapi --spec https://api.example.com/openapi.json --base-url http://localhost:3000

# Add custom headers
npx mcp-openapi --spec ./api.json -H 'X-Custom: value' -H 'X-Another: value2'

# Use a JSON config file
npx mcp-openapi --config ./mcp-config.json

# Select staging server
npx mcp-openapi --spec ./api.json --server staging

# Large API with dynamic discovery
npx mcp-openapi --spec https://api.stripe.com/openapi.json --dynamic-discovery

Config File Format

Instead of CLI flags, you can use a JSON config file:

{
  "spec": "https://api.example.com/openapi.json",
  "prefix": "myapi",
  "include": ["listUsers", "getUser", "createUser"],
  "auth": {
    "type": "bearer",
    "token": "$API_TOKEN"
  },
  "timeout": 15000,
  "maxRetries": 2,
  "headers": {
    "X-Custom-Header": "value"
  }
}

CLI arguments take precedence over config file values.
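
For example, assuming the config file above (which sets "timeout": 15000), a flag on the command line overrides it:

```shell
# mcp-config.json sets timeout to 15000; the CLI flag wins:
npx mcp-openapi --config ./mcp-config.json --timeout 5000
```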


Supported Specs

| Format | Versions | File types |
|---|---|---|
| OpenAPI | 3.0.x, 3.1.x | .json, .yaml, .yml |
| Swagger | 2.0 | .json, .yaml, .yml |

Specs can be loaded from:

  • Remote URLs (https://...)
  • Local file paths (./api.yaml, /absolute/path/spec.json)

v0.3.0 Features

Doc Quality Warnings

On startup, mcp-openapi checks each tool's documentation quality. If endpoints have sparse descriptions (under 50 characters), you'll see a warning:

[mcp-openapi] WARN: Doc quality: 11 of 47 tools have sparse documentation (<50 chars)
[mcp-openapi] WARN:   Affected: getUser, createOrder, deleteItem, updateCart, listTags, ...
[mcp-openapi] WARN:   LLM accuracy may be reduced for these endpoints.

This helps you identify which API endpoints might cause poor LLM tool-calling accuracy. Suppress with --no-doc-warnings.

Server Filtering

OpenAPI specs can define multiple servers (production, staging, dev). Select which one to use:

# Use first server (default behavior)
mcp-openapi --spec api.json --server 0

# Match by URL keyword
mcp-openapi --spec api.json --server prod

# Exact URL
mcp-openapi --spec api.json --server https://api.example.com/v2

If the selector doesn't match, you'll see all available servers listed.

Dynamic Tool Discovery

For large APIs with 100+ endpoints, registering all tools at once can overwhelm the LLM's context. Dynamic discovery solves this by registering 3 meta-tools instead:

| Meta-tool | Description |
|---|---|
| search_operations(query) | Search tools by keyword in names, descriptions, and tags |
| list_by_tag(tag?) | Browse tools by OpenAPI tag, or list all tags |
| get_tool_details(tool_name) | Get full parameter schema for a specific tool |

The LLM explores the API through these meta-tools, then calls specific endpoints by name.

# Explicit opt-in
mcp-openapi --spec large-api.json --dynamic-discovery

# Auto-enabled when spec has 100+ endpoints
mcp-openapi --spec https://api.github.com/openapi.json

Or via config file:

{
  "spec": "https://api.stripe.com/openapi.json",
  "dynamicDiscovery": true,
  "auth": { "type": "bearer", "token": "$STRIPE_KEY" }
}

Pro Features (v0.2.0+)

mcp-openapi includes optional Pro features for teams and power users, gated by a license key.

Custom Response Transforms

Shape API responses with JMESPath expressions before they reach the LLM — reducing token usage and improving accuracy:

{
  "spec": "https://api.github.com/openapi.json",
  "licenseKey": "$MCP_OPENAPI_LICENSE_KEY",
  "transforms": {
    "list_repos": "data[].{name: name, stars: stargazers_count, url: html_url}",
    "list_*": "data[].{id: id, name: name}"
  }
}
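
For instance, the list_repos expression above projects each repository object, which can carry dozens of fields in a raw API response, down to just three (values illustrative):

```json
[
  { "name": "mcp-openapi", "stars": 123, "url": "https://github.com/Docat0209/mcp-openapi" }
]
```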

Smart Response Handling

Instead of hard-truncating large responses at 50KB, Pro enables intelligent truncation:

  • Array slicing: Large arrays show first N items + metadata ("showing 10 of 847 items")
  • Depth pruning: Deep nested objects are summarized beyond a configurable depth
  • Structure preservation: You always see the shape of the data, never a mid-JSON cut

{
  "spec": "./api.json",
  "licenseKey": "$MCP_OPENAPI_LICENSE_KEY",
  "response": {
    "maxLength": 50000,
    "arraySliceSize": 10,
    "maxDepth": 4
  }
}
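
The array-slicing behavior can be sketched like this (a simplified approximation of the described behavior, not the actual Pro implementation):

```typescript
// Truncate large arrays to the first N items, keeping metadata about what was cut.
// Simplified sketch of the documented "showing N of M items" behavior.
function sliceArray<T>(items: T[], sliceSize: number): { items: T[]; note?: string } {
  if (items.length <= sliceSize) return { items };
  return {
    items: items.slice(0, sliceSize),
    note: `showing ${sliceSize} of ${items.length} items`,
  };
}

const result = sliceArray(Array.from({ length: 847 }, (_, i) => i), 10);
console.log(result.note); // showing 10 of 847 items
```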

Coming Soon

  • Multi-API Composition — Load multiple OpenAPI specs into one MCP session
  • Usage Analytics — Track tool calls, latency, and error rates

Interested in Pro? Star the repo and open an issue to get early access.


Programmatic Usage

You can also use mcp-openapi as a library in your own MCP server:

import { createServer } from 'mcp-openapi';

const { server, tools, spec } = await createServer({
  spec: 'https://petstore3.swagger.io/api/v3/openapi.json',
  prefix: 'petstore',
  auth: {
    type: 'bearer',
    token: process.env.API_TOKEN,
  },
});

console.log(`Loaded ${tools.length} tools from ${spec.info.title}`);

Requirements

  • Node.js 18 or later
  • An OpenAPI 3.x or Swagger 2.0 spec (URL or local file)

Contributing

Contributions are welcome. Here is how to get started:

git clone https://github.com/Docat0209/mcp-openapi.git
cd mcp-openapi
pnpm install
pnpm test
pnpm build

Before submitting a PR:

  1. Add tests for new features
  2. Run pnpm lint and fix any issues
  3. Follow Conventional Commits for commit messages

License

MIT

