OpenAPI MCP Server
A Model Context Protocol (MCP) server that enables LLMs to explore and understand OpenAPI specifications through structured tools.
Features
- Smart API Exploration - Navigate APIs by categories, endpoints, and schemas
- Multiple Modes - Run as stdio (for Claude Desktop), HTTP server, or interactive CLI
- Intelligent Caching - Caches remote OpenAPI specs for faster access
- Multi-Architecture - Supports Linux AMD64 and ARM64
Quick Start
Using Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
"mcpServers": {
"openapi": {
"command": "docker",
"args": ["run", "-i", "--rm", "-e", "OPENAPI_SPEC_URL=https://api.example.com/openapi.json", "ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest"]
}
}
}
Or use a local binary:
{
"mcpServers": {
"openapi": {
"command": "/path/to/openapi-mcp-stdio",
"env": {
"OPENAPI_SPEC_URL": "https://api.example.com/openapi.json"
}
}
}
}
Installation
Docker (Recommended)
# Latest (stdio mode)
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
# Specific modes
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:http
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:interactive
Download Binaries
Download from releases:
- openapi-mcp-stdio-linux-amd64 - MCP stdio mode
- openapi-mcp-http-linux-amd64 - HTTP server mode
- openapi-mcp-interactive-linux-amd64 - Interactive CLI mode
Build from Source
# Clone
git clone https://github.com/SagenKoder/go-openapi-exploration-mcp-server.git
cd go-openapi-exploration-mcp-server
# Build all modes
./build.sh
# Or build specific mode
go build -o openapi-mcp-stdio ./cmd/openapi-mcp-stdio
Usage
Environment Variables
- OPENAPI_SPEC_URL (required) - URL or file path to the OpenAPI spec
- OPENAPI_CACHE_DIR (optional) - Cache directory (default: ~/.openapi-mcp-cache)
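The spec source can be either a remote URL or a local path. As a rough illustration of that convention (a hypothetical helper, not the project's actual code), a loader might simply branch on the URL scheme:
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"strings"
)

// loadSpec resolves OPENAPI_SPEC_URL as either a remote URL or a local file
// path and returns the raw spec bytes. Hypothetical helper for illustration.
func loadSpec() ([]byte, error) {
	src := os.Getenv("OPENAPI_SPEC_URL")
	if src == "" {
		return nil, fmt.Errorf("OPENAPI_SPEC_URL is required")
	}
	if strings.HasPrefix(src, "http://") || strings.HasPrefix(src, "https://") {
		resp, err := http.Get(src)
		if err != nil {
			return nil, err
		}
		defer resp.Body.Close()
		return io.ReadAll(resp.Body)
	}
	// Anything without an http(s) scheme is treated as a path on disk.
	return os.ReadFile(src)
}

func main() {
	spec, err := loadSpec()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("loaded %d bytes of OpenAPI spec\n", len(spec))
}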
Stdio Mode (for MCP clients)
# Docker
docker run -i --rm \
-e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
# Binary
OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-stdio
HTTP Mode
# Docker
docker run -p 8080:8080 \
-e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:http
# Binary
OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-http -addr :8080
Interactive Mode
# Docker
docker run -it --rm \
-e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:interactive
# Binary
OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-interactive
Available Tools
The server provides these tools to LLMs:
- list_categories - List API categories based on path segments
- list_endpoints - List endpoints, optionally filtered by category
- show_endpoint - Show detailed endpoint information including parameters and schemas
- get_spec_info - Get general information about the API
- show_schema - Inspect specific schema components
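Clients invoke these tools over JSON-RPC. The sketch below is a hypothetical, stripped-down client that launches the stdio binary and sends a single tools/call request for list_endpoints; a real MCP client would first perform the initialize handshake, and the "category" argument name is illustrative, not taken from the server's schema:
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os/exec"
)

func main() {
	// Start the stdio server with the spec it should explore.
	cmd := exec.Command("./openapi-mcp-stdio")
	cmd.Env = append(cmd.Environ(),
		"OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json")

	stdin, _ := cmd.StdinPipe()
	stdout, _ := cmd.StdoutPipe()
	if err := cmd.Start(); err != nil {
		panic(err)
	}

	// A single JSON-RPC request asking for the endpoints in one category.
	req := map[string]any{
		"jsonrpc": "2.0",
		"id":      1,
		"method":  "tools/call",
		"params": map[string]any{
			"name":      "list_endpoints",
			"arguments": map[string]any{"category": "pet"}, // argument name is illustrative
		},
	}
	body, _ := json.Marshal(req)
	fmt.Fprintf(stdin, "%s\n", body)

	// Print whatever the server writes back on its first response line.
	line, _ := bufio.NewReader(stdout).ReadString('\n')
	fmt.Println(line)
}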
Examples
Local File
OPENAPI_SPEC_URL=/path/to/openapi.yaml ./openapi-mcp-stdio
With Custom Cache
OPENAPI_CACHE_DIR=/tmp/api-cache \
OPENAPI_SPEC_URL=https://api.example.com/openapi.json \
./openapi-mcp-stdio
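Remote specs are cached on disk so repeated runs don't refetch the document. How the cache files are named is an internal detail of the server; one common approach, sketched here purely for illustration, is to key each file on a hash of the spec URL:
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"os"
	"path/filepath"
)

// cachePath maps a spec URL to a file inside the cache directory.
// Hypothetical layout; the server's actual naming scheme may differ.
func cachePath(specURL string) string {
	dir := os.Getenv("OPENAPI_CACHE_DIR")
	if dir == "" {
		home, _ := os.UserHomeDir()
		dir = filepath.Join(home, ".openapi-mcp-cache")
	}
	sum := sha256.Sum256([]byte(specURL))
	return filepath.Join(dir, hex.EncodeToString(sum[:])+".json")
}

func main() {
	fmt.Println(cachePath("https://petstore3.swagger.io/api/v3/openapi.json"))
}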
Docker with Volume Mount
docker run -i --rm \
-v $(pwd)/openapi.yaml:/openapi.yaml:ro \
-e OPENAPI_SPEC_URL=/openapi.yaml \
ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
Development
Project Structure
cmd/
├── openapi-mcp-stdio/        # MCP stdio mode
├── openapi-mcp-http/         # HTTP server mode
└── openapi-mcp-interactive/  # Interactive CLI mode
internal/
├── cache.go                  # Caching logic
├── handlers.go               # MCP tool handlers
├── server.go                 # Core server logic
└── utils.go                  # Utilities
Building Docker Images
# Build specific mode
docker build --build-arg MODE=stdio -t my-openapi-mcp:stdio .
# Build all modes
./build.sh docker
License
MIT License - see LICENSE file for details.