A Model Context Protocol (MCP) server that enables LLMs to explore and understand OpenAPI specifications, loaded from local files or remote URLs, through structured tools.
Add to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):
```json
{
  "mcpServers": {
    "openapi": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "OPENAPI_SPEC_URL=https://api.example.com/openapi.json", "ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest"]
    }
  }
}
```
Or use a local binary:
```json
{
  "mcpServers": {
    "openapi": {
      "command": "/path/to/openapi-mcp-stdio",
      "env": {
        "OPENAPI_SPEC_URL": "https://api.example.com/openapi.json"
      }
    }
  }
}
```
Pull a prebuilt image:

```bash
# Latest (stdio mode)
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest

# Specific modes
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:http
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:interactive
```
Download a prebuilt binary from releases:

- `openapi-mcp-stdio-linux-amd64` - MCP stdio mode
- `openapi-mcp-http-linux-amd64` - HTTP server mode
- `openapi-mcp-interactive-linux-amd64` - Interactive CLI mode

Or build from source:

```bash
# Clone
git clone https://github.com/SagenKoder/go-openapi-exploration-mcp-server.git
cd go-openapi-exploration-mcp-server

# Build all modes
./build.sh

# Or build a specific mode
go build -o openapi-mcp-stdio ./cmd/openapi-mcp-stdio
```
Configuration is via environment variables:

- `OPENAPI_SPEC_URL` (required) - URL or file path to the OpenAPI spec
- `OPENAPI_CACHE_DIR` (optional) - Cache directory (default: `~/.openapi-mcp-cache`)

Run in stdio mode:

```bash
# Docker
docker run -i --rm \
  -e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest

# Binary
OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-stdio
```
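To sanity-check the stdio transport by hand, the sketch below spawns the binary and sends a single MCP `initialize` request. It assumes only that the server speaks standard newline-delimited MCP JSON-RPC on stdin/stdout and that `./openapi-mcp-stdio` was built as above; the client name and version strings are placeholders.

```go
package main

import (
	"bufio"
	"fmt"
	"log"
	"os/exec"
)

func main() {
	// Spawn the server exactly as the binary example above does.
	cmd := exec.Command("./openapi-mcp-stdio")
	cmd.Env = append(cmd.Environ(),
		"OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json")

	stdin, err := cmd.StdinPipe()
	if err != nil {
		log.Fatal(err)
	}
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		log.Fatal(err)
	}
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}
	defer cmd.Process.Kill()

	// MCP handshake: a single JSON-RPC initialize request, newline-terminated.
	fmt.Fprintln(stdin, `{"jsonrpc":"2.0","id":1,"method":"initialize",`+
		`"params":{"protocolVersion":"2024-11-05","capabilities":{},`+
		`"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}`)

	// The first line back should be the initialize response.
	line, err := bufio.NewReader(stdout).ReadString('\n')
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(line)
}
```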
Run as an HTTP server:

```bash
# Docker
docker run -p 8080:8080 \
  -e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:http

# Binary
OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-http -addr :8080
```
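Once HTTP mode is up, a quick liveness check can confirm the listener; this sketch assumes only that the server listens on :8080 as started above, not any particular endpoint path.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Any HTTP response, even a 404, proves the listener is up.
	resp, err := http.Get("http://localhost:8080/")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	fmt.Println("server responded:", resp.Status)
}
```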
Run the interactive CLI:

```bash
# Docker
docker run -it --rm \
  -e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:interactive

# Binary
OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-interactive
```
The server provides a set of structured tools that let LLMs explore the loaded specification.
Use a local spec file:

```bash
OPENAPI_SPEC_URL=/path/to/openapi.yaml ./openapi-mcp-stdio
```
Use a custom cache directory:

```bash
OPENAPI_CACHE_DIR=/tmp/api-cache \
OPENAPI_SPEC_URL=https://api.example.com/openapi.json \
./openapi-mcp-stdio
```
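For reference, the fallback behavior described above (defaulting to `~/.openapi-mcp-cache` when `OPENAPI_CACHE_DIR` is unset) can be expressed as a small Go sketch; the function name is illustrative, not the repo's actual internals.

```go
package main

import (
	"os"
	"path/filepath"
)

// cacheDir resolves the cache location: explicit env var first,
// then ~/.openapi-mcp-cache as the documented default.
func cacheDir() string {
	if dir := os.Getenv("OPENAPI_CACHE_DIR"); dir != "" {
		return dir
	}
	home, err := os.UserHomeDir()
	if err != nil {
		return ".openapi-mcp-cache" // last resort: relative to the working directory
	}
	return filepath.Join(home, ".openapi-mcp-cache")
}
```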
Mount a local spec when running in Docker:

```bash
docker run -i --rm \
  -v $(pwd)/openapi.yaml:/openapi.yaml:ro \
  -e OPENAPI_SPEC_URL=/openapi.yaml \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
```
Project layout:

```
cmd/
├── openapi-mcp-stdio/        # MCP stdio mode
├── openapi-mcp-http/         # HTTP server mode
└── openapi-mcp-interactive/  # Interactive CLI mode
internal/
├── cache.go     # Caching logic
├── handlers.go  # MCP tool handlers
├── server.go    # Core server logic
└── utils.go     # Utilities
```
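As a rough sketch of the kind of wiring `handlers.go` performs, here is how a tool is typically registered in a Go MCP server using the mark3labs/mcp-go library; both the dependency and the `list_endpoints` tool name are assumptions for illustration, not this repo's confirmed API.

```go
package main

import (
	"context"
	"log"

	"github.com/mark3labs/mcp-go/mcp"
	"github.com/mark3labs/mcp-go/server"
)

func main() {
	s := server.NewMCPServer("openapi-explorer", "0.1.0")

	// Hypothetical tool: enumerate the paths in the loaded spec.
	tool := mcp.NewTool("list_endpoints",
		mcp.WithDescription("List the paths defined in the loaded OpenAPI spec"))

	s.AddTool(tool, func(ctx context.Context, req mcp.CallToolRequest) (*mcp.CallToolResult, error) {
		// A real handler would read the parsed spec; static text keeps the sketch short.
		return mcp.NewToolResultText("GET /pet/{petId}\nPOST /pet"), nil
	})

	// Serve over stdio, matching the openapi-mcp-stdio mode.
	if err := server.ServeStdio(s); err != nil {
		log.Fatal(err)
	}
}
```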
Build Docker images locally:

```bash
# Build a specific mode
docker build --build-arg MODE=stdio -t my-openapi-mcp:stdio .

# Build all modes
./build.sh docker
```
MIT License - see LICENSE file for details.