A Model Context Protocol (MCP) server that enables Large Language Models (LLMs) to interact with GraphQL APIs. It provides schema introspection and query execution capabilities, allowing models to discover and use GraphQL APIs dynamically.
Run `mcp-graphql` with the correct endpoint and it will automatically try to introspect the schema of the target GraphQL API.
| Argument | Description | Default |
|---|---|---|
| `--endpoint` | GraphQL endpoint URL | `http://localhost:4000/graphql` |
| `--headers` | JSON string containing headers for requests | `{}` |
| `--enable-mutations` | Enable mutation operations (disabled by default) | `false` |
| `--name` | Name of the MCP server | `mcp-graphql` |
| `--schema` | Path to a local GraphQL schema file (optional) | - |
# Basic usage with a local GraphQL server
npx mcp-graphql --endpoint http://localhost:3000/graphql
# Using with custom headers
npx mcp-graphql --endpoint https://api.example.com/graphql --headers '{"Authorization":"Bearer token123"}'
# Enable mutation operations
npx mcp-graphql --endpoint http://localhost:3000/graphql --enable-mutations
# Using a local schema file instead of introspection
npx mcp-graphql --endpoint http://localhost:3000/graphql --schema ./schema.graphql
The server provides two main tools:
- `introspect-schema`: Retrieves the GraphQL schema. Use this first if you don't have access to the schema as a resource. It uses either the local schema file or an introspection query.
- `query-graphql`: Executes GraphQL queries against the endpoint. By default, mutations are disabled unless `--enable-mutations` is specified.
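As a rough illustration of how an MCP client can drive these tools programmatically, here is a sketch using the official TypeScript SDK (`@modelcontextprotocol/sdk`). The argument shape passed to `query-graphql` is an assumption; check the input schema the server reports for each tool.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn mcp-graphql over stdio, pointing it at a local GraphQL endpoint.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["mcp-graphql", "--endpoint", "http://localhost:3000/graphql"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Fetch the schema first so the caller knows what can be queried.
const schema = await client.callTool({ name: "introspect-schema", arguments: {} });
console.log(schema.content);

// Then execute a query. The "query" argument name is an assumption; consult
// the tool's input schema exposed by the server for the exact shape.
const result = await client.callTool({
  name: "query-graphql",
  arguments: { query: "{ __typename }" },
});
console.log(result.content);
```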
To install GraphQL MCP Toolkit for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install mcp-graphql --client claude
It can also be installed manually by adding it to your Claude Desktop configuration:
{
"mcpServers": {
"mcp-graphql": {
"command": "npx",
"args": ["mcp-graphql", "--endpoint", "http://localhost:3000/graphql"]
}
}
}
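For example, if the API requires authentication and you want mutations available, the same entry might look like the following sketch (the endpoint and bearer token are placeholders):

```json
{
  "mcpServers": {
    "mcp-graphql": {
      "command": "npx",
      "args": [
        "mcp-graphql",
        "--endpoint", "https://api.example.com/graphql",
        "--headers", "{\"Authorization\":\"Bearer token123\"}",
        "--enable-mutations"
      ]
    }
  }
}
```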
Mutations are disabled by default as a security measure to prevent an LLM from modifying your database or service data. Consider carefully before enabling mutations in production environments.
This is a very generic implementation: it allows complete introspection and lets users run arbitrary operations (including mutations, if enabled). If you need something more restrictive, I'd suggest creating your own MCP server that locks tool calling down to specific query fields and/or variables, using this project as a reference; a sketch of that approach follows.
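As a minimal sketch of that locked-down approach, assuming the official TypeScript SDK, `zod`, and a hypothetical `GetUser` query on your API, a custom server could expose a single tool that accepts only an `id` variable and embeds a fixed GraphQL document:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const ENDPOINT = "http://localhost:3000/graphql"; // your GraphQL API (placeholder)

// The only operation this server can ever run: a fixed, read-only query.
const GET_USER_QUERY = /* GraphQL */ `
  query GetUser($id: ID!) {
    user(id: $id) { id name email }
  }
`;

const server = new McpServer({ name: "locked-down-graphql", version: "1.0.0" });

// Clients may only supply an "id" variable; they cannot change the query text,
// so introspection and mutations are simply never exposed.
server.tool(
  "get-user",
  "Fetch a single user by id",
  { id: z.string().describe("User id to look up") },
  async ({ id }) => {
    const response = await fetch(ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query: GET_USER_QUERY, variables: { id } }),
    });
    const result = await response.json();
    return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
  },
);

await server.connect(new StdioServerTransport());
```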
- Model Context Protocol server for Kubernetes that allows LLM-powered applications to interact with Kubernetes clusters through a native Go implementation with direct API integration and comprehensive resource management.
- Integrates with Jenkins CI/CD systems for AI-powered insights, build management, and debugging.
- Share code context with LLMs via Model Context Protocol or clipboard.
- An example of a remote MCP server deployable on Cloudflare Workers without authentication.
- GXtract is an MCP server designed to integrate with VS Code and other compatible editors. It provides a suite of tools for interacting with the GroundX platform, enabling you to leverage its powerful document understanding capabilities directly within your development environment.
- Fetch comprehensive information about CRAN packages, including READMEs, metadata, and search functionality.
- A natural language interface for cell-cell communication analysis using the Liana framework.
- Provides interactive user feedback and command execution for AI-assisted development.
- Generate and edit raster/vector images, vectorize, remove/replace backgrounds, and upscale using the Recraft AI API.
- Execute terminal commands through a secure shell interface using an AI assistant.